Sample records for event logging system

  1. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server).
The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
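    The key-value text format described above can be illustrated with a short sketch. This is a minimal illustration of a ULM-style timestamped event line; the field names beyond a timestamp and event name, and the function itself, are assumptions for illustration, not the toolkit's actual API:

```python
import time

def netlogger_event(event, **fields):
    """Format one NetLogger-style event as a ULM-inspired
    'KEY=value' line with a UTC timestamp. Field names other
    than DATE and EVNT are illustrative, not the toolkit's API."""
    parts = [
        "DATE=" + time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime()),
        "EVNT=" + event,
    ]
    # sort the caller-supplied fields for a stable, parseable layout
    parts += [f"{k.upper()}={v}" for k, v in sorted(fields.items())]
    return " ".join(parts)

line = netlogger_event("transfer.start", host="node1", size=4096)
print(line)
```

A line in this shape is trivially machine-parseable while remaining human-readable, which is the trade-off the text format targets before switching to the binary format for speed.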

  2. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of high-quality event logs in the system. The analyses performed allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions in the management of hospital activities.

  3. Analysis of Effects of Sensor Multithreading to Generate Local System Event Timelines

    DTIC Science & Technology

    2014-03-27

    ...works on logs highlights the importance of logs [17, 18]. The two aforementioned works both reference the same 2009 Data Breach Investigations Report... of the data breaches reported on, the logs contained evidence of events leading up to 82% of those data breaches. This means that preventing 82% of the data... report states that of the data breaches reported on, the logs contained evidence of events leading up to 66% of those data breaches. • The 2010 DBIR

  4. 3-Dimensional Root Cause Diagnosis via Co-analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Ziming; Lan, Zhiling; Yu, Li

    2012-01-01

    With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, the RAS log contains only limited diagnosis information. Moreover, the manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer, the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during a 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.

  5. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to dramatically increase as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies. (1) Industry standard for log messages: the Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages; the format is based on work at NASA GSFC. (2) Open system architectures: the DoD, NASA, and others are moving towards common open system architectures for mission ground data systems, based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors. (3) Text analytics: a specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources.
This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format, and then providing display and analytics tools for in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.

  6. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement for appropriate process models, is increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the amount of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
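    The idea of checking a log against a Petri net model can be sketched with a toy token replay. This is a deliberately simplified stand-in for conformance checking, far weaker than the paper's state-equation alignment; the example net and trace are invented for illustration:

```python
from collections import Counter

def fits(net, initial_marking, trace):
    """Replay a trace on a Petri net given as
    {transition: (input_places, output_places)}.
    Returns False at the first transition that is not enabled,
    i.e. the first deviation between log and model."""
    marking = Counter(initial_marking)
    for t in trace:
        inputs, outputs = net[t]
        if any(marking[p] < 1 for p in inputs):
            return False          # missing token: trace deviates
        for p in inputs:          # consume input tokens
            marking[p] -= 1
        for p in outputs:         # produce output tokens
            marking[p] += 1
    return True

# hypothetical three-step process model
net = {
    "register": (["start"], ["p1"]),
    "check":    (["p1"], ["p2"]),
    "pay":      (["p2"], ["end"]),
}
ok = fits(net, ["start"], ["register", "check", "pay"])   # conforming
bad = fits(net, ["start"], ["register", "pay"])           # skips "check"
```

Alignment-based conformance checking generalizes this yes/no replay by computing the cheapest sequence of log and model moves that reconciles the two, which is where the state-equation method comes in.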

  7. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, business intelligence approaches like process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge about processes, check compliance, and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails provides the ability to apply these methods to all IHE-based information systems.
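    The target of such a transformation, XES, is an XML format whose standard `concept:name` and `time:timestamp` attributes identify activities and their times. A minimal sketch of emitting XES from already-grouped audit events follows; the input shape and the example case data are assumptions, not the authors' actual converter:

```python
import xml.etree.ElementTree as ET

def to_xes(traces):
    """Serialize {case_id: [(activity, iso_timestamp), ...]} into a
    minimal XES document. A sketch of an audit-trail-to-XES
    conversion, not the paper's tool."""
    log = ET.Element("log", {"xes.version": "1.0"})
    for case_id, events in traces.items():
        trace = ET.SubElement(log, "trace")
        ET.SubElement(trace, "string",
                      {"key": "concept:name", "value": case_id})
        for activity, ts in events:
            ev = ET.SubElement(trace, "event")
            ET.SubElement(ev, "string",
                          {"key": "concept:name", "value": activity})
            ET.SubElement(ev, "date",
                          {"key": "time:timestamp", "value": ts})
    return ET.tostring(log, encoding="unicode")

xml_doc = to_xes({"case-1": [("patient-record-read",
                              "2015-01-01T10:00:00")]})
```

Once audit events are in this shape, any XES-consuming process mining tool can discover and check the underlying care processes.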

  8. Developing Surveillance Methodology for Agricultural and Logging Injury in New Hampshire Using Electronic Administrative Data Sets.

    PubMed

    Scott, Erika E; Hirabayashi, Liane; Krupa, Nicole L; Sorensen, Julie A; Jenkins, Paul L

    2015-08-01

    Agriculture and logging rank among industries with the highest rates of occupational fatality and injury. Establishing a nonfatal injury surveillance system is a top priority in the National Occupational Research Agenda. Sources of data such as patient care reports (PCRs) and hospitalization data have recently transitioned to electronic databases. Using narrative and location codes from PCRs, along with International Classification of Diseases, 9th Revision, external cause of injury codes (E-codes) in hospital data, researchers are designing a surveillance system to track farm and logging injury. A total of 357 true agricultural or logging cases were identified. These data indicate that it is possible to identify agricultural and logging injury events in PCR and hospital data. Multiple data sources increase catchment; nevertheless, limitations in methods of identification of agricultural and logging injury contribute to the likely undercount of injury events.

  9. HS.Register - An Audit-Trail Tool to Respond to the General Data Protection Regulation (GDPR).

    PubMed

    Gonçalves-Ferreira, Duarte; Leite, Mariana; Santos-Pereira, Cátia; Correia, Manuel E; Antunes, Luis; Cruz-Correia, Ricardo

    2018-01-01

    Introduction The new General Data Protection Regulation (GDPR) compels health care institutions and their software providers to properly document all personal data processing and provide clear evidence that their systems are in line with the GDPR. All applications involved in personal data processing should therefore produce meaningful event logs that can later be used for the effective auditing of complex processes. Aim This paper aims to describe and evaluate HS.Register, a system created to collect and securely manage, at scale, audit logs and data produced by a large number of systems. Methods HS.Register creates a single audit log by collecting and aggregating all kinds of meaningful event logs and data (e.g. ActiveDirectory, syslog, log4j, web server logs, REST, SOAP and HL7 messages). It also includes specially built dashboards for easy auditing and monitoring of complex processes, crossing different systems in an integrated way, as well as tools for helping with auditing and with the diagnosis of difficult problems, using a simple web application. HS.Register is currently installed at five large Portuguese hospitals and is composed of the following open-source components: HAProxy, RabbitMQ, Elasticsearch, Logstash and Kibana. Results HS.Register currently collects and analyses an average of 93 million events per week and is being used to document and audit HL7 communications. Discussion Auditing tools like HS.Register are likely to become mandatory in the near future to allow for traceability and detailed auditing for GDPR compliance.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.

  11. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations.

    PubMed

    Arndt, Brian G; Beasley, John W; Watkinson, Michelle D; Temte, Jonathan L; Tuan, Wen-Jan; Sinsky, Christine A; Gilchrist, Valerie J

    2017-09-01

    Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non-face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from "event logging" records over a 3-year period for both direct patient care and non-face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. © 2017 Annals of Family Medicine, Inc.
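    The study's core accounting step, splitting logged EHR time into the clinic-hours window (8:00 am to 6:00 pm, Monday through Friday) versus after hours, can be sketched as follows. The event records and their minute counts are hypothetical; only the window definition comes from the study:

```python
from datetime import datetime

def in_clinic_hours(ts):
    """True if a timestamp falls in the study's clinic window:
    8:00 am to 6:00 pm, Monday (weekday 0) through Friday (4)."""
    return ts.weekday() < 5 and 8 <= ts.hour < 18

# hypothetical event-log records: (timestamp, task_category, minutes)
events = [
    (datetime(2017, 3, 6, 9, 30),  "documentation", 12),  # Monday morning
    (datetime(2017, 3, 6, 20, 15), "inbox",          8),  # Monday evening
    (datetime(2017, 3, 11, 10, 0), "order entry",    5),  # Saturday
]

totals = {"clinic": 0, "after": 0}
for ts, _category, minutes in events:
    totals["clinic" if in_clinic_hours(ts) else "after"] += minutes
```

Aggregating per task category in the same pass is what yields the study's breakdown of documentation, order entry, and inbox time inside and outside clinic hours.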

  12. National Centers for Environmental Prediction

    Science.gov Websites


  13. CLARET user's manual: Mainframe Logs. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frobose, R.H.

    1984-11-12

    CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.

  14. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A Perl module designed to read and parse the voluminous set of event and accounting log files produced by a Portable Batch System (PBS) server. The module can filter on date-time and/or record type, and the data can be returned in a variety of formats.
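    The flavor of such parsing can be sketched in a few lines. PBS accounting records are semicolon-delimited lines of the general form `timestamp;record-type;job-id;key=value ...`; the sketch below (in Python rather than Perl, and with an invented example record) is an illustration of that shape, not the module's actual API:

```python
def parse_pbs_record(line):
    """Parse one PBS accounting-log record of the form
    'MM/DD/YYYY HH:MM:SS;TYPE;JOB_ID;key=val key=val ...'
    into a flat dict. Record types include 'E' for job end."""
    timestamp, rtype, job_id, attrs = line.split(";", 3)
    fields = dict(a.split("=", 1) for a in attrs.split() if "=" in a)
    return {"time": timestamp, "type": rtype, "id": job_id, **fields}

rec = parse_pbs_record(
    "03/05/2007 12:34:56;E;42.server;user=alice walltime=01:00:00"
)
```

Filtering on date-time or record type, as the module does, then reduces to testing the `time` and `type` entries of each parsed record.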

  15. Extracting the Textual and Temporal Structure of Supercomputing Logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jain, S; Singh, I; Chandra, A

    2009-05-26

    Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
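    A crude approximation of finding syntactic structures in log messages is to mask the variable tokens and group messages by the skeleton that remains. The example messages and masking rules below are invented for illustration and are much simpler than the paper's clustering:

```python
import re
from collections import defaultdict

def template(msg):
    """Reduce a log message to its syntactic skeleton by masking
    hex and decimal tokens -- a toy stand-in for textual clustering."""
    msg = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", msg)
    msg = re.sub(r"\d+", "<NUM>", msg)
    return msg

logs = [
    "node 12 failed at 1800s",
    "node 97 failed at 2400s",
    "link error on port 3",
]

# messages sharing a skeleton fall into the same semantic group
groups = defaultdict(list)
for m in logs:
    groups[template(m)].append(m)
```

With messages collapsed into groups like this, the temporal-proximity analysis the abstract describes can operate on a handful of message types instead of millions of raw lines.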

  16. Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor

    NASA Astrophysics Data System (ADS)

    Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael

    2006-11-01

    The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low cost system which collects summary implant statistics at the conclusion of each implant run. The data is collected by either downloading implant data log files from the implant tool workstation, or by exporting summary implant statistics through the tool's automation interface. Compared to the traditional FDC systems which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data is collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which are not detected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FDC system fall into two categories: PM trigger rules which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system does have tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.
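    A PM trigger rule of the kind described, one that fires when summary statistics drift past a limit as a tool ages, can be sketched as a moving-average threshold. The window size, limit, and readings below are invented for illustration and are not Freescale's actual rule logic:

```python
def pm_trigger(readings, limit, window=5):
    """Flag a preventive-maintenance trigger when the mean of the
    last `window` end-of-run readings drifts above `limit` --
    an illustrative drift rule, not the production FDC logic."""
    recent = readings[-window:]
    return sum(recent) / len(recent) > limit

# hypothetical end-of-run charge-control readings, slowly drifting up
history = [1.0, 1.1, 1.0, 1.2, 1.3, 1.5, 1.7, 1.9, 2.1, 2.3]
flag = pm_trigger(history, limit=1.8)   # mean of last 5 is 1.9
```

Because the rule looks at post-run summaries rather than in-run traces, it matches the abstract's trade-off: it cannot stop the run that faulted, but it catches slow drift before weeks of product are affected.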

  17. 10 CFR Appendix G to Part 73 - Reportable Safeguards Events

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... vulnerability in a safeguard system that could allow unauthorized or undetected access to a protected area... the safeguards event log. (a) Any failure, degradation, or discovered vulnerability in a safeguards...

  18. Process modeling and bottleneck mining in online peer-review systems.

    PubMed

    Premchaiswadi, Wichian; Porouhan, Parham

    2015-01-01

    This paper is divided into three main parts. In the first part of the study, we captured, collected and formatted an event log describing the handling of reviews for proceedings of an international conference in Thailand. In the second part, we used several process mining techniques in order to discover process models, social, organizational, and hierarchical structures from the proceeding's event log. In the third part, we detected the deviations and bottlenecks of the peer review process by comparing the observed events (i.e., authentic dataset) with a pre-defined model (i.e., master map). Finally, we investigated the performance information as well as the total waiting time in order to improve the effectiveness and efficiency of the online submission and peer review system for the prospective conferences and seminars. Consequently, the main goals of the study were as follows: (1) to convert the collected event log into the appropriate format supported by process mining analysis tools, (2) to discover process models and to construct social networks based on the collected event log, and (3) to find deviations, discrepancies and bottlenecks between the collected event log and the master pre-defined model. The results showed that although each paper was initially sent to three different reviewers; it was not always possible to make a decision after the first round of reviewing; therefore, additional reviewers were invited. In total, all the accepted and rejected manuscripts were reviewed by an average of 3.9 and 3.2 expert reviewers, respectively. Moreover, obvious violations of the rules and regulations relating to careless or inappropriate peer review of a manuscript-committed by the editorial board and other staff-were identified. Nine blocks of activity in the authentic dataset were not completely compatible with the activities defined in the master model. 
Also, five of the activity traces were not correctly enabled, and seven activities were missed within the online submission system. On the other hand, regarding the feedback (comments) received from the first and third reviewers, the conference committee members and the organizers did not attend to that feedback in a timely manner.

  19. WikiLeaks and Iraq Body Count: the sum of parts may not add up to the whole-a comparison of two tallies of Iraqi civilian deaths.

    PubMed

    Carpenter, Dustin; Fuller, Tova; Roberts, Les

    2013-06-01

    Introduction The number of civilians killed in Iraq following the 2003 invasion has proven difficult to measure and contentious in recent years. The release of the WikiLeaks War Logs (WL) has created the potential to conduct a sensitivity analysis of the commonly-cited Iraq Body Count's (IBC's) tally, which is based on press, government, and other public sources. Hypothesis The 66,000 deaths reported in the WikiLeaks War Logs are mostly the same events as those previously reported in the press and elsewhere as tallied by iraqbodycount.org. A systematic random sample of 2500 violent fatal War Log incidents was selected and evaluated to determine whether these incidents were also found in IBC's press-based listing. Each selected event was ranked on a scale of 0 (no match present) to 3 (almost certainly matched) with regard to the likelihood it was listed in the IBC database. Of the 2409 War Log records, 488 (23.8%) were found to have likely matches in IBC records. Events that killed more people were far more likely to appear in both datasets, with 94.1% of events in which ≥20 people were killed being likely matches, as compared with 17.4% of singleton killings. Because of this skew towards the recording of large events in both datasets, it is estimated that 2035 (46.3%) of the 4394 deaths reported in the WikiLeaks War Logs had been previously reported in IBC. Passive surveillance systems, widely seen as incomplete, may also be selective in the types of events detected in times of armed conflict. Bombings and other events during which many people are killed, and events in less violent areas, appear to be detected far more often, creating a skewed image of the mortality profile in Iraq. Members of the press and researchers should be hesitant to draw conclusions about the nature or extent of violence from passive surveillance systems of low or unknown sensitivity.

  20. Improving linear accelerator service response with a real-time electronic event reporting system.

    PubMed

    Hoisak, Jeremy D P; Pawlicki, Todd; Kim, Gwe-Ya; Fletcher, Richard; Moore, Kevin L

    2014-09-08

    To track linear accelerator performance issues, an online event recording system was developed in-house for use by therapists and physicists to log the details of technical problems arising on our institution's four linear accelerators. In use since October 2010, the system was designed so that all clinical physicists would receive email notification when an event was logged. Starting in October 2012, we initiated a pilot project in collaboration with our linear accelerator vendor to explore a new model of service and support, in which event notifications were also sent electronically directly to dedicated engineers at the vendor's technical help desk, who then initiated a response to technical issues. Previously, technical issues were reported by telephone to the vendor's call center, which then disseminated information and coordinated a response with the Technical Support help desk and local service engineers. The purpose of this work was to investigate the improvements to clinical operations resulting from this new service model. The new and old service models were quantitatively compared by reviewing event logs and the oncology information system database in the nine months prior to and after initiation of the project. Here, we focus on events that resulted in an inoperative linear accelerator ("down" machine). Machine downtime, vendor response time, treatment cancellations, and event resolution were evaluated and compared over two equivalent time periods. In 389 clinical days, there were 119 machine-down events: 59 events before and 60 after introduction of the new model. In the new model, median time to service response decreased from 45 to 8 min, service engineer dispatch time decreased 44%, downtime per event decreased from 45 to 20 min, and treatment cancellations decreased 68%. The decreased vendor response time and reduced number of on-site visits by a service engineer resulted in decreased downtime and decreased patient treatment cancellations.

  1. Tethered to the EHR: Primary Care Physician Workload Assessment Using EHR Event Log Data and Time-Motion Observations

    PubMed Central

    Arndt, Brian G.; Beasley, John W.; Watkinson, Michelle D.; Temte, Jonathan L.; Tuan, Wen-Jan; Sinsky, Christine A.; Gilchrist, Valerie J.

    2017-01-01

    PURPOSE Primary care physicians spend nearly 2 hours on electronic health record (EHR) tasks per hour of direct patient care. Demand for non–face-to-face care, such as communication through a patient portal and administrative tasks, is increasing and contributing to burnout. The goal of this study was to assess time allocated by primary care physicians within the EHR as indicated by EHR user-event log data, both during clinic hours (defined as 8:00 am to 6:00 pm Monday through Friday) and outside clinic hours. METHODS We conducted a retrospective cohort study of 142 family medicine physicians in a single system in southern Wisconsin. All Epic (Epic Systems Corporation) EHR interactions were captured from “event logging” records over a 3-year period for both direct patient care and non–face-to-face activities, and were validated by direct observation. EHR events were assigned to 1 of 15 EHR task categories and allocated to either during or after clinic hours. RESULTS Clinicians spent 355 minutes (5.9 hours) of an 11.4-hour workday in the EHR per weekday per 1.0 clinical full-time equivalent: 269 minutes (4.5 hours) during clinic hours and 86 minutes (1.4 hours) after clinic hours. Clerical and administrative tasks including documentation, order entry, billing and coding, and system security accounted for nearly one-half of the total EHR time (157 minutes, 44.2%). Inbox management accounted for another 85 minutes (23.7%). CONCLUSIONS Primary care physicians spend more than one-half of their workday, nearly 6 hours, interacting with the EHR during and after clinic hours. EHR event logs can identify areas of EHR-related work that could be delegated, thus reducing workload, improving professional satisfaction, and decreasing burnout. Direct time-motion observations validated EHR-event log data as a reliable source of information regarding clinician time allocation. PMID:28893811

  2. Psychometric assessment of the IBS-D Daily Symptom Diary and Symptom Event Log.

    PubMed

    Rosa, Kathleen; Delgado-Herrera, Leticia; Zeiher, Bernie; Banderas, Benjamin; Arbuckle, Rob; Spears, Glen; Hudgens, Stacie

    2016-12-01

    Diarrhea-predominant irritable bowel syndrome (IBS-D) can considerably impact patients' lives. Patient-reported symptoms are crucial in understanding the diagnosis and progression of IBS-D. This study psychometrically evaluates the newly developed IBS-D Daily Symptom Diary and Symptom Event Log (hereafter, "Event Log") according to US regulatory recommendations. A US-based observational field study was conducted to understand cross-sectional psychometric properties of the IBS-D Daily Symptom Diary and Event Log. Analyses included item descriptive statistics, item-to-item correlations, reliability, and construct validity. The IBS-D Daily Symptom Diary and Event Log had no items with excessive missing data. With the exception of two items ("frequency of gas" and "accidents"), moderate to high inter-item correlations were observed among all items of the IBS-D Daily Symptom Diary and Event Log (day 1 range 0.67-0.90). Item scores demonstrated reliability, with the exception of the "frequency of gas" and "accidents" items of the Diary and "incomplete evacuation" item of the Event Log. The pattern of correlations of the IBS-D Daily Symptom Diary and Event Log item scores with generic and disease-specific measures was as expected, moderate for similar constructs and low for dissimilar constructs, supporting construct validity. Known-groups methods showed statistically significant differences and monotonic trends in each of the IBS-D Daily Symptom Diary item scores among groups defined by patients' IBS-D severity ratings ("none"/"mild," "moderate," or "severe"/"very severe"), supporting construct validity. Initial psychometric results support the reliability and validity of the items of the IBS-D Daily Symptom Diary and Event Log.

  3. User Centric Job Monitoring - a redesign and novel approach in the STAR experiment

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.; Zulkarneeva, Y.

    2014-06-01

User Centric Monitoring (UCM) has been a long-awaited feature in STAR, whereby programs, workflows, and system "events" can be logged, broadcast, and later analyzed. UCM collects and filters available job monitoring information from various resources and presents it to users in a user-centric rather than an administrative-centric view. The first implementation of a UCM approach in STAR was made in 2004 using a log4cxx plug-in back-end; it then evolved with a push toward a scalable database back-end (2006) and finally a Web-Service approach (2010, CSW4DB SBIR). The latter proved incomplete and did not address the evolving needs of the experiment, where streamlined messages for online (data acquisition) purposes and continued support for data mining and event analysis need to coexist in a unified, seamless approach. The code also proved hard to maintain. This paper presents the next evolutionary step of the UCM toolkit: a redesign and redirection of our latest attempt that acknowledges and integrates recent technologies in a simpler, maintainable, and yet scalable manner. The extended version of the job logging package is built upon a three-tier approach based on Task, Job, and Event, and features a Web-Service based logging API, a responsive AJAX-powered user interface, and a database back-end relying on MongoDB, which is uniquely suited to STAR's needs. In addition, we present details of the integration of this logging package with the STAR offline and online software frameworks. Leveraging the reported experience of the ATLAS and CMS experiments with the ESPER engine, we discuss and show how such an approach has been implemented in STAR for meta-data event-triggering stream processing and filtering. An ESPER-based solution fits well into the online data acquisition system, where many systems are monitored.
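A minimal sketch of what a three-tier (Task, Job, Event) log document might look like, shaped for a document store such as MongoDB. All field names here are illustrative assumptions, not STAR's actual schema or Web-Service API.

```python
import json
from datetime import datetime, timezone

# Hypothetical three-tier log message (Task > Job > Event), shaped for a
# document store such as MongoDB; field names are illustrative only.
def make_event(task_id, job_id, level, message):
    return {
        "task": task_id,                    # top-level user workflow
        "job": job_id,                      # one job within the task
        "event": {
            "ts": datetime.now(timezone.utc).isoformat(),
            "level": level,                 # e.g. INFO, WARNING, ERROR
            "msg": message,
        },
    }

doc = make_event("task-42", "job-7", "INFO", "reconstruction pass started")
payload = json.dumps(doc)                   # body of a Web-Service logging call
print(payload)
```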

  4. A New Essential Functions Installed DWH in Hospital Information System: Process Mining Techniques and Natural Language Processing.

    PubMed

    Honda, Masayuki; Matsumoto, Takehiro

    2017-01-01

Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data Warehouse (DWH) systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data, and non-structured data analysis via Natural Language Processing.
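As a minimal illustration of the process-mining side, counting directly-follows pairs over per-case traces is one of the simplest discovery steps; the clinical activity names below are invented.

```python
from collections import Counter

# Minimal process-mining step: count directly-follows pairs in per-case
# event traces. The traces are hypothetical clinical activity sequences.
def directly_follows(traces):
    pairs = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

traces = [
    ["admission", "lab-test", "medication", "discharge"],
    ["admission", "medication", "discharge"],
]
print(directly_follows(traces))
```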

  5. 10 CFR 73.46 - Fixed site physical protection systems, subsystems, components, and procedures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... barrier in the event of its penetration. If parking facilities are provided for employees or visitors... bullet resisting walls, doors, ceiling, floor, and windows. (ii) When the licensee has cause to suspect..., and name of the individual to be visited in a log. The licensee shall retain each log as a record for...

  6. 10 CFR 73.46 - Fixed site physical protection systems, subsystems, components, and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... barrier in the event of its penetration. If parking facilities are provided for employees or visitors... bullet resisting walls, doors, ceiling, floor, and windows. (ii) When the licensee has cause to suspect..., and name of the individual to be visited in a log. The licensee shall retain each log as a record for...

  7. Post-wildfire management

    Treesearch

    Jonathan W. Long; Carl Skinner; Susan Charnley; Ken Hubbert; Lenya Quinn-Davidson; Marc Meyer

    2014-01-01

    Wildfires, especially large, severe, and unmanageable events, exert major influences on socioecological systems, not only through risks to life and property, but also losses of important values associated with mature forest stands. These events prompt decisions about post-wildfire management interventions, including short-term emergency responses, salvage logging, and...

  8. Introducing high performance distributed logging service for ACS

    NASA Astrophysics Data System (ADS)

    Avarias, Jorge A.; López, Joao S.; Maureira, Cristián; Sommer, Heiko; Chiozzi, Gianluca

    2010-07-01

The ALMA Common Software (ACS) is a software framework that provides the infrastructure for the Atacama Large Millimeter Array and other projects. ACS, based on CORBA, offers basic services and common design patterns for distributed software. Every properly built system needs to be able to log status and error information. Logging in a single computer scenario can be as easy as using fprintf statements. However, in a distributed system, it must provide a way to centralize all logging data in a single place without overloading the network or complicating the applications. ACS provides a complete logging service infrastructure in which every log has an associated priority and timestamp, allowing filtering at different levels of the system (application, service and clients). Currently the ACS logging service uses an implementation of the CORBA Telecom Log Service in a customized way, using only a minimal subset of the features provided by the standard. The most relevant feature used by ACS is the ability to treat the logs as event data that gets distributed over the network in a publisher-subscriber paradigm. For this purpose the CORBA Notification Service, which is resource intensive, is used. On the other hand, the Data Distribution Service (DDS) provides an alternative standard for publisher-subscriber communication for real-time systems, offering better performance and featuring decentralized message processing. This document describes how the new high-performance logging service of ACS has been modeled and developed using DDS, replacing the Telecom Log Service. Benefits and drawbacks are analyzed, and a benchmark comparing the two implementations is presented.
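The priority-and-filter behavior described above can be sketched with a toy publisher-subscriber channel. This mimics the pattern only; it uses none of the actual ACS, CORBA, or DDS APIs, and the priority levels are illustrative.

```python
import time

# Toy publisher-subscriber log channel: every record carries a priority and
# timestamp, and each subscriber filters by a minimum priority level.
DEBUG, INFO, WARNING, ERROR = 0, 1, 2, 3

class LogChannel:
    def __init__(self):
        self.subscribers = []               # (min_priority, callback) pairs

    def subscribe(self, min_priority, callback):
        self.subscribers.append((min_priority, callback))

    def publish(self, priority, message):
        record = {"ts": time.time(), "priority": priority, "msg": message}
        for min_priority, callback in self.subscribers:
            if priority >= min_priority:    # filtering at the subscriber level
                callback(record)

channel = LogChannel()
errors = []
channel.subscribe(ERROR, errors.append)     # this client only wants errors
channel.publish(INFO, "antenna 12 online")
channel.publish(ERROR, "correlator timeout")
print([r["msg"] for r in errors])
```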

  9. Increasing the Operational Value of Event Messages

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one at a time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses, and data mining are being developed by Lockheed Martin, and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.

  10. Data processing and case identification in an agricultural and logging morbidity surveillance study: Trends over time.

    PubMed

    Scott, Erika; Bell, Erin; Krupa, Nicole; Hirabayashi, Liane; Jenkins, Paul

    2017-09-01

    Agriculture and logging are dangerous industries, and though data on fatal injury exists, less is known about non-fatal injury. Establishing a non-fatal injury surveillance system is a top priority. Pre-hospital care reports and hospitalization data were explored as a low-cost option for ongoing surveillance of occupational injury. Using pre-hospital care report free-text and location codes, along with hospital ICD-9-CM external cause of injury codes, we created a surveillance system that tracked farm and logging injuries. In Maine and New Hampshire, 1585 injury events were identified (2008-2010). The incidence of injuries was 12.4/1000 for agricultural workers, compared to 10.4/1000 to 12.2/1000 for logging workers. These estimates are consistent with other recent estimates. This system is limited to traumatic injury for which medical treatment is administered, and is limited by the accuracy of coding and spelling. This system has the potential to be both sustainable and low cost. © 2017 Wiley Periodicals, Inc.
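The case-identification logic above can be sketched as free-text keyword and external-cause-code matching. The term lists below are illustrative assumptions, not the study's actual case definition (E919.0, accidents caused by agricultural machines, is a real ICD-9-CM external cause code).

```python
# Sketch of case identification: flag records whose free-text narrative
# mentions farm or logging terms, or whose E-codes match. Term lists are
# illustrative, not the study's actual case definition.
FARM_TERMS = {"tractor", "barn", "silo", "livestock"}
LOGGING_TERMS = {"chainsaw", "skidder", "felled tree"}
FARM_ECODES = {"E919.0"}        # ICD-9-CM: accidents caused by agricultural machines

def classify(record):
    text = record.get("narrative", "").lower()
    codes = set(record.get("ecodes", []))
    if codes & FARM_ECODES or any(t in text for t in FARM_TERMS):
        return "farm"
    if any(t in text for t in LOGGING_TERMS):
        return "logging"
    return None

print(classify({"narrative": "pinned under tractor in field", "ecodes": []}))
```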

  11. Amino-terminal pro-B-type natriuretic peptide and high-sensitivity C-reactive protein but not cystatin C predict cardiovascular events in male patients with peripheral artery disease independently of ambulatory pulse pressure.

    PubMed

    Skoglund, Per H; Arpegård, Johannes; Ostergren, Jan; Svensson, Per

    2014-03-01

    Patients with peripheral arterial disease (PAD) are at high risk for cardiovascular (CV) events. We have previously shown that ambulatory pulse pressure (APP) predicts CV events in PAD patients. The biomarkers amino-terminal pro-B-type natriuretic peptide (NT-proBNP), high-sensitivity C-reactive protein (hs-CRP), and cystatin C are related to a worse outcome in patients with CV disease, but their predictive values have not been studied in relation to APP. Blood samples and 24-hour measurements of ambulatory blood pressure were examined in 98 men referred for PAD evaluation during 1998-2001. Patients were followed for a median of 71 months. The outcome variable was CV events defined as either CV mortality or any hospitalization for myocardial infarction, stroke, or coronary revascularization. The predictive values of log(NT-proBNP), log(hs-CRP), and log(cystatin C) alone and together with APP were assessed by multivariable Cox regression. Area under the curve (AUC) and net reclassification improvement (NRI) were calculated compared with a model containing other significant risk factors. During follow-up, 36 patients had at least 1 CV event. APP, log(NT-proBNP), and log(hs-CRP) all predicted CV events in univariable analysis, whereas log(cystatin C) did not. In multivariable analysis log(NT-proBNP) (hazard ratio (HR) = 1.62; 95% confidence interval (CI) = 1.05-2.51) and log(hs-CRP) (HR = 1.63; 95% CI = 1.19-2.24) predicted events independently of 24-hour PP. The combination of log(NT-proBNP), log(hs-CRP), and average day PP improved risk discrimination (AUC = 0.833 vs. 0.736; P < 0.05) and NRI (37%; P < 0.01) when added to other significant risk factors. NT-proBNP and hs-CRP predict CV events independently of APP and the combination of hs-CRP, NT-proBNP, and day PP improves risk discrimination in PAD patients.

  12. 76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...

  13. Security middleware infrastructure for DICOM images in health information systems.

    PubMed

    Kallepalli, Vijay N V; Ehikioya, Sylvanus A; Camorlinga, Sergio; Rueda, Jose A

    2003-12-01

    In health care, it is mandatory to maintain the privacy and confidentiality of medical data. To achieve this, a fine-grained access control and an access log for accessing medical images are two important aspects that need to be considered in health care systems. Fine-grained access control provides access to medical data only to authorized persons based on priority, location, and content. A log captures each attempt to access medical data. This article describes an overall middleware infrastructure required for secure access to Digital Imaging and Communication in Medicine (DICOM) images, with an emphasis on access control and log maintenance. We introduce a hybrid access control model that combines the properties of two existing models. A trust relationship between hospitals is used to make the hybrid access control model scalable across hospitals. We also discuss events that have to be logged and where the log has to be maintained. A prototype of security middleware infrastructure is implemented.
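The two aspects emphasized above, a fine-grained access check plus a log entry for every attempt, can be sketched as follows; the roles and rules are hypothetical, not the article's actual model.

```python
import logging
from datetime import datetime, timezone

# Sketch: a fine-grained access check followed by an audit-log entry for
# every attempt, successful or not. Roles and rules are illustrative.
AUDIT = logging.getLogger("dicom.audit")
logging.basicConfig(level=logging.INFO)

RULES = {("radiologist", "read"), ("radiologist", "annotate"),
         ("clerk", "read")}

def access_image(user, role, action, image_id):
    allowed = (role, action) in RULES
    # Every attempt is logged, whether it succeeds or not.
    AUDIT.info("%s user=%s role=%s action=%s image=%s allowed=%s",
               datetime.now(timezone.utc).isoformat(),
               user, role, action, image_id, allowed)
    return allowed

print(access_image("jdoe", "clerk", "annotate", "IMG-001"))
```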

  14. National Centers for Environmental Prediction

    Science.gov Websites

  15. National Centers for Environmental Prediction

    Science.gov Websites

  16. Integrated system for well-to-well correlation with geological knowledge base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, K.; Doi, E.; Uchiyama, T.

    1987-05-01

A task of well-to-well correlation is an essential part of the reservoir description study. Since the task involves diverse data such as logs, dipmeter, seismic, and reservoir engineering, a system with simultaneous access to such data is desirable. A system was developed to aid stratigraphic correlation on a Xerox 1108 workstation, written in INTERLISP-D. The system uses log, dipmeter, seismic, and computer-processed results such as Litho-Analysis and LSA (Log Shape Analyzer). The system first defines zones, which are segmentations of log data into consistent layers using Litho-Analysis and LSA results. Each zone is defined as a minimum unit for correlation, with slot values of lithology, thickness, log values, and log shape such as bell, cylinder, and funnel. Using a user's input of local geological knowledge such as depositional environment, the system selects marker beds and performs correlation among the wells chosen from the base map. Correlation is performed first with markers and then with sandstones of lesser lateral extent. Structural dip and seismic horizon are guides for seeking a correlatable event. Knowledge of sand body geometry, such as the ratio of thickness to width, is also used to provide a guide on how far a correlation should be made. Correlation results produced by the system are displayed on the screen for the user to examine and modify. The system has been tested with data sets from several depositional settings and has proven to be a useful tool for correlation work. The results are stored as a database for structural mapping and reservoir engineering study.

  17. Robust Covariate-Adjusted Log-Rank Statistics and Corresponding Sample Size Formula for Recurrent Events Data

    PubMed Central

    Song, Rui; Kosorok, Michael R.; Cai, Jianwen

    2009-01-01

Recurrent events data are frequently encountered in clinical trials. This article develops robust covariate-adjusted log-rank statistics applied to recurrent events data with arbitrary numbers of events under independent censoring, and the corresponding sample size formula. The proposed log-rank tests are robust with respect to different data-generating processes and are adjusted for predictive covariates. It reduces to the Kong and Slud (1997, Biometrika 84, 847–862) setting in the case of a single event. The sample size formula is derived based on the asymptotic normality of the covariate-adjusted log-rank statistics under certain local alternatives and a working model for baseline covariates in the recurrent event data context. When the effect size is small and the baseline covariates do not contain significant information about event times, it reduces to the same form as that of Schoenfeld (1983, Biometrics 39, 499–503) for cases of a single event or independent event times within a subject. We carry out simulations to study the control of type I error and the comparison of powers between several methods in finite samples. The proposed sample size formula is illustrated using data from an rhDNase study. PMID:18162107

  18. Requirements-Driven Log Analysis Extended Abstract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2012-01-01

Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
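A minimal example of checking a log against a requirement: every "open" must eventually be matched by a "close". The event shape and the property itself are invented for illustration, not from the paper.

```python
# Sketch of requirements-driven log analysis: check that every "open" event
# is eventually followed by a matching "close" event. Event shape (dicts
# with "action" and "resource" fields) is hypothetical.
def check_open_close(log):
    open_resources = set()
    violations = []
    for event in log:
        if event["action"] == "open":
            open_resources.add(event["resource"])
        elif event["action"] == "close":
            if event["resource"] not in open_resources:
                violations.append(("close-without-open", event["resource"]))
            open_resources.discard(event["resource"])
    # Anything still open at end-of-log violates the requirement.
    violations.extend(("never-closed", r) for r in sorted(open_resources))
    return violations

log = [
    {"action": "open", "resource": "fileA"},
    {"action": "open", "resource": "fileB"},
    {"action": "close", "resource": "fileA"},
]
print(check_open_close(log))    # fileB is opened but never closed
```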

  19. GTSO: Global Trace Synchronization and Ordering Mechanism for Wireless Sensor Network Monitoring Platforms.

    PubMed

    Navia, Marlon; Campelo, José Carlos; Bonastre, Alberto; Ors, Rafael

    2017-12-23

Monitoring is one of the best ways to evaluate the behavior of computer systems. When the monitored system is a distributed system, such as a wireless sensor network (WSN), the monitoring operation must also be distributed, providing a distributed trace for further analysis. The temporal sequence of occurrence of the events registered by the distributed monitoring platform (DMP) must be correctly established to provide cause-effect relationships between them, so the logs obtained in different monitor nodes must be synchronized. Many synchronization mechanisms applied to DMPs consist of adjusting the internal clocks of the nodes to the same value as a reference time. However, these mechanisms can create an incoherent event sequence. This article presents a new method to achieve global synchronization of the traces obtained in a DMP. It is based on periodic synchronization signals that are received by the monitor nodes and logged along with the recorded events. This mechanism processes all traces and generates a global post-synchronized trace by scaling all registered times proportionally according to the synchronization signals. It is intended to be a simple but efficient offline mechanism. Its application in a WSN-DMP demonstrates that it guarantees a correct ordering of the events, avoiding the aforementioned issues.
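The proportional rescaling can be sketched as linear interpolation between consecutive sync signals: a local timestamp is mapped onto the global timeline using the local and global times of the two surrounding signals. The numbers below are invented.

```python
# Sketch of offline post-synchronization: each monitor node logs periodic
# sync signals alongside its events; local timestamps between two sync
# points are then rescaled linearly onto the global timeline.
def rescale(local_t, l1, l2, g1, g2):
    """Map a local timestamp between sync points (l1, l2) to global time."""
    return g1 + (local_t - l1) * (g2 - g1) / (l2 - l1)

# Example: this node's clock runs fast (10.2 local units per 10 global units).
l1, l2 = 100.0, 110.2        # sync signals as seen by the node's clock
g1, g2 = 500.0, 510.0        # the same signals on the global timeline
print(rescale(105.1, l1, l2, g1, g2))
```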

  20. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
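The decision rule reported above comes down to two thresholds from the abstract (rate of power increase below 1.25 and percent of expected power below 200%); a sketch, with hypothetical function and argument names:

```python
# Sketch of the decision rule reported in the abstract: tPA treatment was
# likely to succeed when percent of expected power stayed below 200% and
# the rate of power increase stayed below 1.25. Function and argument
# names are ours, not part of the HVAD log-file format.
def tpa_likely_successful(percent_expected_power, power_rise_rate):
    return percent_expected_power < 200.0 and power_rise_rate < 1.25

print(tpa_likely_successful(130.9, 0.61))   # mean profile of successful cases
print(tpa_likely_successful(196.1, 2.87))   # mean profile of unsuccessful cases
```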

  1. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    NASA Astrophysics Data System (ADS)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but is also highly time-consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy and the labelled data fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
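The labelling step can be sketched as attributing each stoppage interval to the sub-system of the alarms active during it. The alarm codes, categories, and the majority-vote rule below are our assumptions, not the paper's actual taxonomy.

```python
# Sketch: label a stoppage interval with the turbine sub-system of the
# alarms active during it; no alarms is treated as a planned stop here.
ALARM_SUBSYSTEM = {101: "pitch", 205: "gearbox", 300: "grid"}

def label_stoppage(stoppage, alarms):
    """stoppage: (start, end); alarms: list of (timestamp, code)."""
    start, end = stoppage
    active = [ALARM_SUBSYSTEM.get(code, "other")
              for ts, code in alarms if start <= ts <= end]
    if not active:
        return "routine maintenance"        # assumption: no alarms = planned
    # Attribute the stoppage to the most frequent sub-system alarm.
    return max(set(active), key=active.count)

alarms = [(12, 205), (13, 205), (14, 300)]
print(label_stoppage((10, 20), alarms))
```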

  2. A log-Weibull spatial scan statistic for time to event data.

    PubMed

    Usman, Iram; Rosychuk, Rhonda J

    2018-06-13

Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition, such as disease outbreaks. These statistics, accompanied by the appropriate distribution, can also identify geographic areas with either longer or shorter time to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data, and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effects of type I differential censoring and power have been investigated through simulated data. Methods are also illustrated on time-to-specialist-visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found northern regions of Alberta had longer times to specialist visit than other areas. We propose the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters for time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.

  3. Detection of anomalous events

    DOEpatents

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configurable to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
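A common way to make the low-probability framing above concrete is to score events by negative log-probability under a fitted model, so rarer events score higher. The empirical frequency model here is a deliberate simplification, not the patented algorithm.

```python
import math

# Sketch of probability-based anomaly scoring: score = -log(p) under a
# fitted event-probability model (here, a simple empirical frequency table).
def anomaly_scores(events):
    n = len(events)
    freq = {}
    for e in events:
        freq[e] = freq.get(e, 0) + 1
    return {e: -math.log(c / n) for e, c in freq.items()}

# A made-up event stream: one rare event among many routine ones.
stream = ["login"] * 90 + ["port-scan"] * 1 + ["dns"] * 9
scores = anomaly_scores(stream)
print(sorted(scores, key=scores.get, reverse=True))  # rarest first
```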

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, Jason

WLS gathers all known relevant contextual data along with standard event log information, processes it into an easily consumable format for analysis by third-party tools, and forwards the logs to any compatible log server.

  5. An open-source data storage and visualization back end for experimental data.

    PubMed

    Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert; Nielsen, Jane H; Chorkendorff, Ib

    2014-04-01

    In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high resilience to equipment failure, whereas the central storage of data dramatically eases backup and data exchange. The visualization front end allows direct monitoring of acquired data to see live progress of long-duration experiments. This enables the user to alter experimental conditions based on these data and to interfere with the experiment if needed. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status of long-duration experiments, and implementation of instant alarms in the event of failure.
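One of the surveillance features mentioned, instant alarms in the event of failure, can be sketched as a staleness check on continuously logged parameters; the threshold and parameter names are assumptions.

```python
import time

# Sketch of one surveillance feature: raise an alarm when a continuously
# logged system parameter stops updating (possible equipment failure).
# The 60-second staleness threshold is an assumption.
def stale_parameters(last_logged, now, max_age_s=60):
    """last_logged: {parameter: unix timestamp of most recent sample}."""
    return [p for p, ts in last_logged.items() if now - ts > max_age_s]

now = time.time()
last_logged = {"chamber_pressure": now - 5, "bake_temp": now - 300}
print(stale_parameters(last_logged, now))   # bake_temp has gone silent
```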

  6. GTSO: Global Trace Synchronization and Ordering Mechanism for Wireless Sensor Network Monitoring Platforms

    PubMed Central

    Bonastre, Alberto; Ors, Rafael

    2017-01-01

Monitoring is one of the best ways to evaluate the behavior of computer systems. When the monitored system is a distributed system, such as a wireless sensor network (WSN), the monitoring operation must also be distributed, providing a distributed trace for further analysis. The temporal sequence of occurrence of the events registered by the distributed monitoring platform (DMP) must be correctly established to provide cause-effect relationships between them, so the logs obtained in different monitor nodes must be synchronized. Many synchronization mechanisms applied to DMPs consist of adjusting the internal clocks of the nodes to the same value as a reference time. However, these mechanisms can create an incoherent event sequence. This article presents a new method to achieve global synchronization of the traces obtained in a DMP. It is based on periodic synchronization signals that are received by the monitor nodes and logged along with the recorded events. This mechanism processes all traces and generates a global post-synchronized trace by scaling all registered times proportionally according to the synchronization signals. It is intended to be a simple but efficient offline mechanism. Its application in a WSN-DMP demonstrates that it guarantees a correct ordering of the events, avoiding the aforementioned issues. PMID:29295494

  7. Anomalous event diagnosis for environmental satellite systems

    NASA Technical Reports Server (NTRS)

    Ramsay, Bruce H.

    1993-01-01

The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven-days-per-week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper-based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems and the performance evaluation of the on-site computer and communication operations contractor.

  8. Leveraging Semantic Labels for Multi-level Abstraction in Medical Process Mining and Trace Comparison.

    PubMed

    Leonardi, Giorgio; Striani, Manuel; Quaglini, Silvana; Cavallini, Anna; Montani, Stefania

    2018-05-21

Many medical information systems record data about the executed process instances in the form of an event log. In this paper, we present a framework able to convert actions in the event log into higher-level concepts, at different levels of abstraction, on the basis of domain knowledge. Abstracted traces are then provided as an input to trace comparison and semantic process discovery. Our abstraction mechanism is able to manage nontrivial situations, such as interleaved actions or delays between two actions that abstract to the same concept. Trace comparison resorts to a similarity metric able to take into account abstraction phase penalties, and to deal with quantitative and qualitative temporal constraints in abstracted traces. As for process discovery, we rely on classical algorithms embedded in the framework ProM, made semantic by the capability of abstracting the actions on the basis of their conceptual meaning. The approach has been tested in stroke care, where we adopted abstraction and trace comparison to cluster event logs of different stroke units, to highlight (in)correct behavior, abstracting from details. We also provide process discovery results, showing how the abstraction mechanism allows one to obtain stroke process models that are more easily interpretable by neurologists. Copyright © 2018. Published by Elsevier Inc.
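The core abstraction idea—mapping low-level log actions to higher-level concepts and merging adjacent actions that abstract to the same concept—can be sketched as below. The hierarchy and action names are hypothetical stand-ins, not the paper's actual ontology.

```python
# Hypothetical is-a hierarchy: each action maps to its concept at
# abstraction level 0 (specific) and level 1 (general).
HIERARCHY = {
    "CT_scan":       ["brain_imaging", "diagnostics"],
    "MRI_scan":      ["brain_imaging", "diagnostics"],
    "tPA_injection": ["thrombolysis",  "treatment"],
}

def abstract_trace(trace, level):
    """Replace each action by its concept at the given abstraction level,
    merging consecutive actions that abstract to the same concept."""
    out = []
    for action in trace:
        # Unknown actions abstract to themselves at every level.
        concept = HIERARCHY.get(action, [action, action])[level]
        if not out or out[-1] != concept:   # merge adjacent duplicates
            out.append(concept)
    return out

trace = ["CT_scan", "MRI_scan", "tPA_injection"]
print(abstract_trace(trace, 0))   # -> ['brain_imaging', 'thrombolysis']
print(abstract_trace(trace, 1))   # -> ['diagnostics', 'treatment']
```

The merging step is what handles the paper's "two actions that abstract to the same concept" case; interleaving and delays would need extra bookkeeping beyond this sketch.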

  9. TraceContract

    NASA Technical Reports Server (NTRS)

Havelund, Klaus; Barringer, Howard

    2012-01-01

TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. These include Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
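TraceContract itself is a Scala DSL; the following Python sketch only illustrates the underlying idea of a data-parameterized monitor stepping over a trace of events, using a hypothetical "every opened file is eventually closed" property rather than TraceContract's actual API.

```python
class Monitor:
    """Minimal sketch of a trace monitor in the spirit of TraceContract:
    state parameterized by data (the file name) is carried across events,
    and violations are collected rather than raised."""
    def __init__(self):
        self.open_files = set()
        self.violations = []

    def step(self, event):
        kind, f = event
        if kind == "open":
            self.open_files.add(f)
        elif kind == "close":
            if f not in self.open_files:
                self.violations.append(f"close without open: {f}")
            self.open_files.discard(f)

    def end(self):
        # At end of trace, anything still open violates the property.
        for f in sorted(self.open_files):
            self.violations.append(f"never closed: {f}")
        return self.violations

m = Monitor()
for e in [("open", "a.log"), ("open", "b.log"), ("close", "a.log")]:
    m.step(e)
print(m.end())   # -> ['never closed: b.log']
```

In TraceContract the same property would be written as a formula or state machine in Scala, with reactions attached to violations.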

  10. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection.

    PubMed

    Ahn, Junho; Han, Richard

    2016-05-23

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users' daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period.

  11. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    PubMed Central

    Ahn, Junho; Han, Richard

    2016-01-01

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period. PMID:27223292

  12. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
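The "execute events until the event queue is emptied" scheme is the classic discrete-event loop, sketched minimally below; the event names are invented for illustration and none of the tool's inheritance or mode machinery is modeled.

```python
import heapq

def simulate(initial_events):
    """Run a minimal discrete-event loop: pop the earliest event,
    log it, and continue until the event queue is empty."""
    queue = list(initial_events)        # (time, description) pairs
    heapq.heapify(queue)
    log = []
    while queue:                        # run until the queue is emptied
        time, event = heapq.heappop(queue)
        log.append((time, event))
        # A real handler could schedule follow-up events here, e.g.:
        # heapq.heappush(queue, (time + delay, effect))
    return log

trace = simulate([(5, "valve_open"), (1, "pump_on"), (3, "mode_change")])
print(trace)   # -> [(1, 'pump_on'), (3, 'mode_change'), (5, 'valve_open')]
```

The heap guarantees events are processed in time order regardless of the order in which they were scheduled, which is the essential property of the event-queue schema described.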

  13. Application of accelerator sources for pulsed neutron logging of oil and gas wells

    NASA Astrophysics Data System (ADS)

    Randall, R. R.

    1985-05-01

Dresser Atlas introduced the first commercial pulsed neutron oil well log in the early 1960s. This log had the capability of differentiating oil from salt water in a completed well. In the late 1970s the first continuous carbon/oxygen (C/O) log capable of differentiating oil from fresh water was introduced. The sources used in these commercial logs are radial geometry deuterium-tritium reaction devices with Cockcroft-Walton voltage multipliers providing the accelerator voltage. The commercial logging tools using these accelerators comprise scintillation detectors, power supplies, line drivers and receivers, and various timing and communications electronics. They are used to measure either the time decay or energy spectra of neutron-induced gamma events. The time decay information is useful in determining the neutron capture cross section, and the energy spectra are used to characterize inelastic neutron events.

  14. Decision-support information system to manage mass casualty incidents at a level 1 trauma center.

    PubMed

    Bar-El, Yaron; Tzafrir, Sara; Tzipori, Idan; Utitz, Liora; Halberthal, Michael; Beyar, Rafael; Reisner, Shimon

    2013-12-01

    Mass casualty incidents are probably the greatest challenge to a hospital. When such an event occurs, hospitals are required to instantly switch from their routine activity to conditions of great uncertainty and confront needs that exceed resources. We describe an information system that was uniquely designed for managing mass casualty events. The web-based system is activated when a mass casualty event is declared; it displays relevant operating procedures, checklists, and a log book. The system automatically or semiautomatically initiates phone calls and public address announcements. It collects real-time data from computerized clinical and administrative systems in the hospital, and presents them to the managing team in a clear graphic display. It also generates periodic reports and summaries of available or scarce resources that are sent to predefined recipients. When the system was tested in a nationwide exercise, it proved to be an invaluable tool for informed decision making in demanding and overwhelming situations such as mass casualty events.

  15. Why, What, and How to Log? Lessons from LISTEN

    ERIC Educational Resources Information Center

    Mostow, Jack; Beck, Joseph E.

    2009-01-01

    The ability to log tutorial interactions in comprehensive, longitudinal, fine-grained detail offers great potential for educational data mining--but what data is logged, and how, can facilitate or impede the realization of that potential. We propose guidelines gleaned over 15 years of logging, exploring, and analyzing millions of events from…

  16. Impact detection and analysis/health monitoring system for composites

    NASA Astrophysics Data System (ADS)

    Child, James E.; Kumar, Amrita; Beard, Shawn; Qing, Peter; Paslay, Don G.

    2006-05-01

This manuscript includes information from test evaluations and development of a smart event detection system for use in monitoring composite rocket motor cases for damaging impacts. The primary purpose of the system as a sentry for case impact event logging is accomplished through implementation of a passive network of miniaturized piezoelectric sensors, a logger with pre-determined force threshold levels, and analysis software. Empirical approaches to structural characterizations and network calibrations, along with implementation techniques, were successfully evaluated. Testing was performed on both unloaded (without propellant) and loaded rocket motors, with the cylindrical areas being of primary focus. The logged test impact data with known physical network parameters provided for impact location as well as force determination, typically within 3 inches of the actual impact location using a 4-foot network grid and force accuracy within 25% of the actual impact force. The simple empirical characterization approach, along with the robust/flexible sensor grids and battery-operated portable logger, shows promise of a system that can increase confidence in composite integrity for both new assets progressing through manufacturing processes and existing assets that may be in storage or transportation.

  17. Examining the Return on Investment of a Security Information and Event Management Solution in a Notional Department of Defense Network Environment

    DTIC Science & Technology

    2013-06-01

collection are the facts that devices that lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms

  18. Increased capture of pediatric surgical complications utilizing a novel case-log web application to enhance quality improvement.

    PubMed

    Fisher, Jason C; Kuenzler, Keith A; Tomita, Sandra S; Sinha, Prashant; Shah, Paresh; Ginsburg, Howard B

    2017-01-01

    Documenting surgical complications is limited by multiple barriers and is not fostered in the electronic health record. Tracking complications is essential for quality improvement (QI) and required for board certification. Current registry platforms do not facilitate meaningful complication reporting. We developed a novel web application that improves accuracy and reduces barriers to documenting complications. We deployed a custom web application that allows pediatric surgeons to maintain case logs. The program includes a module for entering complication data in real time. Reminders to enter outcome data occur at key postoperative intervals to optimize recall of events. Between October 1, 2014, and March 31, 2015, frequencies of surgical complications captured by the existing hospital reporting system were compared with data aggregated by our application. 780 cases were captured by the web application, compared with 276 cases registered by the hospital system. We observed an increase in the capture of major complications when compared to the hospital dataset (14 events vs. 4 events). This web application improved real-time reporting of surgical complications, exceeding the accuracy of administrative datasets. Custom informatics solutions may help reduce barriers to self-reporting of adverse events and improve the data that presently inform pediatric surgical QI. Diagnostic study/Retrospective study. Level III - case control study. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

For logging board angle values during balance training, it is necessary to develop a measurement system. This study will provide data for a balance study using the smartcard. Data acquisition is automatic. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard reading program is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved as a log file for exact documentation. This system makes study development easy and time-saving.

  20. Twelve Years of Interviews with the Inupiat people of Arctic Alaska: Report from a Community Workshop

    NASA Astrophysics Data System (ADS)

    Eisner, W. R.; Hinkel, K. M.; Cuomo, C.

    2015-12-01

On 20 August 2015, a workshop was held in Barrow, Alaska, which presented the highlights of 12 years of research connecting local indigenous knowledge of landscape processes with scientific research on arctic lakes, tundra changes, and permafrost stability. Seventy-six Iñupiat elders, hunters, and other knowledge-holders from the North Slope villages of Barrow, Atqasuk, Wainwright, Nuiqsut, and Anaktuvuk Pass were interviewed, and over 75 hours of videotaped interviews were produced. The interviews provided information and observations on landforms, lakes, erosion, permafrost degradation and thermokarst, changes in the environment and in animal behavior, human modification of lakes, tundra damage from 4-wheel off-road vehicles, tundra trail expansion, and other phenomena. Community concerns regarding the impact of environmental change on food procurement, animal migration, human travel routes, and the future of subsistence practices were also prominent themes. Following an interview, each videotaped session was logged. Each time an elder pointed to a location on a map and explained a landscape event/observation or told a story, the time-stamp in the video was recorded. Each logged event consisted of a code and a short account of the observation. From these reference sheets, a Geographic Information System (GIS) dataset was created. A logged account, with geographic coordinates, event code, and event description, is available for each videotape. The goal of the workshop was to report on our findings, thank the community for their support, and collaboratively develop plans for archiving and disseminating this data. A complete video library and searchable, printed and digital issues of the logging dataset for archiving in the communities were also produced.
Discussions with administrative personnel at the Tuzzy Library in Barrow and the Inupiat Heritage Center have enabled us to set standards and develop a timeline for turning over the library of videos and GIS data to the North Slope community.

  1. 22nd Annual Logistics Conference and Exhibition

    DTIC Science & Technology

    2006-04-20

Prognostics & Health Management at GE Dr. Piero P. Bonissone Industrial AI Lab GE Global Research NCD Select detection model Anomaly detection results...Mode 213 x Failure mode histogram 2130014 Anomaly detection from event-log data Anomaly detection from event-log data Diagnostics/ Prognostics Using...Failure Monitoring & Assessment Tactical C4ISR Sense Respond 7 •Diagnostics, Prognostics and health management

  2. Event Driven Messaging with Role-Based Subscriptions

    NASA Technical Reports Server (NTRS)

Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed

    2009-01-01

Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Messaging Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of data triggering event (mission), classification, sub-classification, and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. This program provides a framework for identifying connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDM-RBS provides the ability to send notifications over e-mail or pager rather than having to rely on a live human to do it. It is implemented as an Oracle application that uses Oracle relational database management system intrinsic functions. It is configurable to use the Oracle AQ JMS API or an external JMS provider for messaging. It fully integrates into the event-logging framework of SMDB (Service Management Database).
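Attribute-based routing of the kind described—matching message attributes against subscription rules and fanning out over the subscriber's chosen channel—can be sketched as follows. The attribute names, rules, and channels are illustrative only, not the actual SMDB schema.

```python
def matches(rule, msg):
    """A subscription rule matches when every required attribute
    value appears in the message."""
    return all(msg.get(k) == v for k, v in rule.items())

# Hypothetical subscriptions: each binds a user, a rule, and a channel.
subscriptions = [
    {"user": "ops1", "rule": {"classification": "alarm"},            "via": "SMS"},
    {"user": "sci2", "rule": {"mission": "MRO", "source": "uplink"}, "via": "JMS"},
    {"user": "sci3", "rule": {"mission": "MSL"},                     "via": "SMTP"},
]

msg = {"classification": "alarm", "mission": "MRO", "source": "uplink"}
routes = [(s["user"], s["via"]) for s in subscriptions if matches(s["rule"], msg)]
print(routes)   # -> [('ops1', 'SMS'), ('sci2', 'JMS')]
```

One message can satisfy several subscriptions at once, which is what makes the delivery both role-based and subscription-based rather than point-to-point.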

  3. Log-correlated random-energy models with extensive free-energy fluctuations: Pathologies caused by rare events as signatures of phase transitions

    NASA Astrophysics Data System (ADS)

    Cao, Xiangyu; Fyodorov, Yan V.; Le Doussal, Pierre

    2018-02-01

We address systematically an apparent nonphysical behavior of the free-energy moment generating function for several instances of the logarithmically correlated models: the fractional Brownian motion with Hurst index H =0 (fBm0) (and its bridge version), a one-dimensional model appearing in decaying Burgers turbulence with log-correlated initial conditions, and, finally, the two-dimensional log-correlated random-energy model (logREM) introduced in Cao et al. [Phys. Rev. Lett. 118, 090601 (2017), 10.1103/PhysRevLett.118.090601] based on the two-dimensional Gaussian free field with background charges and directly related to the Liouville field theory. All these models share anomalously large fluctuations of the associated free energy, with a variance proportional to the log of the system size. We argue that a seemingly nonphysical vanishing of the moment generating function for some values of parameters is related to the termination point transition (i.e., prefreezing). We study the associated universal log corrections in the frozen phase, both for logREMs and for the standard REM, filling a gap in the literature. For the above-mentioned integrable instances of logREMs, we predict the nontrivial free-energy cumulants describing non-Gaussian fluctuations on top of the Gaussian with extensive variance. Some of the predictions are tested numerically.

  4. Autobiographical memory sources of threats in dreams.

    PubMed

    Lafrenière, Alexandre; Lortie-Lussier, Monique; Dale, Allyson; Robidoux, Raphaëlle; De Koninck, Joseph

    2018-02-01

    Temporal sources of dream threats were examined through the paradigm of the Threat Simulation Theory. Two groups of young adults (18-24 years old), who did not experience severe threatening events in the year preceding their dream and reported a dream either with or without threats, were included. Participants (N = 119) kept a log of daily activities and a dream diary, indicating whether dream components referred to past experiences. The occurrence of oneiric threats correlated with the reporting of threats in the daily logs, their average severity, and the stress level experienced the day preceding the dream. The group whose dreams contained threats had significantly more references to temporal categories beyond one year than the group with dreams without threats. Our findings suggest that in the absence of recent highly negative emotional experiences, the threat simulation system selects memory traces of threatening events experienced in the past. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Accident diagnosis system based on real-time decision tree expert system

    NASA Astrophysics Data System (ADS)

    Nicolau, Andressa dos S.; Augusto, João P. da S. C.; Schirru, Roberto

    2017-06-01

Safety is one of the most studied topics when referring to power stations. For that reason, sensors and alarms play an important role in environmental and human protection. When an abnormal event happens, it triggers a chain of alarms that must be, somehow, checked by the control room operators. In this case, a diagnosis support system can help operators accurately identify the possible root cause of the problem in a short time. In this article, we present a computational model of a generic diagnosis support system based on artificial intelligence that was applied to the datasets of two real power stations: the Angra 1 Nuclear Power Plant and the Santo Antônio Hydroelectric Plant. The proposed system processes all the information logged in the sequence of events before a shutdown signal, using the expert knowledge inputted into an expert system, indicating the chain of events from the shutdown signal to its root cause. The results of both applications showed that the support system is a potential tool to help control room operators identify abnormal events, such as accidents, and consequently increase safety.
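A rule-based root-cause lookup over a sequence-of-events log might look like the sketch below. The alarm names and rules are hypothetical; the actual system encodes the experts' knowledge as a real-time decision tree over plant-specific signals.

```python
# Hypothetical diagnosis rules, ordered from most to least specific:
# (set of precursor alarms required in the log, diagnosed root cause).
RULES = [
    ({"coolant_flow_low", "pump_trip"},    "primary pump failure"),
    ({"coolant_flow_low"},                 "flow sensor fault"),
    ({"grid_freq_drop", "generator_trip"}, "external grid disturbance"),
]

def diagnose(event_log):
    """Return the root cause of the first rule whose precursor alarms
    all appear in the sequence of events before the shutdown."""
    observed = set(event_log)
    for precursors, cause in RULES:
        if precursors <= observed:      # all precursors were logged
            return cause
    return "unknown"

seq = ["coolant_flow_low", "pump_trip", "reactor_shutdown"]
print(diagnose(seq))   # -> 'primary pump failure'
```

Ordering the rules from most to least specific ensures that a richer alarm pattern is preferred over a subset pattern, mirroring how a decision tree descends to the most specific matching leaf.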

  6. National Centers for Environmental Prediction

    Science.gov Websites


  7. WebCIS: large scale deployment of a Web-based clinical information system.

    PubMed

    Hripcsak, G; Cimino, J J; Sengupta, S

    1999-01-01

WebCIS is a Web-based clinical information system. It sits atop the existing Columbia University clinical information system architecture, which includes a clinical repository, the Medical Entities Dictionary, an HL7 interface engine, and an Arden Syntax based clinical event monitor. WebCIS security features include authentication with secure tokens, authorization maintained in an LDAP server, SSL encryption, permanent audit logs, and application timeouts. WebCIS is currently used by 810 physicians at the Columbia-Presbyterian center of New York Presbyterian Healthcare to review and enter data into the electronic medical record. Current deployment challenges include maintaining adequate database performance despite complex queries, replacing large numbers of computers that cannot run modern Web browsers, and training users who have never logged onto the Web. Although the raised expectations and higher goals have increased deployment costs, the end result is a far more functional, far more available system.

  8. Escherichia coli bacteria density in relation to turbidity, streamflow characteristics, and season in the Chattahoochee River near Atlanta, Georgia, October 2000 through September 2008—Description, statistical analysis, and predictive modeling

    USGS Publications Warehouse

    Lawrence, Stephen J.

    2012-01-01

    Regression analyses show that E. coli density in samples was strongly related to turbidity, streamflow characteristics, and season at both sites. The regression equation chosen for the Norcross data showed that 78 percent of the variability in E. coli density (in log base 10 units) was explained by the variability in turbidity values (in log base 10 units), streamflow event (dry-weather flow or stormflow), season (cool or warm), and an interaction term that is the cross product of streamflow event and turbidity. The regression equation chosen for the Atlanta data showed that 76 percent of the variability in E. coli density (in log base 10 units) was explained by the variability in turbidity values (in log base 10 units), water temperature, streamflow event, and an interaction term that is the cross product of streamflow event and turbidity. Residual analysis and model confirmation using new data indicated the regression equations selected at both sites predicted E. coli density within the 90 percent prediction intervals of the equations and could be used to predict E. coli density in real time at both sites.
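The regression form described above, with the streamflow-turbidity interaction term, can be evaluated as in this sketch. The coefficients are hypothetical placeholders, not the fitted values from the report, which publishes the real equations.

```python
import math

def predict_ecoli(turbidity, stormflow, warm_season,
                  b0=1.0, b1=0.8, b2=0.5, b3=0.3, b4=0.4):
    """Evaluate the Norcross-type regression form:
    log10(E. coli) = b0 + b1*log10(turbidity) + b2*event + b3*season
                   + b4*(event * log10(turbidity))
    where event is 1 for stormflow (0 for dry-weather flow) and
    season is 1 for warm (0 for cool). Coefficients are illustrative."""
    x = math.log10(turbidity)
    log_density = (b0 + b1 * x + b2 * stormflow + b3 * warm_season
                   + b4 * stormflow * x)
    return 10 ** log_density     # back-transform from log10 units

# Stormflow sample in the warm season at 25 turbidity units.
density = predict_ecoli(turbidity=25.0, stormflow=1, warm_season=1)
```

The interaction term b4 lets the slope on log-turbidity differ between dry-weather flow and stormflow, which is exactly the "cross product of streamflow event and turbidity" term in both site equations.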

  9. Preserving anonymity in e-voting system using voter non-repudiation oriented scheme

    NASA Astrophysics Data System (ADS)

    Hamid, Isredza Rahmi A.; Radzi, Siti Nafishah Md; Rahman, Nurul Hidayah Ab; Wen, Chuah Chai; Abdullah, Nurul Azma

    2017-10-01

The voting system has been developed from the traditional paper ballot to electronic voting (e-voting). The e-voting system has high potential to be widely used in election events. However, the e-voting system still does not meet the most important security properties, which are voter authenticity and non-repudiation. This is because voters can simply vote again by entering other people's identification numbers. In this project, an electronic voting system using a voter non-repudiation oriented scheme will be developed. This system contains ten modules: log in, vote session, voter, candidate, open session, voting results, user account, initial score, logs, and reset vote count. In order to ensure there would be no non-repudiation issue, a voter non-repudiation oriented scheme concept will be adapted and implemented in the system. This system will be built using Microsoft Visual Studio 2013 and can only be accessed using personal computers at the voting center. This project will be beneficial for future use in order to overcome the non-repudiation issue.

10. EARS: Repositioning data management near data acquisition.

    NASA Astrophysics Data System (ADS)

    Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar

    2016-04-01

The EU FP7 projects Eurofleets and Eurofleets2 are a Europe-wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current, and completed cruises and on details of ocean-going research vessels and specialized equipment, and to durably improve the cost-effectiveness of cruises. Within this context, logging information on how, when, and where anything happens on board the vessel is crucial for data users at a later stage. This forms an essential step in the process of data quality control, as it can assist in understanding anomalies and unexpected trends recorded in the acquired data sets. In this way, the completeness of the metadata is improved, as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats, and pieces of software across the European research fleet. At the time the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users who need to log general-purpose information and events on board whenever they access a different platform, losing the opportunity to produce this valuable metadata on board. Among the many goals of the Eurofleets project, a very important task is the development of "event log software" called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the gap existing at the moment in metadata description, which only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation.
The adoption of a common procedure to log survey events and a common terminology to describe them is crucial to providing a friendly and successful on-board metadata creation procedure for the whole European fleet. The possibility of automatically reporting metadata and general-purpose data will simplify the work of scientists and data managers with regard to data transmission. Improved accuracy and completeness of metadata are expected when events are recorded at acquisition time. This will also enhance multiple uses of the data, as it allows verification of the different requirements existing in different disciplines.

  11. Distributed Intrusion Detection for Computer Systems Using Communicating Agents

    DTIC Science & Technology

    2000-01-01

    Log for a variety of suspicious events (like repeated failed login attempts), and alerts the IDAgent processes immediately via pipes when it finds...UX, IBM LAN Server, Raptor Eagle Firewalls, ANS Interlock Firewalls, and SunOS BSM. This program appears to be robust across many platforms. EMERALD ...Neumann, 1999] is a system developed by SRI International with research funding from DARPA. The EMERALD project will be the successor to Next

  12. Assuring image authenticity within a data grid using lossless digital signature embedding and a HIPAA-compliant auditing system

    NASA Astrophysics Data System (ADS)

    Lee, Jasper C.; Ma, Kevin C.; Liu, Brent J.

    2008-03-01

    A Data Grid for medical images has been developed at the Image Processing and Informatics Laboratory, USC, to provide distribution and fault-tolerant storage of medical imaging studies across Internet2 and the public domain. Although back-up policies and grid certificates guarantee privacy and authenticity of grid-access-points, there is still no method to guarantee that the sensitive DICOM images have not been altered or corrupted during transmission across a public domain. This paper takes steps toward achieving full image transfer security within the Data Grid by utilizing DICOM image authentication and a HIPAA-compliant auditing system. The 3-D lossless digital signature embedding procedure, also developed at the IPILab, involves a private 64-byte signature that is embedded into each original DICOM image volume; on the receiving end the signature can be extracted and verified following the DICOM transmission. The HIPAA-Compliant Auditing System (H-CAS) is required to monitor embedding and verification events, and allows monitoring of other grid activity as well. The H-CAS system federates the logs of transmission and authentication events at each grid-access-point and stores them in a HIPAA-compliant database. The auditing toolkit is installed at the local grid-access-point and utilizes Syslog [1], a client-server standard for log messaging over an IP network, to send messages to the H-CAS centralized database. By integrating digital image signatures and centralized logging capabilities, DICOM image integrity within the Medical Imaging and Informatics Data Grid can be monitored and guaranteed without any loss of image quality.
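The verify-then-audit flow can be sketched with a generic keyed hash standing in for the paper's 3-D lossless embedding. Note the simplifications: the real method embeds the 64-byte signature losslessly inside the DICOM pixel data, whereas this sketch merely appends it, and the key name and record fields are invented for illustration:

```python
import hmac, hashlib, json
from datetime import datetime, timezone

KEY = b"shared-grid-secret"   # stand-in for the grid's private signing key
SIG_LEN = 64                  # the paper uses a 64-byte signature

def sign(image: bytes) -> bytes:
    """Attach a 64-byte keyed signature (here: hex SHA-256 HMAC)."""
    return image + hmac.new(KEY, image, hashlib.sha256).hexdigest().encode()

def verify(blob: bytes) -> tuple[bytes, bool]:
    """Split off the trailing signature and check it against the payload."""
    image, sig = blob[:-SIG_LEN], blob[-SIG_LEN:]
    expected = hmac.new(KEY, image, hashlib.sha256).hexdigest().encode()
    return image, hmac.compare_digest(sig, expected)

def audit_record(event: str, ok: bool) -> str:
    """One H-CAS-style log line destined for the central auditing database."""
    return json.dumps({"ts": datetime.now(timezone.utc).isoformat(),
                       "event": event, "verified": ok})

blob = sign(b"\x00\x01DICOM-pixel-data")
image, ok = verify(blob)
rec = audit_record("dicom-receive", ok)
```

The point mirrored from the paper is the pairing: every embedding or verification outcome, pass or fail, becomes a timestamped record shipped to a central store, so integrity failures are both detected and auditable.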

  13. Singular and interactive effects of blowdown, salvage logging, and wildfire in sub-boreal pine systems

    USGS Publications Warehouse

    D'Amato, A.W.; Fraver, S.; Palik, B.J.; Bradford, J.B.; Patty, L.

    2011-01-01

    The role of disturbance in structuring vegetation is widely recognized; however, we are only beginning to understand the effects of multiple interacting disturbances on ecosystem recovery and development. Of particular interest is the impact of post-disturbance management interventions, particularly in light of the global controversy surrounding the effects of salvage logging on forest ecosystem recovery. Studies of salvage logging impacts have focused on the effects of post-disturbance salvage logging within the context of a single natural disturbance event. There have been no formal evaluations of how these effects may differ when followed in short sequence by a second, high-severity natural disturbance. To evaluate the impact of this management practice within the context of multiple disturbances, we examined the structural and woody plant community responses of sub-boreal Pinus banksiana systems to a rapid sequence of disturbances. Specifically, we compared responses to Blowdown (B), Fire (F), Blowdown-Fire (BF), and Blowdown-Salvage-Fire (BSF) treatments, and compared these to undisturbed control (C) stands. Comparisons between BF and BSF indicated that the primary effect of salvage logging was a decrease in the abundance of structural legacies, such as downed woody debris and snags. Both of these compound disturbance sequences (BF and BSF) resulted in similar woody plant communities, largely dominated by Populus tremuloides; however, there was greater homogeneity in community composition in salvage-logged areas. Areas experiencing solely fire (F stands) were dominated by P. banksiana regeneration, and blowdown areas (B stands) were largely characterized by regeneration from shade-tolerant conifer species. Our results suggest that salvage logging impacts on woody plant communities are diminished when followed by a second high-severity disturbance; however, impacts on structural legacies persist. Provisions for the retention of snags, downed logs, and surviving trees as part of salvage logging operations will minimize these structural impacts and may allow for greater ecosystem recovery following these disturbance combinations. © 2011 Elsevier B.V.

  14. Industrial application of semantic process mining

    NASA Astrophysics Data System (ADS)

    Espen Ingvaldsen, Jon; Atle Gulla, Jon

    2012-05-01

    Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. In recent years, process mining prototypes have incorporated elements from semantics and data mining, and have targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.
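The core extraction step behind such tools can be illustrated with a minimal directly-follows computation over an event log. The toy log below (case id, activity, already ordered by time within each case) is a generic simplification, not tied to any particular mining framework:

```python
from collections import Counter, defaultdict

# A toy event log: (case_id, activity), ordered by timestamp within each case.
event_log = [
    ("c1", "receive order"), ("c1", "check stock"), ("c1", "ship"),
    ("c2", "receive order"), ("c2", "check stock"),
    ("c2", "back-order"), ("c2", "ship"),
]

def directly_follows(log):
    """Count how often activity a is immediately followed by b within a case."""
    traces = defaultdict(list)
    for case, activity in log:
        traces[case].append(activity)
    pairs = Counter()
    for trace in traces.values():
        pairs.update(zip(trace, trace[1:]))  # consecutive (a, b) pairs
    return pairs

df = directly_follows(event_log)
```

Counts like these are the raw material from which discovery algorithms reconstruct a process model; the semantic and visual layers the article discusses sit on top of exactly this kind of relation.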

  15. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    To complete all of one's work within a given time, and to the required quality, the underlying processes must be understood in detail. Personal information and communication tools now capture much of factual daily life: schedules, location traces, environmental data and, more generally, social media activity. These systems make data available both for analysis of the event logs they generate and for process analysis that combines environmental and location information. Process mining can exploit these real-life processes through the event logs already present in such datasets, whether sensed automatically or labelled by the user. The resulting models can be used to redesign a user's daily flow and to understand these processes in more detail: improving the quality of the processes we go through in daily life requires looking closely at each of them and, after analysing them, making changes to obtain better results. Accordingly, we applied process mining techniques to a single dataset, collected in Korea, combining seven different subjects. The paper comments on the efficiency of the processes in the event logs with respect to time management.
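Deriving time-management metrics from such logs largely reduces to computing durations between consecutive timestamped events. A minimal sketch for one subject's day (the activities and times are invented for illustration, not drawn from the Korean dataset):

```python
from datetime import datetime

# Toy timestamped log for one subject: (activity, start time), time-ordered.
day_log = [
    ("commute", datetime(2018, 4, 2, 8, 0)),
    ("work",    datetime(2018, 4, 2, 9, 0)),
    ("lunch",   datetime(2018, 4, 2, 12, 30)),
    ("work",    datetime(2018, 4, 2, 13, 15)),
    ("commute", datetime(2018, 4, 2, 18, 0)),
]

def activity_minutes(log):
    """Total minutes per activity, each lasting until the next event starts."""
    totals = {}
    for (act, start), (_, end) in zip(log, log[1:]):
        totals[act] = totals.get(act, 0) + (end - start).total_seconds() / 60
    return totals

mins = activity_minutes(day_log)
```

Aggregating such per-activity durations across days and subjects is the kind of summary on which efficiency judgements about time management can then be based.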

  16. A wireless high-speed data acquisition system for geotechnical centrifuge model testing

    NASA Astrophysics Data System (ADS)

    Gaudin, C.; White, D. J.; Boylan, N.; Breen, J.; Brown, T.; DeCatania, S.; Hortin, P.

    2009-09-01

    This paper describes a novel high-speed wireless data acquisition system (WDAS) developed at the University of Western Australia for operation onboard a geotechnical centrifuge, in an enhanced gravitational field of up to 300 times Earth's gravity. The WDAS system consists of up to eight separate miniature units distributed around the circumference of a 0.8 m diameter drum centrifuge, communicating with the control room via wireless Ethernet. Each unit is capable of powering and monitoring eight instrument channels at a sampling rate of up to 1 MHz at 16-bit resolution. The data are stored within the logging unit in solid-state memory, but may also be streamed in real-time at low frequency (up to 10 Hz) to the centrifuge control room, via wireless transmission. The high-speed logging runs continuously within a circular memory (buffer), allowing for storage of a pre-trigger segment of data prior to an event. To suit typical geotechnical modelling applications, the system can record low-speed data continuously, until a burst of high-speed acquisition is triggered when an experimental event occurs, after which the system reverts back to low-speed acquisition to monitor the aftermath of the event. Unlike PC-based data acquisition solutions, this system performs the full sequence of amplification, conditioning, digitization and storage on a single circuit board via an independent micro-controller allocated to each pair of instrumented channels. This arrangement is efficient, compact and physically robust to suit the centrifuge environment. This paper details the design specification of the WDAS along with the software interface developed to control the units. Results from a centrifuge test of a submarine landslide are used to illustrate the performance of the new WDAS.
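The pre-trigger behaviour described above can be sketched with a bounded circular buffer: low-speed samples are retained continuously, and when a trigger fires the buffered pre-event segment is frozen before the high-speed burst begins. The buffer size and class name here are arbitrary illustrations, not the WDAS specification:

```python
from collections import deque

class PreTriggerLogger:
    """Keep the last `pretrigger` samples; on trigger, snapshot them."""
    def __init__(self, pretrigger: int):
        # Circular memory: once full, the oldest sample drops off automatically.
        self.ring = deque(maxlen=pretrigger)
        self.capture = None

    def sample(self, value):
        self.ring.append(value)

    def trigger(self):
        # Freeze the pre-trigger segment before high-speed acquisition starts.
        self.capture = list(self.ring)

logger = PreTriggerLogger(pretrigger=4)
for v in range(10):      # continuous low-speed acquisition
    logger.sample(v)
logger.trigger()         # experimental event occurs
```

The fixed-size deque gives the essential property of the circular memory: the pre-trigger window is always available at the moment an unpredictable experimental event occurs, without unbounded storage.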

  17. Remote Sensing Analysis of Forest Disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2015-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  18. Remote sensing analysis of forest disturbances

    NASA Technical Reports Server (NTRS)

    Asner, Gregory P. (Inventor)

    2012-01-01

    The present invention provides systems and methods to automatically analyze Landsat satellite data of forests. The present invention can easily be used to monitor any type of forest disturbance such as from selective logging, agriculture, cattle ranching, natural hazards (fire, wind events, storms), etc. The present invention provides a large-scale, high-resolution, automated remote sensing analysis of such disturbances.

  19. 77 FR 8160 - Quality Assurance Requirements for Continuous Opacity Monitoring Systems at Stationary Sources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ..., glass windows (uncoated or anti-reflection coated, but with no curvature), lenses with mounts where such... requirements must I meet if I use a substitute opacity monitor? In the event that your certified opacity... the above in the maintenance log or in other appropriate permanently maintained records. 10.7 When do...

  20. Measurement of diffractive dissociation cross sections in pp collisions at √s = 7 TeV

    DOE PAGES

    Khachatryan, Vardan

    2015-07-06

    Measurements of diffractive dissociation cross sections in pp collisions at √s = 7 TeV are presented in kinematic regions defined by the masses MX and MY of the two final-state hadronic systems separated by the largest rapidity gap in the event. The differential cross sections are measured as a function of ξX = MX²/s in the region −5.5 < log10 ξX < −2.5, for log10 MY < 0.5, dominated by single dissociation (SD), and 0.5 < log10 MY < 1.1, dominated by double dissociation (DD), where MX and MY are given in GeV. The inclusive pp cross section is also measured as a function of the width of the central pseudorapidity gap Δη for Δη > 3, log10 MX > 1.1, and log10 MY > 1.1, a region dominated by DD. The cross sections integrated over these regions are found to be, respectively, 2.99 ± 0.02 (stat) +0.32/−0.29 (syst) mb, 1.18 ± 0.02 (stat) ± 0.13 (syst) mb, and 0.58 ± 0.01 (stat) +0.13/−0.11 (syst) mb, and are used to extract the extrapolated total SD and DD cross sections. In addition, the inclusive differential cross section dσ/dΔηF, for events with a pseudorapidity gap adjacent to the edge of the detector, is measured over ΔηF = 8.4 units of pseudorapidity. The results are compared to those of other experiments and to theoretical predictions and found compatible with slowly rising diffractive cross sections as a function of center-of-mass energy.

  1. Measurement of diffractive dissociation cross sections in p p collisions at √{s }=7 TeV

    NASA Astrophysics Data System (ADS)

    Khachatryan, V.; Sirunyan, A. M.; Tumasyan, A.; Adam, W.; Bergauer, T.; Dragicevic, M.; Erö, J.; Fabjan, C.; Friedl, M.; Frühwirth, R.; Ghete, V. M.; Hartl, C.; Hörmann, N.; Hrubec, J.; Jeitler, M.; Kiesenhofer, W.; Knünz, V.; Krammer, M.; Krätschmer, I.; Liko, D.; Mikulec, I.; Rabady, D.; Rahbaran, B.; Rohringer, H.; Schöfbeck, R.; Strauss, J.; Taurok, A.; Treberer-Treberspurg, W.; Waltenberger, W.; Wulz, C.-E.; Mossolov, V.; Shumeiko, N.; Suarez Gonzalez, J.; Alderweireldt, S.; Bansal, M.; Bansal, S.; Cornelis, T.; De Wolf, E. A.; Janssen, X.; Knutsson, A.; Luyckx, S.; Ochesanu, S.; Rougny, R.; Van De Klundert, M.; Van Haevermaet, H.; Van Mechelen, P.; Van Remortel, N.; Van Spilbeeck, A.; Blekman, F.; Blyweert, S.; D'Hondt, J.; Daci, N.; Heracleous, N.; Keaveney, J.; Lowette, S.; Maes, M.; Olbrechts, A.; Python, Q.; Strom, D.; Tavernier, S.; Van Doninck, W.; Van Mulders, P.; Van Onsem, G. P.; Villella, I.; Caillol, C.; Clerbaux, B.; De Lentdecker, G.; Dobur, D.; Favart, L.; Gay, A. P. R.; Grebenyuk, A.; Léonard, A.; Mohammadi, A.; Perniè, L.; Reis, T.; Seva, T.; Thomas, L.; Vander Velde, C.; Vanlaer, P.; Wang, J.; Zenoni, F.; Adler, V.; Beernaert, K.; Benucci, L.; Cimmino, A.; Costantini, S.; Crucy, S.; Dildick, S.; Fagot, A.; Garcia, G.; Mccartin, J.; Ocampo Rios, A. A.; Ryckbosch, D.; Salva Diblen, S.; Sigamani, M.; Strobbe, N.; Thyssen, F.; Tytgat, M.; Yazgan, E.; Zaganidis, N.; Basegmez, S.; Beluffi, C.; Bruno, G.; Castello, R.; Caudron, A.; Ceard, L.; Da Silveira, G. G.; Delaere, C.; du Pree, T.; Favart, D.; Forthomme, L.; Giammanco, A.; Hollar, J.; Jafari, A.; Jez, P.; Komm, M.; Lemaitre, V.; Nuttens, C.; Pagano, D.; Perrini, L.; Pin, A.; Piotrzkowski, K.; Popov, A.; Quertenmont, L.; Selvaggi, M.; Vidal Marono, M.; Vizan Garcia, J. M.; Beliy, N.; Caebergs, T.; Daubie, E.; Hammad, G. H.; Aldá Júnior, W. L.; Alves, G. A.; Brito, L.; Correa Martins Junior, M.; Dos Reis Martins, T.; Mora Herrera, C.; Pol, M. 
E.; Carvalho, W.; Chinellato, J.; Custódio, A.; Da Costa, E. M.; De Jesus Damiao, D.; De Oliveira Martins, C.; Fonseca De Souza, S.; Malbouisson, H.; Matos Figueiredo, D.; Mundim, L.; Nogima, H.; Prado Da Silva, W. L.; Santaolalla, J.; Santoro, A.; Sznajder, A.; Tonelli Manganote, E. J.; Vilela Pereira, A.; Bernardes, C. A.; Dogra, S.; Fernandez Perez Tomei, T. R.; Gregores, E. M.; Mercadante, P. G.; Novaes, S. F.; Padula, Sandra S.; Aleksandrov, A.; Genchev, V.; Iaydjiev, P.; Marinov, A.; Piperov, S.; Rodozov, M.; Stoykova, S.; Sultanov, G.; Tcholakov, V.; Vutova, M.; Dimitrov, A.; Glushkov, I.; Hadjiiska, R.; Kozhuharov, V.; Litov, L.; Pavlov, B.; Petkov, P.; Bian, J. G.; Chen, G. M.; Chen, H. S.; Chen, M.; Du, R.; Jiang, C. H.; Plestina, R.; Tao, J.; Wang, Z.; Asawatangtrakuldee, C.; Ban, Y.; Li, Q.; Liu, S.; Mao, Y.; Qian, S. J.; Wang, D.; Zou, W.; Avila, C.; Chaparro Sierra, L. F.; Florez, C.; Gomez, J. P.; Gomez Moreno, B.; Sanabria, J. C.; Godinovic, N.; Lelas, D.; Polic, D.; Puljak, I.; Antunovic, Z.; Kovac, M.; Brigljevic, V.; Kadija, K.; Luetic, J.; Mekterovic, D.; Sudic, L.; Attikis, A.; Mavromanolakis, G.; Mousa, J.; Nicolaou, C.; Ptochos, F.; Razis, P. A.; Bodlak, M.; Finger, M.; Finger, M.; Assran, Y.; Ellithi Kamel, A.; Mahmoud, M. A.; Radi, A.; Kadastik, M.; Murumaa, M.; Raidal, M.; Tiko, A.; Eerola, P.; Fedi, G.; Voutilainen, M.; Härkönen, J.; Karimäki, V.; Kinnunen, R.; Kortelainen, M. J.; Lampén, T.; Lassila-Perini, K.; Lehti, S.; Lindén, T.; Luukka, P.; Mäenpää, T.; Peltola, T.; Tuominen, E.; Tuominiemi, J.; Tuovinen, E.; Wendland, L.; Talvitie, J.; Tuuva, T.; Besancon, M.; Couderc, F.; Dejardin, M.; Denegri, D.; Fabbro, B.; Faure, J. 
L.; Favaro, C.; Ferri, F.; Ganjour, S.; Givernaud, A.; Gras, P.; Hamel de Monchenault, G.; Jarry, P.; Locci, E.; Malcles, J.; Rander, J.; Rosowsky, A.; Titov, M.; Baffioni, S.; Beaudette, F.; Busson, P.; Charlot, C.; Dahms, T.; Dalchenko, M.; Dobrzynski, L.; Filipovic, N.; Florent, A.; Granier de Cassagnac, R.; Mastrolorenzo, L.; Miné, P.; Mironov, C.; Naranjo, I. N.; Nguyen, M.; Ochando, C.; Paganini, P.; Regnard, S.; Salerno, R.; Sauvan, J. B.; Sirois, Y.; Veelken, C.; Yilmaz, Y.; Zabi, A.; Agram, J.-L.; Andrea, J.; Aubin, A.; Bloch, D.; Brom, J.-M.; Chabert, E. C.; Collard, C.; Conte, E.; Fontaine, J.-C.; Gelé, D.; Goerlach, U.; Goetzmann, C.; Le Bihan, A.-C.; Van Hove, P.; Gadrat, S.; Beauceron, S.; Beaupere, N.; Boudoul, G.; Bouvier, E.; Brochet, S.; Carrillo Montoya, C. A.; Chasserat, J.; Chierici, R.; Contardo, D.; Depasse, P.; El Mamouni, H.; Fan, J.; Fay, J.; Gascon, S.; Gouzevitch, M.; Ille, B.; Kurca, T.; Lethuillier, M.; Mirabito, L.; Perries, S.; Ruiz Alvarez, J. D.; Sabes, D.; Sgandurra, L.; Sordini, V.; Vander Donckt, M.; Verdier, P.; Viret, S.; Xiao, H.; Tsamalaidze, Z.; Autermann, C.; Beranek, S.; Bontenackels, M.; Edelhoff, M.; Feld, L.; Hindrichs, O.; Klein, K.; Ostapchuk, A.; Perieanu, A.; Raupach, F.; Sammet, J.; Schael, S.; Weber, H.; Wittmer, B.; Zhukov, V.; Ata, M.; Brodski, M.; Dietz-Laursonn, E.; Duchardt, D.; Erdmann, M.; Fischer, R.; Güth, A.; Hebbeker, T.; Heidemann, C.; Hoepfner, K.; Klingebiel, D.; Knutzen, S.; Kreuzer, P.; Merschmeyer, M.; Meyer, A.; Millet, P.; Olschewski, M.; Padeken, K.; Papacz, P.; Reithler, H.; Schmitz, S. A.; Sonnenschein, L.; Teyssier, D.; Thüer, S.; Weber, M.; Cherepanov, V.; Erdogan, Y.; Flügge, G.; Geenen, H.; Geisler, M.; Haj Ahmad, W.; Heister, A.; Hoehle, F.; Kargoll, B.; Kress, T.; Kuessel, Y.; Künsken, A.; Lingemann, J.; Nowack, A.; Nugent, I. M.; Perchalla, L.; Pooth, O.; Stahl, A.; Asin, I.; Bartosik, N.; Behr, J.; Behrenhoff, W.; Behrens, U.; Bell, A. 
J.; Bergholz, M.; Bethani, A.; Borras, K.; Burgmeier, A.; Cakir, A.; Calligaris, L.; Campbell, A.; Choudhury, S.; Costanza, F.; Diez Pardos, C.; Dooling, S.; Dorland, T.; Eckerlin, G.; Eckstein, D.; Eichhorn, T.; Flucke, G.; Garay Garcia, J.; Geiser, A.; Gunnellini, P.; Hauk, J.; Hempel, M.; Horton, D.; Jung, H.; Kalogeropoulos, A.; Kasemann, M.; Katsas, P.; Kieseler, J.; Kleinwort, C.; Krücker, D.; Lange, W.; Leonard, J.; Lipka, K.; Lobanov, A.; Lohmann, W.; Lutz, B.; Mankel, R.; Marfin, I.; Melzer-Pellmann, I.-A.; Meyer, A. B.; Mittag, G.; Mnich, J.; Mussgiller, A.; Naumann-Emme, S.; Nayak, A.; Novgorodova, O.; Ntomari, E.; Perrey, H.; Pitzl, D.; Placakyte, R.; Raspereza, A.; Ribeiro Cipriano, P. M.; Roland, B.; Ron, E.; Sahin, M. Ö.; Salfeld-Nebgen, J.; Saxena, P.; Schmidt, R.; Schoerner-Sadenius, T.; Schröder, M.; Seitz, C.; Spannagel, S.; Vargas Trevino, A. D. R.; Walsh, R.; Wissing, C.; Aldaya Martin, M.; Blobel, V.; Centis Vignali, M.; Draeger, A. R.; Erfle, J.; Garutti, E.; Goebel, K.; Görner, M.; Haller, J.; Hoffmann, M.; Höing, R. S.; Kirschenmann, H.; Klanner, R.; Kogler, R.; Lange, J.; Lapsien, T.; Lenz, T.; Marchesini, I.; Ott, J.; Peiffer, T.; Pietsch, N.; Poehlsen, J.; Poehlsen, T.; Rathjens, D.; Sander, C.; Schettler, H.; Schleper, P.; Schlieckau, E.; Schmidt, A.; Seidel, M.; Sola, V.; Stadie, H.; Steinbrück, G.; Troendle, D.; Usai, E.; Vanelderen, L.; Vanhoefer, A.; Barth, C.; Baus, C.; Berger, J.; Böser, C.; Butz, E.; Chwalek, T.; De Boer, W.; Descroix, A.; Dierlamm, A.; Feindt, M.; Frensch, F.; Giffels, M.; Hartmann, F.; Hauth, T.; Husemann, U.; Katkov, I.; Kornmayer, A.; Kuznetsova, E.; Lobelle Pardo, P.; Mozer, M. U.; Müller, Th.; Nürnberg, A.; Quast, G.; Rabbertz, K.; Ratnikov, F.; Röcker, S.; Simonis, H. J.; Stober, F. M.; Ulrich, R.; Wagner-Kuhr, J.; Wayand, S.; Weiler, T.; Wolf, R.; Anagnostou, G.; Daskalakis, G.; Geralis, T.; Giakoumopoulou, V. 
A.; Kyriakis, A.; Loukas, D.; Markou, A.; Markou, C.; Psallidas, A.; Topsis-Giotis, I.; Agapitos, A.; Kesisoglou, S.; Panagiotou, A.; Saoulidou, N.; Stiliaris, E.; Aslanoglou, X.; Evangelou, I.; Flouris, G.; Foudas, C.; Kokkas, P.; Manthos, N.; Papadopoulos, I.; Paradas, E.; Bencze, G.; Hajdu, C.; Hidas, P.; Horvath, D.; Sikler, F.; Veszpremi, V.; Vesztergombi, G.; Zsigmond, A. J.; Beni, N.; Czellar, S.; Karancsi, J.; Molnar, J.; Palinkas, J.; Szillasi, Z.; Raics, P.; Trocsanyi, Z. L.; Ujvari, B.; Swain, S. K.; Beri, S. B.; Bhatnagar, V.; Gupta, R.; Bhawandeep, U.; Kalsi, A. K.; Kaur, M.; Kumar, R.; Mittal, M.; Nishu, N.; Singh, J. B.; Kumar, Ashok; Kumar, Arun; Ahuja, S.; Bhardwaj, A.; Choudhary, B. C.; Kumar, A.; Malhotra, S.; Naimuddin, M.; Ranjan, K.; Sharma, V.; Banerjee, S.; Bhattacharya, S.; Chatterjee, K.; Dutta, S.; Gomber, B.; Jain, Sa.; Jain, Sh.; Khurana, R.; Modak, A.; Mukherjee, S.; Roy, D.; Sarkar, S.; Sharan, M.; Abdulsalam, A.; Dutta, D.; Kailas, S.; Kumar, V.; Mohanty, A. K.; Pant, L. M.; Shukla, P.; Topkar, A.; Aziz, T.; Banerjee, S.; Bhowmik, S.; Chatterjee, R. M.; Dewanjee, R. K.; Dugad, S.; Ganguly, S.; Ghosh, S.; Guchait, M.; Gurtu, A.; Kole, G.; Kumar, S.; Maity, M.; Majumder, G.; Mazumdar, K.; Mohanty, G. B.; Parida, B.; Sudhakar, K.; Wickramage, N.; Bakhshiansohi, H.; Behnamian, H.; Etesami, S. M.; Fahim, A.; Goldouzian, R.; Khakzad, M.; Mohammadi Najafabadi, M.; Naseri, M.; Paktinat Mehdiabadi, S.; Rezaei Hosseinabadi, F.; Safarzadeh, B.; Zeinali, M.; Felcini, M.; Grunewald, M.; Abbrescia, M.; Barbone, L.; Calabria, C.; Chhibra, S. S.; Colaleo, A.; Creanza, D.; De Filippis, N.; De Palma, M.; Fiore, L.; Iaselli, G.; Maggi, G.; Maggi, M.; My, S.; Nuzzo, S.; Pompili, A.; Pugliese, G.; Radogna, R.; Selvaggi, G.; Silvestris, L.; Singh, G.; Venditti, R.; Zito, G.; Abbiendi, G.; Benvenuti, A. C.; Bonacorsi, D.; Braibant-Giacomelli, S.; Brigliadori, L.; Campanini, R.; Capiluppi, P.; Castro, A.; Cavallo, F. 
R.; Codispoti, G.; Cuffiani, M.; Dallavalle, G. M.; Fabbri, F.; Fanfani, A.; Fasanella, D.; Giacomelli, P.; Grandi, C.; Guiducci, L.; Marcellini, S.; Masetti, G.; Montanari, A.; Navarria, F. L.; Perrotta, A.; Primavera, F.; Rossi, A. M.; Rovelli, T.; Siroli, G. P.; Tosi, N.; Travaglini, R.; Albergo, S.; Cappello, G.; Chiorboli, M.; Costa, S.; Giordano, F.; Potenza, R.; Tricomi, A.; Tuve, C.; Barbagli, G.; Ciulli, V.; Civinini, C.; D'Alessandro, R.; Focardi, E.; Gallo, E.; Gonzi, S.; Gori, V.; Lenzi, P.; Meschini, M.; Paoletti, S.; Sguazzoni, G.; Tropiano, A.; Benussi, L.; Bianco, S.; Fabbri, F.; Piccolo, D.; Ferretti, R.; Ferro, F.; Lo Vetere, M.; Robutti, E.; Tosi, S.; Dinardo, M. E.; Fiorendi, S.; Gennai, S.; Gerosa, R.; Ghezzi, A.; Govoni, P.; Lucchini, M. T.; Malvezzi, S.; Manzoni, R. A.; Martelli, A.; Marzocchi, B.; Menasce, D.; Moroni, L.; Paganoni, M.; Pedrini, D.; Ragazzi, S.; Redaelli, N.; Tabarelli de Fatis, T.; Buontempo, S.; Cavallo, N.; Di Guida, S.; Fabozzi, F.; Iorio, A. O. M.; Lista, L.; Meola, S.; Merola, M.; Paolucci, P.; Azzi, P.; Bacchetta, N.; Bisello, D.; Branca, A.; Dall'Osso, M.; Dorigo, T.; Galanti, M.; Gasparini, F.; Giubilato, P.; Gozzelino, A.; Kanishchev, K.; Lacaprara, S.; Margoni, M.; Meneguzzo, A. T.; Montecassiano, F.; Passaseo, M.; Pazzini, J.; Pegoraro, M.; Pozzobon, N.; Ronchese, P.; Simonetto, F.; Torassa, E.; Tosi, M.; Triossi, A.; Zotto, P.; Zucchetta, A.; Zumerle, G.; Gabusi, M.; Ratti, S. P.; Re, V.; Riccardi, C.; Salvini, P.; Vitulo, P.; Biasini, M.; Bilei, G. M.; Ciangottini, D.; Fanò, L.; Lariccia, P.; Mantovani, G.; Menichelli, M.; Romeo, F.; Saha, A.; Santocchia, A.; Spiezia, A.; Androsov, K.; Azzurri, P.; Bagliesi, G.; Bernardini, J.; Boccali, T.; Broccolo, G.; Castaldi, R.; Ciocci, M. A.; Dell'Orso, R.; Donato, S.; Fiori, F.; Foà, L.; Giassi, A.; Grippo, M. T.; Ligabue, F.; Lomtadze, T.; Martini, L.; Messineo, A.; Moon, C. S.; Palla, F.; Rizzi, A.; Savoy-Navarro, A.; Serban, A. 
T.; Spagnolo, P.; Squillacioti, P.; Tenchini, R.; Tonelli, G.; Venturi, A.; Verdini, P. G.; Vernieri, C.; Barone, L.; Cavallari, F.; D'imperio, G.; Del Re, D.; Diemoz, M.; Grassi, M.; Jorda, C.; Longo, E.; Margaroli, F.; Meridiani, P.; Micheli, F.; Nourbakhsh, S.; Organtini, G.; Paramatti, R.; Rahatlou, S.; Rovelli, C.; Santanastasio, F.; Soffi, L.; Traczyk, P.; Amapane, N.; Arcidiacono, R.; Argiro, S.; Arneodo, M.; Bellan, R.; Biino, C.; Cartiglia, N.; Casasso, S.; Costa, M.; Degano, A.; Demaria, N.; Finco, L.; Mariotti, C.; Maselli, S.; Migliore, E.; Monaco, V.; Musich, M.; Obertino, M. M.; Ortona, G.; Pacher, L.; Pastrone, N.; Pelliccioni, M.; Pinna Angioni, G. L.; Potenza, A.; Romero, A.; Ruspa, M.; Sacchi, R.; Solano, A.; Staiano, A.; Tamponi, U.; Belforte, S.; Candelise, V.; Casarsa, M.; Cossutti, F.; Della Ricca, G.; Gobbo, B.; La Licata, C.; Marone, M.; Schizzi, A.; Umer, T.; Zanetti, A.; Chang, S.; Kropivnitskaya, A.; Nam, S. K.; Kim, D. H.; Kim, G. N.; Kim, M. S.; Kong, D. J.; Lee, S.; Oh, Y. D.; Park, H.; Sakharov, A.; Son, D. C.; Kim, T. J.; Kim, J. Y.; Song, S.; Choi, S.; Gyun, D.; Hong, B.; Jo, M.; Kim, H.; Kim, Y.; Lee, B.; Lee, K. S.; Park, S. K.; Roh, Y.; Choi, M.; Kim, J. H.; Park, I. C.; Ryu, G.; Ryu, M. S.; Choi, Y.; Choi, Y. K.; Goh, J.; Kim, D.; Kwon, E.; Lee, J.; Seo, H.; Yu, I.; Juodagalvis, A.; Komaragiri, J. R.; Md Ali, M. A. B.; Castilla-Valdez, H.; De La Cruz-Burelo, E.; Heredia-de La Cruz, I.; Hernandez-Almada, A.; Lopez-Fernandez, R.; Sanchez-Hernandez, A.; Carrillo Moreno, S.; Vazquez Valencia, F.; Pedraza, I.; Salazar Ibarguen, H. A.; Casimiro Linares, E.; Morelos Pineda, A.; Krofcheck, D.; Butler, P. H.; Reucroft, S.; Ahmad, A.; Ahmad, M.; Hassan, Q.; Hoorani, H. R.; Khalid, S.; Khan, W. A.; Khurshid, T.; Shah, M. 
A.; Shoaib, M.; Bialkowska, H.; Bluj, M.; Boimska, B.; Frueboes, T.; Górski, M.; Kazana, M.; Nawrocki, K.; Romanowska-Rybinska, K.; Szleper, M.; Zalewski, P.; Brona, G.; Bunkowski, K.; Cwiok, M.; Dominik, W.; Doroba, K.; Kalinowski, A.; Konecki, M.; Krolikowski, J.; Misiura, M.; Olszewski, M.; Wolszczak, W.; Bargassa, P.; Beirão Da Cruz E Silva, C.; Faccioli, P.; Ferreira Parracho, P. G.; Gallinaro, M.; Lloret Iglesias, L.; Nguyen, F.; Rodrigues Antunes, J.; Seixas, J.; Varela, J.; Vischia, P.; Gavrilenko, M.; Golutvin, I.; Gorbunov, I.; Karjavin, V.; Konoplyanikov, V.; Korenkov, V.; Kozlov, G.; Lanev, A.; Malakhov, A.; Matveev, V.; Mitsyn, V. V.; Moisenz, P.; Palichik, V.; Perelygin, V.; Shmatov, S.; Smirnov, V.; Tikhonenko, E.; Zarubin, A.; Golovtsov, V.; Ivanov, Y.; Kim, V.; Levchenko, P.; Murzin, V.; Oreshkin, V.; Smirnov, I.; Sulimov, V.; Uvarov, L.; Vavilov, S.; Vorobyev, A.; Vorobyev, An.; Andreev, Yu.; Dermenev, A.; Gninenko, S.; Golubev, N.; Kirsanov, M.; Krasnikov, N.; Pashenkov, A.; Tlisov, D.; Toropin, A.; Epshteyn, V.; Gavrilov, V.; Lychkovskaya, N.; Popov, V.; Safronov, G.; Semenov, S.; Spiridonov, A.; Stolin, V.; Vlasov, E.; Zhokin, A.; Andreev, V.; Azarkin, M.; Dremin, I.; Kirakosyan, M.; Leonidov, A.; Mesyats, G.; Rusakov, S. V.; Vinogradov, A.; Belyaev, A.; Boos, E.; Ershov, A.; Gribushin, A.; Khein, L.; Klyukhin, V.; Kodolova, O.; Lokhtin, I.; Lukina, O.; Obraztsov, S.; Petrushanko, S.; Savrin, V.; Snigirev, A.; Azhgirey, I.; Bayshev, I.; Bitioukov, S.; Kachanov, V.; Kalinin, A.; Konstantinov, D.; Krychkine, V.; Petrov, V.; Ryutin, R.; Sobol, A.; Tourtchanovitch, L.; Troshin, S.; Tyurin, N.; Uzunian, A.; Volkov, A.; Adzic, P.; Ekmedzic, M.; Milosevic, J.; Rekovic, V.; Alcaraz Maestre, J.; Battilana, C.; Calvo, E.; Cerrada, M.; Chamizo Llatas, M.; Colino, N.; De La Cruz, B.; Delgado Peris, A.; Domínguez Vázquez, D.; Escalante Del Valle, A.; Fernandez Bedoya, C.; Fernández Ramos, J. P.; Flix, J.; Fouz, M. 
C.; Garcia-Abia, P.; Gonzalez Lopez, O.; Goy Lopez, S.; Hernandez, J. M.; Josa, M. I.; Navarro De Martino, E.; Pérez-Calero Yzquierdo, A.; Puerta Pelayo, J.; Quintario Olmeda, A.; Redondo, I.; Romero, L.; Soares, M. S.; Albajar, C.; de Trocóniz, J. F.; Missiroli, M.; Moran, D.; Brun, H.; Cuevas, J.; Fernandez Menendez, J.; Folgueras, S.; Gonzalez Caballero, I.; Brochero Cifuentes, J. A.; Cabrillo, I. J.; Calderon, A.; Duarte Campderros, J.; Fernandez, M.; Gomez, G.; Graziano, A.; Lopez Virto, A.; Marco, J.; Marco, R.; Martinez Rivero, C.; Matorras, F.; Munoz Sanchez, F. J.; Piedra Gomez, J.; Rodrigo, T.; Rodríguez-Marrero, A. Y.; Ruiz-Jimeno, A.; Scodellaro, L.; Vila, I.; Vilar Cortabitarte, R.; Abbaneo, D.; Auffray, E.; Auzinger, G.; Bachtis, M.; Baillon, P.; Ball, A. H.; Barney, D.; Benaglia, A.; Bendavid, J.; Benhabib, L.; Benitez, J. F.; Bernet, C.; Bianchi, G.; Bloch, P.; Bocci, A.; Bonato, A.; Bondu, O.; Botta, C.; Breuker, H.; Camporesi, T.; Cerminara, G.; Colafranceschi, S.; D'Alfonso, M.; d'Enterria, D.; Dabrowski, A.; David, A.; De Guio, F.; De Roeck, A.; De Visscher, S.; Di Marco, E.; Dobson, M.; Dordevic, M.; Dorney, B.; Dupont-Sagorin, N.; Elliott-Peisert, A.; Eugster, J.; Franzoni, G.; Funk, W.; Gigi, D.; Gill, K.; Giordano, D.; Girone, M.; Glege, F.; Guida, R.; Gundacker, S.; Guthoff, M.; Hammer, J.; Hansen, M.; Harris, P.; Hegeman, J.; Innocente, V.; Janot, P.; Kousouris, K.; Krajczar, K.; Lecoq, P.; Lourenço, C.; Magini, N.; Malgeri, L.; Mannelli, M.; Marrouche, J.; Masetti, L.; Meijers, F.; Mersi, S.; Meschi, E.; Moortgat, F.; Morovic, S.; Mulders, M.; Musella, P.; Orsini, L.; Pape, L.; Perez, E.; Perrozzi, L.; Petrilli, A.; Petrucciani, G.; Pfeiffer, A.; Pierini, M.; Pimiä, M.; Piparo, D.; Plagge, M.; Racz, A.; Rolandi, G.; Rovere, M.; Sakulin, H.; Schäfer, C.; Schwick, C.; Sharma, A.; Siegrist, P.; Silva, P.; Simon, M.; Sphicas, P.; Spiga, D.; Steggemann, J.; Stieger, B.; Stoye, M.; Takahashi, Y.; Treille, D.; Tsirou, A.; Veres, G. 
I.; Vlimant, J. R.; Wardle, N.; Wöhri, H. K.; Wollny, H.; Zeuner, W. D.; Bertl, W.; Deiters, K.; Erdmann, W.; Horisberger, R.; Ingram, Q.; Kaestli, H. C.; Kotlinski, D.; Langenegger, U.; Renker, D.; Rohe, T.; Bachmair, F.; Bäni, L.; Bianchini, L.; Buchmann, M. A.; Casal, B.; Chanon, N.; Dissertori, G.; Dittmar, M.; Donegà, M.; Dünser, M.; Eller, P.; Grab, C.; Hits, D.; Hoss, J.; Lustermann, W.; Mangano, B.; Marini, A. C.; Martinez Ruiz del Arbol, P.; Masciovecchio, M.; Meister, D.; Mohr, N.; Nägeli, C.; Nessi-Tedaldi, F.; Pandolfi, F.; Pauss, F.; Peruzzi, M.; Quittnat, M.; Rebane, L.; Rossini, M.; Starodumov, A.; Takahashi, M.; Theofilatos, K.; Wallny, R.; Weber, H. A.; Amsler, C.; Canelli, M. F.; Chiochia, V.; De Cosa, A.; Hinzmann, A.; Hreus, T.; Kilminster, B.; Lange, C.; Millan Mejias, B.; Ngadiuba, J.; Robmann, P.; Ronga, F. J.; Taroni, S.; Verzetti, M.; Yang, Y.; Cardaci, M.; Chen, K. H.; Ferro, C.; Kuo, C. M.; Lin, W.; Lu, Y. J.; Volpe, R.; Yu, S. S.; Chang, P.; Chang, Y. H.; Chang, Y. W.; Chao, Y.; Chen, K. F.; Chen, P. H.; Dietz, C.; Grundler, U.; Hou, W.-S.; Kao, K. Y.; Lei, Y. J.; Liu, Y. F.; Lu, R.-S.; Majumder, D.; Petrakou, E.; Tzeng, Y. M.; Wilken, R.; Asavapibhop, B.; Srimanobhas, N.; Suwonjandee, N.; Adiguzel, A.; Bakirci, M. N.; Cerci, S.; Dozen, C.; Dumanoglu, I.; Eskut, E.; Girgis, S.; Gokbulut, G.; Gurpinar, E.; Hos, I.; Kangal, E. E.; Kayis Topaksu, A.; Onengut, G.; Ozdemir, K.; Ozturk, S.; Polatoz, A.; Sunar Cerci, D.; Tali, B.; Topakli, H.; Vergili, M.; Akin, I. V.; Bilin, B.; Bilmis, S.; Gamsizkan, H.; Karapinar, G.; Ocalan, K.; Sekmen, S.; Surat, U. E.; Yalvac, M.; Zeyrek, M.; Gülmez, E.; Isildak, B.; Kaya, M.; Kaya, O.; Cankocak, K.; Vardarlı, F. I.; Levchuk, L.; Sorokin, P.; Brooke, J. J.; Clement, E.; Cussans, D.; Flacher, H.; Goldstein, J.; Grimes, M.; Heath, G. P.; Heath, H. F.; Jacob, J.; Kreczko, L.; Lucas, C.; Meng, Z.; Newbold, D. M.; Paramesvaran, S.; Poll, A.; Senkin, S.; Smith, V. J.; Williams, T.; Bell, K. 
W.; Belyaev, A.; Brew, C.; Brown, R. M.; Cockerill, D. J. A.; Coughlan, J. A.; Harder, K.; Harper, S.; Olaiya, E.; Petyt, D.; Shepherd-Themistocleous, C. H.; Thea, A.; Tomalin, I. R.; Womersley, W. J.; Worm, S. D.; Baber, M.; Bainbridge, R.; Buchmuller, O.; Burton, D.; Colling, D.; Cripps, N.; Cutajar, M.; Dauncey, P.; Davies, G.; Della Negra, M.; Dunne, P.; Ferguson, W.; Fulcher, J.; Futyan, D.; Gilbert, A.; Hall, G.; Iles, G.; Jarvis, M.; Karapostoli, G.; Kenzie, M.; Lane, R.; Lucas, R.; Lyons, L.; Magnan, A.-M.; Malik, S.; Mathias, B.; Nash, J.; Nikitenko, A.; Pela, J.; Pesaresi, M.; Petridis, K.; Raymond, D. M.; Rogerson, S.; Rose, A.; Seez, C.; Sharp, P.; Tapper, A.; Vazquez Acosta, M.; Virdee, T.; Zenz, S. C.; Cole, J. E.; Hobson, P. R.; Khan, A.; Kyberd, P.; Leggat, D.; Leslie, D.; Martin, W.; Reid, I. D.; Symonds, P.; Teodorescu, L.; Turner, M.; Dittmann, J.; Hatakeyama, K.; Kasmi, A.; Liu, H.; Scarborough, T.; Charaf, O.; Cooper, S. I.; Henderson, C.; Rumerio, P.; Avetisyan, A.; Bose, T.; Fantasia, C.; Lawson, P.; Richardson, C.; Rohlf, J.; St. John, J.; Sulak, L.; Alimena, J.; Berry, E.; Bhattacharya, S.; Christopher, G.; Cutts, D.; Demiragli, Z.; Dhingra, N.; Ferapontov, A.; Garabedian, A.; Heintz, U.; Kukartsev, G.; Laird, E.; Landsberg, G.; Luk, M.; Narain, M.; Segala, M.; Sinthuprasith, T.; Speer, T.; Swanson, J.; Breedon, R.; Breto, G.; Calderon De La Barca Sanchez, M.; Chauhan, S.; Chertok, M.; Conway, J.; Conway, R.; Cox, P. T.; Erbacher, R.; Gardner, M.; Ko, W.; Lander, R.; Miceli, T.; Mulhearn, M.; Pellett, D.; Pilot, J.; Ricci-Tam, F.; Searle, M.; Shalhout, S.; Smith, J.; Squires, M.; Stolp, D.; Tripathi, M.; Wilbur, S.; Yohay, R.; Cousins, R.; Everaerts, P.; Farrell, C.; Hauser, J.; Ignatenko, M.; Rakness, G.; Takasugi, E.; Valuev, V.; Weber, M.; Burt, K.; Clare, R.; Ellison, J.; Gary, J. W.; Hanson, G.; Heilman, J.; Ivova Rikova, M.; Jandir, P.; Kennedy, E.; Lacroix, F.; Long, O. 
R.; Luthra, A.; Malberti, M.; Nguyen, H.; Olmedo Negrete, M.; Shrinivas, A.; Sumowidagdo, S.; Wimpenny, S.; Andrews, W.; Branson, J. G.; Cerati, G. B.; Cittolin, S.; D'Agnolo, R. T.; Evans, D.; Holzner, A.; Kelley, R.; Klein, D.; Lebourgeois, M.; Letts, J.; Macneill, I.; Olivito, D.; Padhi, S.; Palmer, C.; Pieri, M.; Sani, M.; Sharma, V.; Simon, S.; Sudano, E.; Tadel, M.; Tu, Y.; Vartak, A.; Welke, C.; Würthwein, F.; Yagil, A.; Barge, D.; Bradmiller-Feld, J.; Campagnari, C.; Danielson, T.; Dishaw, A.; Flowers, K.; Franco Sevilla, M.; Geffert, P.; George, C.; Golf, F.; Gouskos, L.; Incandela, J.; Justus, C.; Mccoll, N.; Richman, J.; Stuart, D.; To, W.; West, C.; Yoo, J.; Apresyan, A.; Bornheim, A.; Bunn, J.; Chen, Y.; Duarte, J.; Mott, A.; Newman, H. B.; Pena, C.; Rogan, C.; Spiropulu, M.; Timciuc, V.; Wilkinson, R.; Xie, S.; Zhu, R. Y.; Azzolini, V.; Calamba, A.; Carlson, B.; Ferguson, T.; Iiyama, Y.; Paulini, M.; Russ, J.; Vogel, H.; Vorobiev, I.; Cumalat, J. P.; Ford, W. T.; Gaz, A.; Luiggi Lopez, E.; Nauenberg, U.; Smith, J. G.; Stenson, K.; Ulmer, K. A.; Wagner, S. R.; Alexander, J.; Chatterjee, A.; Chu, J.; Dittmer, S.; Eggert, N.; Mirman, N.; Nicolas Kaufman, G.; Patterson, J. R.; Ryd, A.; Salvati, E.; Skinnari, L.; Sun, W.; Teo, W. D.; Thom, J.; Thompson, J.; Tucker, J.; Weng, Y.; Winstrom, L.; Wittich, P.; Winn, D.; Abdullin, S.; Albrow, M.; Anderson, J.; Apollinari, G.; Bauerdick, L. A. T.; Beretvas, A.; Berryhill, J.; Bhat, P. C.; Bolla, G.; Burkett, K.; Butler, J. N.; Cheung, H. W. K.; Chlebana, F.; Cihangir, S.; Elvira, V. D.; Fisk, I.; Freeman, J.; Gao, Y.; Gottschalk, E.; Gray, L.; Green, D.; Grünendahl, S.; Gutsche, O.; Hanlon, J.; Hare, D.; Harris, R. M.; Hirschauer, J.; Hooberman, B.; Jindariani, S.; Johnson, M.; Joshi, U.; Kaadze, K.; Klima, B.; Kreis, B.; Kwan, S.; Linacre, J.; Lincoln, D.; Lipton, R.; Liu, T.; Lykken, J.; Maeshima, K.; Marraffino, J. M.; Martinez Outschoorn, V. 
I.; Maruyama, S.; Mason, D.; McBride, P.; Merkel, P.; Mishra, K.; Mrenna, S.; Musienko, Y.; Nahn, S.; Newman-Holmes, C.; O'Dell, V.; Prokofyev, O.; Sexton-Kennedy, E.; Sharma, S.; Soha, A.; Spalding, W. J.; Spiegel, L.; Taylor, L.; Tkaczyk, S.; Tran, N. V.; Uplegger, L.; Vaandering, E. W.; Vidal, R.; Whitbeck, A.; Whitmore, J.; Yang, F.; Acosta, D.; Avery, P.; Bortignon, P.; Bourilkov, D.; Carver, M.; Cheng, T.; Curry, D.; Das, S.; De Gruttola, M.; Di Giovanni, G. P.; Field, R. D.; Fisher, M.; Furic, I. K.; Hugon, J.; Konigsberg, J.; Korytov, A.; Kypreos, T.; Low, J. F.; Matchev, K.; Milenovic, P.; Mitselmakher, G.; Muniz, L.; Rinkevicius, A.; Shchutska, L.; Snowball, M.; Sperka, D.; Yelton, J.; Zakaria, M.; Hewamanage, S.; Linn, S.; Markowitz, P.; Martinez, G.; Rodriguez, J. L.; Adams, T.; Askew, A.; Bochenek, J.; Diamond, B.; Haas, J.; Hagopian, S.; Hagopian, V.; Johnson, K. F.; Prosper, H.; Veeraraghavan, V.; Weinberg, M.; Baarmand, M. M.; Hohlmann, M.; Kalakhety, H.; Yumiceva, F.; Adams, M. R.; Apanasevich, L.; Bazterra, V. E.; Berry, D.; Betts, R. R.; Bucinskaite, I.; Cavanaugh, R.; Evdokimov, O.; Gauthier, L.; Gerber, C. E.; Hofman, D. J.; Khalatyan, S.; Kurt, P.; Moon, D. H.; O'Brien, C.; Silkworth, C.; Turner, P.; Varelas, N.; Albayrak, E. A.; Bilki, B.; Clarida, W.; Dilsiz, K.; Duru, F.; Haytmyradov, M.; Merlo, J.-P.; Mermerkaya, H.; Mestvirishvili, A.; Moeller, A.; Nachtman, J.; Ogul, H.; Onel, Y.; Ozok, F.; Penzo, A.; Rahmat, R.; Sen, S.; Tan, P.; Tiras, E.; Wetzel, J.; Yetkin, T.; Yi, K.; Barnett, B. A.; Blumenfeld, B.; Bolognesi, S.; Fehling, D.; Gritsan, A. V.; Maksimovic, P.; Martin, C.; Swartz, M.; Baringer, P.; Bean, A.; Benelli, G.; Bruner, C.; Kenny, R. P.; Malek, M.; Murray, M.; Noonan, D.; Sanders, S.; Sekaric, J.; Stringer, R.; Wang, Q.; Wood, J. S.; Barfuss, A. F.; Chakaberia, I.; Ivanov, A.; Khalil, S.; Makouski, M.; Maravin, Y.; Saini, L. 
K.; Shrestha, S.; Skhirtladze, N.; Svintradze, I.; Gronberg, J.; Lange, D.; Rebassoo, F.; Wright, D.; Baden, A.; Belloni, A.; Calvert, B.; Eno, S. C.; Gomez, J. A.; Hadley, N. J.; Kellogg, R. G.; Kolberg, T.; Lu, Y.; Marionneau, M.; Mignerey, A. C.; Pedro, K.; Skuja, A.; Tonjes, M. B.; Tonwar, S. C.; Apyan, A.; Barbieri, R.; Bauer, G.; Busza, W.; Cali, I. A.; Chan, M.; Di Matteo, L.; Dutta, V.; Gomez Ceballos, G.; Goncharov, M.; Gulhan, D.; Klute, M.; Lai, Y. S.; Lee, Y.-J.; Levin, A.; Luckey, P. D.; Ma, T.; Paus, C.; Ralph, D.; Roland, C.; Roland, G.; Stephans, G. S. F.; Stöckli, F.; Sumorok, K.; Velicanu, D.; Veverka, J.; Wyslouch, B.; Yang, M.; Zanetti, M.; Zhukova, V.; Dahmes, B.; Gude, A.; Kao, S. C.; Klapoetke, K.; Kubota, Y.; Mans, J.; Pastika, N.; Rusack, R.; Singovsky, A.; Tambe, N.; Turkewitz, J.; Acosta, J. G.; Oliveros, S.; Avdeeva, E.; Bloom, K.; Bose, S.; Claes, D. R.; Dominguez, A.; Gonzalez Suarez, R.; Keller, J.; Knowlton, D.; Kravchenko, I.; Lazo-Flores, J.; Malik, S.; Meier, F.; Snow, G. R.; Zvada, M.; Dolen, J.; Godshalk, A.; Iashvili, I.; Kharchilava, A.; Kumar, A.; Rappoccio, S.; Alverson, G.; Barberis, E.; Baumgartel, D.; Chasco, M.; Haley, J.; Massironi, A.; Morse, D. M.; Nash, D.; Orimoto, T.; Trocino, D.; Wang, R.-J.; Wood, D.; Zhang, J.; Hahn, K. A.; Kubik, A.; Mucia, N.; Odell, N.; Pollack, B.; Pozdnyakov, A.; Schmitt, M.; Stoynev, S.; Sung, K.; Velasco, M.; Won, S.; Brinkerhoff, A.; Chan, K. M.; Drozdetskiy, A.; Hildreth, M.; Jessop, C.; Karmgard, D. J.; Kellams, N.; Lannon, K.; Luo, W.; Lynch, S.; Marinelli, N.; Pearson, T.; Planer, M.; Ruchti, R.; Valls, N.; Wayne, M.; Wolf, M.; Woodard, A.; Antonelli, L.; Brinson, J.; Bylsma, B.; Durkin, L. S.; Flowers, S.; Hill, C.; Hughes, R.; Kotov, K.; Ling, T. Y.; Puigh, D.; Rodenburg, M.; Smith, G.; Winer, B. L.; Wolfe, H.; Wulsin, H. W.; Driga, O.; Elmer, P.; Hebda, P.; Hunt, A.; Koay, S. 
A.; Lujan, P.; Marlow, D.; Medvedeva, T.; Mooney, M.; Olsen, J.; Piroué, P.; Quan, X.; Saka, H.; Stickland, D.; Tully, C.; Werner, J. S.; Zuranski, A.; Brownson, E.; Mendez, H.; Ramirez Vargas, J. E.; Barnes, V. E.; Benedetti, D.; Bortoletto, D.; De Mattia, M.; Gutay, L.; Hu, Z.; Jha, M. K.; Jones, M.; Jung, K.; Kress, M.; Leonardo, N.; Lopes Pegna, D.; Maroussov, V.; Miller, D. H.; Neumeister, N.; Radburn-Smith, B. C.; Shi, X.; Shipsey, I.; Silvers, D.; Svyatkovskiy, A.; Wang, F.; Xie, W.; Xu, L.; Yoo, H. D.; Zablocki, J.; Zheng, Y.; Parashar, N.; Stupak, J.; Adair, A.; Akgun, B.; Ecklund, K. M.; Geurts, F. J. M.; Li, W.; Michlin, B.; Padley, B. P.; Redjimi, R.; Roberts, J.; Zabel, J.; Betchart, B.; Bodek, A.; Covarelli, R.; de Barbaro, P.; Demina, R.; Eshaq, Y.; Ferbel, T.; Garcia-Bellido, A.; Goldenzweig, P.; Han, J.; Harel, A.; Khukhunaishvili, A.; Petrillo, G.; Vishnevskiy, D.; Ciesielski, R.; Demortier, L.; Goulianos, K.; Lungu, G.; Mesropian, C.; Arora, S.; Barker, A.; Chou, J. P.; Contreras-Campana, C.; Contreras-Campana, E.; Duggan, D.; Ferencek, D.; Gershtein, Y.; Gray, R.; Halkiadakis, E.; Hidas, D.; Kaplan, S.; Lath, A.; Panwalkar, S.; Park, M.; Patel, R.; Salur, S.; Schnetzer, S.; Somalwar, S.; Stone, R.; Thomas, S.; Thomassen, P.; Walker, M.; Rose, K.; Spanier, S.; York, A.; Bouhali, O.; Castaneda Hernandez, A.; Eusebi, R.; Flanagan, W.; Gilmore, J.; Kamon, T.; Khotilovich, V.; Krutelyov, V.; Montalvo, R.; Osipenkov, I.; Pakhotin, Y.; Perloff, A.; Roe, J.; Rose, A.; Safonov, A.; Sakuma, T.; Suarez, I.; Tatarinov, A.; Akchurin, N.; Cowden, C.; Damgov, J.; Dragoiu, C.; Dudero, P. R.; Faulkner, J.; Kovitanggoon, K.; Kunori, S.; Lee, S. W.; Libeiro, T.; Volobouev, I.; Appelt, E.; Delannoy, A. G.; Greene, S.; Gurrola, A.; Johns, W.; Maguire, C.; Mao, Y.; Melo, A.; Sharma, M.; Sheldon, P.; Snook, B.; Tuo, S.; Velkovska, J.; Arenton, M. 
W.; Boutle, S.; Cox, B.; Francis, B.; Goodell, J.; Hirosky, R.; Ledovskoy, A.; Li, H.; Lin, C.; Neu, C.; Wood, J.; Clarke, C.; Harr, R.; Karchin, P. E.; Kottachchi Kankanamge Don, C.; Lamichhane, P.; Sturdy, J.; Belknap, D. A.; Carlsmith, D.; Cepeda, M.; Dasu, S.; Dodd, L.; Duric, S.; Friis, E.; Hall-Wilton, R.; Herndon, M.; Hervé, A.; Klabbers, P.; Lanaro, A.; Lazaridis, C.; Levine, A.; Loveless, R.; Mohapatra, A.; Ojalvo, I.; Perry, T.; Pierro, G. A.; Polese, G.; Ross, I.; Sarangi, T.; Savin, A.; Smith, W. H.; Taylor, D.; Verwilligen, P.; Vuosalo, C.; Woods, N.; CMS Collaboration

    2015-07-01

    Measurements of diffractive dissociation cross sections in pp collisions at √s = 7 TeV are presented in kinematic regions defined by the masses MX and MY of the two final-state hadronic systems separated by the largest rapidity gap in the event. Differential cross sections are measured as a function of ξX = MX²/s in the region −5.5 < log10ξX < −2.5, for log10MY < 0.5, dominated by single dissociation (SD), and 0.5 < log10MY < 1.1, dominated by double dissociation (DD), where MY is given in GeV. The inclusive pp cross section is also measured as a function of the width of the central pseudorapidity gap Δη for Δη > 3, log10MX > 1.1, and log10MY > 1.1, a region dominated by DD. The cross sections integrated over these regions are found to be, respectively, 2.99 ± 0.02 (stat) −0.29/+0.32 (syst) mb, 1.18 ± 0.02 (stat) ± 0.13 (syst) mb, and 0.58 ± 0.01 (stat) −0.11/+0.13 (syst) mb, and are used to extract extrapolated total SD and DD cross sections. In addition, the inclusive differential cross section dσ/dΔηF, for events with a pseudorapidity gap adjacent to the edge of the detector, is measured over ΔηF = 8.4 units of pseudorapidity. The results are compared to those of other experiments and to theoretical predictions, and found compatible with slowly rising diffractive cross sections as a function of center-of-mass energy.

  2. Stick-slip behavior in a continuum-granular experiment.

    PubMed

    Geller, Drew A; Ecke, Robert E; Dahmen, Karin A; Backhaus, Scott

    2015-12-01

    We report moment distribution results from a laboratory experiment, similar in character to an isolated strike-slip earthquake fault, consisting of sheared elastic plates separated by a narrow gap filled with a two-dimensional granular medium. Local measurement of strain displacements of the plates at 203 spatial points located adjacent to the gap allows direct determination of the event moments and their spatial and temporal distributions. We show that events consist of spatially coherent, larger motions and spatially extended (noncoherent), smaller events. The noncoherent events have a probability distribution of event moment consistent with an M(-3/2) power law scaling with Poisson-distributed recurrence times. Coherent events have a log-normal moment distribution and mean temporal recurrence. As the applied normal pressure increases, there are more coherent events and their log-normal distribution broadens and shifts to larger average moment.

  3. Spatiotemporal stick-slip phenomena in a coupled continuum-granular system

    NASA Astrophysics Data System (ADS)

    Ecke, Robert

    In sheared granular media, stick-slip behavior is ubiquitous, especially at very small shear rates and weak drive coupling. The resulting slips are characteristic of natural phenomena such as earthquakes, as well as being a delicate probe of the collective dynamics of the granular system. In that spirit, we developed a laboratory experiment consisting of sheared elastic plates separated by a narrow gap filled with quasi-two-dimensional granular material (bi-dispersed nylon rods). We directly determine the spatial and temporal distributions of strain displacements of the elastic continuum over 200 spatial points located adjacent to the gap. Slip events can be divided into large system-spanning events and spatially distributed smaller events. The small events have a probability distribution of event moment consistent with an M(-3/2) power law scaling and a Poisson-distributed recurrence time distribution. Large events have a broad, log-normal moment distribution and a mean repetition time. As the applied normal force increases, there are fractionally more (less) large (small) events, and the large-event moment distribution broadens. The magnitude of the slip motion of the plates is well correlated with the root-mean-square displacements of the granular matter. Our results are consistent with mean field descriptions of statistical models of earthquakes and avalanches. We further explore the high-speed dynamics of system events and also discuss the effective granular friction of the sheared layer. We find that large events result from stored elastic energy in the plates in this coupled granular-continuum system.

  4. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
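
    The store-and-publish pattern described above (structured events persisted to tables and pushed to a message bus for subscribers) can be sketched in a few lines. JMS itself is a Java API and the real SMDB layer lives in Oracle; the in-process queue, event table, and field names below are illustrative stand-ins, not the actual SMDB schema:

```python
import json
import queue
import time

bus = queue.Queue()   # stand-in for a JMS topic/queue
event_table = []      # stand-in for the database event table

def log_event(category, severity, message):
    """Persist a structured event, then publish it for real-time monitors."""
    event = {
        "ts": time.time(),
        "category": category,   # structured classification, as in SMDB
        "severity": severity,
        "message": message,
    }
    event_table.append(event)      # durable store for historical analysis
    bus.put(json.dumps(event))     # publish for subscribed applications
    return event

log_event("EQUIPMENT", "INFO", "antenna DSS-25 configured for track")
```

Monitoring applications would consume from the bus for real-time views, while troubleshooting tools query the stored table over historical archives.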

  5. Inflammation and Atherosclerosis Are Associated With Hypertension in Kidney Transplant Recipients.

    PubMed

    Azancot, Maria A; Ramos, Natalia; Torres, Irina B; García-Carro, Clara; Romero, Katheryne; Espinel, Eugenia; Moreso, Francesc; Seron, Daniel

    2015-12-01

    The aim of the current study was to evaluate risk factors associated with hypertension in kidney transplant recipients. The authors recruited 92 consecutive kidney transplant recipients and 30 age-matched patients with chronic kidney disease without history of cardiovascular events. Twenty-four-hour ambulatory blood pressure monitoring, pulse wave velocity, and carotid ultrasound were performed. Serum levels of log-transformed interleukin 6 (Log IL-6), soluble tumor necrosis factor receptor 2, and intercellular adhesion molecule 1 were determined. Twenty-four-hour systolic blood pressure (SBP) (P=.0001), Log IL-6 (P=.011), and total number of carotid plaques (P=.013) were higher, while the percentage decline of SBP from day to night was lower in kidney transplant recipients (P=.003). Independent predictors of 24-hour SBP were urinary protein/creatinine ratio and circulating monocytes (P=.001), while Log IL-6, serum creatinine, and total number of carotid plaques (P=.0001) were independent predictors of percentage decline of SBP from day to night. These results suggest that subclinical atherosclerosis and systemic inflammation are associated with hypertension after transplantation. © 2015 Wiley Periodicals, Inc.

  6. Meta-T: TetrisⓇ as an experimental paradigm for cognitive skills research.

    PubMed

    Lindstedt, John K; Gray, Wayne D

    2015-12-01

    Studies of human performance in complex tasks using video games are an attractive prospect, but many existing games lack a comprehensive way to modify the game and track performance beyond basic levels of analysis. Meta-T provides experimenters a tool to study behavior in a dynamic task environment with time-stressed decision-making and strong perceptual-motor elements, offering a host of experimental manipulations with a robust and detailed logging system for all user events, system events, and screen objects. Its experimenter-friendly interface provides control over detailed parameters of the task environment without need for programming expertise. Support for eye-tracking and computational cognitive modeling extend the paradigm's scope.

  7. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D- mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
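
    The grouping step described in the Methods (merging exposure pulses with similar parameters so the Monte Carlo code runs once per group rather than once per pulse) amounts to a keyed accumulation. A minimal sketch, with hypothetical pulse fields rather than the actual DTS log format:

```python
from collections import defaultdict

# Hypothetical pulse-log entries: (kVp, filtration_mm, gantry_angle_deg, mAs).
# Field names and values are illustrative; the real DTS log format differs.
pulses = [
    (80, 3.0, 0, 1.2), (80, 3.0, 0, 1.1), (80, 3.0, 0, 1.3),
    (70, 2.5, 30, 0.9), (70, 2.5, 30, 1.0),
]

# Pulses with matching technique/geometry are merged into one group, so a
# Monte Carlo run is needed once per group; the group's dose then scales
# with the summed mAs.
groups = defaultdict(float)
for kvp, filt, angle, mas in pulses:
    groups[(kvp, filt, angle)] += mas
```

Here five pulses collapse into two Monte Carlo inputs; coarser rounding of the grouping key trades accuracy for fewer runs, which is the "level of data grouping" trade-off the Results mention.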

  8. Specializing network analysis to detect anomalous insider actions

    PubMed Central

    Chen, You; Nyemba, Steve; Zhang, Wen; Malin, Bradley

    2012-01-01

    Collaborative information systems (CIS) enable users to coordinate efficiently over shared tasks in complex distributed environments. For flexibility, they provide users with broad access privileges, which, as a side-effect, leave such systems vulnerable to various attacks. Some of the more damaging malicious activities stem from internal misuse, where users are authorized to access system resources. A promising class of insider threat detection models for CIS focuses on mining access patterns from audit logs; however, current models are limited in that they assume organizations have significant resources to generate labeled cases for training classifiers or assume the user has committed a large number of actions that deviate from “normal” behavior. In lieu of the previous assumptions, we introduce an approach that detects when specific actions of an insider deviate from expectation in the context of collaborative behavior. Specifically, in this paper, we introduce a specialized network anomaly detection model, or SNAD, to detect such events. This approach assesses the extent to which a user influences the similarity of the group of users that access a particular record in the CIS. From a theoretical perspective, we show that the proposed model is appropriate for detecting insider actions in dynamic collaborative systems. From an empirical perspective, we perform an extensive evaluation of SNAD with the access logs of two distinct environments: the patient record access logs of a large electronic health record system (6,015 users, 130,457 patients and 1,327,500 accesses) and the editing logs of Wikipedia (2,394,385 revisors, 55,200 articles and 6,482,780 revisions). We compare our model with several competing methods and demonstrate SNAD is significantly more effective: on average it achieves 20–30% greater area under an ROC curve. PMID:23399988
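
    The core SNAD idea (scoring how much one user changes the similarity of the group that accessed a record) can be illustrated with cosine similarity over user feature vectors. This is a toy sketch of the intuition, not the published model's exact formulation:

```python
import numpy as np

def influence(vectors, i):
    """Change in the mean pairwise cosine similarity of a record's access
    group when user i is included vs. removed.  A strongly negative value
    flags a user who makes the group markedly less similar (anomalous)."""
    def mean_sim(V):
        if len(V) < 2:
            return 0.0
        U = V / np.linalg.norm(V, axis=1, keepdims=True)
        S = U @ U.T
        n = len(V)
        return (S.sum() - n) / (n * (n - 1))   # mean of off-diagonal entries
    V = np.asarray(vectors, float)
    return mean_sim(V) - mean_sim(np.delete(V, i, axis=0))

# Three users with similar access profiles plus one outlier (user 3).
group = [[1.0, 1.0, 0.0], [1.0, 0.9, 0.0], [0.9, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

Removing the outlier raises group similarity, so its influence score is negative, which is the signal an insider-threat monitor would threshold on.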

  9. Effects of recent logging on the main channel of North Fork Caspar Creek

    Treesearch

    Thomas E. Lisle; Michael Napolitano

    1998-01-01

    The response of the mainstem channel of North Fork Caspar Creek to recent logging is examined by time trends in bed load yield, scour and fill at resurveyed cross sections, and the volume and fine-sediment content of pools. Companion papers report that recent logging has increased streamflow during the summer and moderate winter rainfall events, and blowdowns from...

  10. Snag longevity and surface fuel accumulation following post-fire logging in a ponderosa pine dominated forest

    Treesearch

    Martin W. Ritchie; Eric E. Knapp; Carl N. Skinner

    2013-01-01

    In a study of post-fire logging effects over an 8 year period at Blacks Mountain Experimental Forest, salvage logging was conducted at varying levels of intensity after a 2002 wildfire event. In a designed experiment, harvest prescriptions with snag retention levels ranging from 0% to 100% in 15 experimental units were installed. Observations of standing snags and...

  11. Including operational data in QMRA model: development and impact of model inputs.

    PubMed

    Jaidi, Kenza; Barbeau, Benoit; Carrière, Annie; Desjardins, Raymond; Prévost, Michèle

    2009-03-01

    A Monte Carlo model, based on the Quantitative Microbial Risk Analysis approach (QMRA), has been developed to assess the relative risks of infection associated with the presence of Cryptosporidium and Giardia in drinking water. The impact of various approaches for modelling the initial parameters of the model on the final risk assessments is evaluated. The Monte Carlo simulations that we performed showed that the occurrence of parasites in raw water was best described by a mixed distribution: log-Normal for concentrations > detection limit (DL), and a uniform distribution for concentrations < DL. The selection of process performance distributions for modelling the performance of treatment (filtration and ozonation) influences the estimated risks significantly. The mean annual risks for conventional treatment are: 1.97E-03 (removal credit adjusted by log parasite = log spores), 1.58E-05 (log parasite = 1.7 x log spores) or 9.33E-03 (regulatory credits based on the turbidity measurement in filtered water). Using full-scale validated SCADA data, the simplified calculation of CT performed at the plant was shown to largely underestimate the risk relative to a more detailed CT calculation, which takes into consideration the downtime and system failure events identified at the plant (1.46E-03 vs. 3.93E-02 for the mean risk).
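
    The mixed raw-water distribution described above (uniform below the detection limit, log-normal above it) is straightforward to sample in a Monte Carlo loop. All parameter values below are illustrative, not the study's fitted values, and the clamp at DL is a crude stand-in for a properly truncated log-normal:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters only: detection limit DL, fraction of samples
# below it, and lognormal parameters for detectable concentrations.
DL, p_below = 0.1, 0.4
mu, sigma = 0.0, 1.0

def sample_raw_water(n):
    """Mixed source-water model: uniform(0, DL) below the detection limit,
    lognormal above it (clamped at DL as a crude stand-in for truncation)."""
    below = rng.random(n) < p_below
    return np.where(below,
                    rng.uniform(0.0, DL, n),
                    np.maximum(rng.lognormal(mu, sigma, n), DL))

c = sample_raw_water(10_000)
log_removal = rng.normal(3.0, 0.5, c.size)   # hypothetical treatment credit (log10)
treated = c / 10.0 ** log_removal            # concentration after treatment
```

Drawing the treatment log-removal from a distribution rather than a point value is what makes the choice of "process performance distribution" matter to the final risk, as the abstract notes.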

  12. Medical Device Plug-and-Play Interoperability Standards & Technology Leadership

    DTIC Science & Technology

    2011-10-01

    ...biomedical engineering students completed their senior design project on the X-Ray / Ventilator Use Case. We worked closely with the students to... "Supporting Medical Device Adverse Event Analysis in an Interoperable Clinical Environment: Design of a Data Logging and Playback System," publication in...

  13. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
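
    The patented behavior (log events as history, search that history, and undo selected past events) can be sketched with a toy structure. All names here are illustrative, not taken from the patent; undo is modeled by storing an inverse action with each event:

```python
from dataclasses import dataclass, field

@dataclass
class Logbook:
    """Toy event logbook: append events, search the history, undo selected
    past events via their stored inverse actions."""
    history: list = field(default_factory=list)

    def log(self, description, undo_fn=None):
        self.history.append({"event": description, "undo": undo_fn})

    def search(self, term):
        return [e for e in self.history if term in e["event"]]

    def undo(self, event):
        if event["undo"]:
            event["undo"]()          # reverse the effect on the environment
        self.history.remove(event)

env = {}                             # the "computing environment" as k/v state
book = Logbook()
env["x"] = 1
book.log("set x=1", undo_fn=lambda: env.pop("x"))
env["y"] = 2
book.log("set y=2", undo_fn=lambda: env.pop("y"))

hits = book.undo(book.search("x=1")[0])   # x is removed; y survives
```
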

  14. Simpler ISS Flight Control Communications and Log Keeping via Social Tools and Techniques

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Cowart, Hugh; Stevens, Dan

    2012-01-01

    The heart of flight operations control involves a) communicating effectively in real time with other controllers in the room and/or in remote locations and b) tracking significant events, decisions, and rationale to support the next set of decisions, provide a thorough shift handover, and troubleshoot/improve operations. International Space Station (ISS) flight controllers speak with each other via multiple voice circuits or loops, each with a particular purpose and constituency. Controllers monitor and/or respond to several loops concurrently. The primary tracking tools are console logs, typically kept by a single operator and not visible to others in real-time. Information from telemetry, commanding, and planning systems also plays into decision-making. Email is very secondary/tertiary due to timing and archival considerations. Voice communications and log entries supporting ISS operations have increased by orders of magnitude because the number of control centers, flight crew, and payload operations have grown. This paper explores three developmental ground system concepts under development at Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) and Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC). These concepts could reduce ISS control center voice traffic and console logging yet increase the efficiency and effectiveness of both. The goal of this paper is to kindle further discussion, exploration, and tool development.

  15. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    DTIC Science & Technology

    2010-06-01

    shadow |\\/ etc\\/ passwd |cmd... \\.exe .*?)\\s.*\\s\\".*\\" desc=$0 action=shellcmd /home/user/sec -2.5.3/ common/syslogclient "... Synthetic : " "$2|$1...etc\\/ shadow |\\/ etc\\/ passwd |cmd... \\.exe .*?)\\s.*\\s\\".*\\" desc=$0 context =[ HYBRID_LOGGING] action=shellcmd /home/user/sec -2.5.3/ common...suspicious filenames type=Single continue=TakeNext ptype=RegExp pattern =(.*)\\s(.*)\\s.*(\\/ etc\\/ shadow |\\/ etc\\/ passwd |cmd\\.exe .*?)\\s... .*\\s(.*)\\s.*\\s
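
    The fragments above are OCR-damaged excerpts of Simple Event Correlator (SEC) rule definitions. For orientation, a minimal SEC rule with the same field structure (the pattern and script path here are illustrative, not recovered from the report) looks like:

```
type=Single
ptype=RegExp
pattern=sshd\[\d+\]: Failed password for (\S+) from (\S+)
desc=Failed login for user $1 from $2
action=shellcmd /usr/local/bin/alert.sh "$1 $2"
```

Each rule names a matching strategy (type), a pattern type and pattern, a description template, and an action; the garbled fragments above correspond to rules watching for suspicious filenames such as /etc/shadow, /etc/passwd, and cmd.exe in log entries.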

  16. Log of Apollo 11.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

    The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)

  17. Concurrent Schedules of Positive and Negative Reinforcement: Differential-Impact and Differential-Outcomes Hypotheses

    PubMed Central

    Magoon, Michael A; Critchfield, Thomas S

    2008-01-01

    Considerable evidence from outside of operant psychology suggests that aversive events exert greater influence over behavior than equal-sized positive-reinforcement events. Operant theory is largely moot on this point, and most operant research is uninformative because of a scaling problem that prevents aversive events and those based on positive reinforcement from being directly compared. In the present investigation, humans' mouse-click responses were maintained on similarly structured, concurrent schedules of positive (money gain) and negative (avoidance of money loss) reinforcement. Because gains and losses were of equal magnitude, according to the analytical conventions of the generalized matching law, bias (log b ≠ 0) would indicate differential impact by one type of consequence; however, no systematic bias was observed. Further research is needed to reconcile this outcome with apparently robust findings in other literatures of superior behavior control by aversive events. In an incidental finding, the linear function relating log behavior ratio and log reinforcement ratio was steeper for concurrent negative and positive reinforcement than for control conditions involving concurrent positive reinforcement. This may represent the first empirical confirmation of a free-operant differential-outcomes effect predicted by contingency-discriminability theories of choice. PMID:18683609
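
    The bias and sensitivity terms of the generalized matching law, log(B1/B2) = a·log(R1/R2) + log b, are estimated by a straight-line fit of log behavior ratio against log reinforcement ratio; bias is the intercept (log b ≠ 0 indicates differential impact), sensitivity the slope. A minimal sketch on noise-free synthetic data (undermatching, no bias):

```python
import numpy as np

def matching_law_fit(b1, b2, r1, r2):
    """Estimate sensitivity a and bias log_b in the generalized matching law
        log10(B1/B2) = a * log10(R1/R2) + log10(b)
    via an ordinary least-squares line fit."""
    x = np.log10(np.asarray(r1, float) / np.asarray(r2, float))
    y = np.log10(np.asarray(b1, float) / np.asarray(b2, float))
    a, log_b = np.polyfit(x, y, 1)
    return a, log_b

# Synthetic data: undermatching (a = 0.8) with no bias (log b = 0).
r_ratio = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
b1 = r_ratio ** 0.8                      # B1/B2 = (R1/R2)^0.8
ones = np.ones_like(r_ratio)
a, log_b = matching_law_fit(b1, ones, r_ratio, ones)  # a ≈ 0.8, log_b ≈ 0
```

In the study above, the null result is log_b ≈ 0 for concurrent gain/loss-avoidance schedules, while the incidental finding is a steeper slope a for mixed positive/negative reinforcement than for the positive-only control.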

  18. Methods developed to elucidate nursing related adverse events in Japan.

    PubMed

    Yamagishi, Manaho; Kanda, Katsuya; Takemura, Yukie

    2003-05-01

    Financial resources for quality assurance in Japanese hospitals are limited and few hospitals have quality monitoring systems of nursing service systems. However, recently its necessity has been recognized. This study has cost-effectively used adverse event occurrence rates as indicators of the quality of nursing service, and audited methods of collecting data on adverse events to elucidate their approximate true numbers. Data collection was conducted in July, August and November 2000 at a hospital in Tokyo that administered both primary and secondary health care services (281 beds, six wards, average length of stay 23 days). We collected adverse events through incident reports, logs, checklists, nurse interviews, medication error questionnaires, urine leucocyte tests, patient interviews and medical records. Adverse events included the unplanned removals of invasive lines, medication errors, falls, pressure sores, skin deficiencies, physical restraints, and nosocomial infections. After evaluating the time and useful outcomes of each source, it soon became clear that we could elucidate adverse events most consistently and cost-effectively through incident reports, checklists, nurse interviews, urine leucocyte tests and medication error questionnaires. This study suggests that many hospitals in Japan could monitor the quality of the nursing service using these sources.

  19. Stimulation of 2-methylisoborneol (MIB) production by actinomycetes after cyclic chlorination in drinking water distribution systems.

    PubMed

    Abbaszadegan, Morteza; Yi, Min; Alum, Absar

    2015-01-01

    The impact of fluctuation in chlorine residual on actinomycetes and the production of 2-methylisoborneol (MIB) were studied in cast-iron and PVC model distribution systems. Actinomycetes were spiked into each system, which was then operated for a 12-day chlorine-free period, resulting in no changes in actinomycetes or MIB concentrations. Three cyclic chlorination events were performed and chlorine residuals were maintained as follows: 1.0 mg L(-1) for 24 h, 0 mg L(-1) for 48 h, 0.5 mg L(-1) for 48 h, 0 mg L(-1) for 48 h and 2 mg L(-1) for 24 h. After each chlorination event, a 2-3 log decrease in actinomycetes was noted in both systems. However, within 48 h at 0 mg L(-1) chlorine, the actinomycetes recovered to the pre-chlorination levels. On the contrary, MIB concentration in both systems remained unimpacted after the first cycle and increased by fourfold (< 5 to > 20 mg L(-1)) after the second cycle, which lasted through the third cycle despite the fact that actinomycetes numbers fluctuated 2-3 logs during this time period. To obtain biofilm samples from the field, water meters were collected from municipality drinking water distribution systems located in central Arizona. The actinomycetes concentration in asbestos cement pipe and cast iron pipe averaged 3.1 × 10(3) and 1.9 × 10(4) CFU cm(-2), respectively. The study shows that production of MIB is associated with changes in chlorine residual in the systems. This is the first report of cyclic chlorine shock as a stimulus for MIB production by actinomycetes in drinking water distribution system's ecology.

  20. Bottomland hardwood forest recovery following tornado disturbance and salvage logging

    Treesearch

    John L. Nelson; John W. Groninger; Loretta L. Battaglia; Charles M. Ruffner

    2008-01-01

    Catastrophic wind events, including tornadoes, hurricanes, and linear winds, are significant disturbances in temperate forested wetlands. Information is lacking on how post-disturbance salvage logging may affect short- and long-term objectives in conservation areas where natural stands are typically managed passively. Woody regeneration and herbaceous cover were assessed...

  1. On the importance of accounting for competing risks in pediatric cancer trials designed to delay or avoid radiotherapy: I. Basic concepts and first analyses.

    PubMed

    Tai, Bee-Choo; Grundy, Richard G; Machin, David

    2010-04-01

    In trials designed to delay or avoid irradiation among children with malignant brain tumors, although irradiation after disease progression is an important event, patients who have disease progression may decline radiotherapy (RT), and those without disease progression may opt for elective RT. To accurately describe the cumulative need for RT in such instances, it is crucial to account for these distinct events and to evaluate how each contributes to the delay or advancement of irradiation via a competing risks analysis. We summarize competing events in such trials using competing risks methods based on cumulative incidence functions and Gray's test. The results are contrasted with standard survival methods based on Kaplan-Meier curves, cause-specific hazard functions and the log-rank test. The Kaplan-Meier method overestimates all event-specific rates. The cause-specific hazard analysis showed a reduction in hazards for all events (A: RT after progression; B: no RT after progression; C: elective RT) among children with ependymoma. For event A, a higher cumulative incidence was reported for ependymoma. Although Gray's test failed to detect any difference between histologic subtypes (p = 0.331), the log-rank test suggested marginal evidence (p = 0.057). Similarly, for event C, the log-rank test found stronger evidence of a reduction in hazard among those with ependymoma (p = 0.005) than did Gray's test (p = 0.086). When evaluating treatment differences, failing to account for competing risks with appropriate methodology may lead to incorrect interpretations.
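
    The point that "the Kaplan-Meier method overestimates all event-specific rates" can be sketched numerically: treating the competing event as censoring and taking 1 - KM always yields a value at least as large as the cumulative incidence function (CIF). The toy data and event labels below are invented for illustration, not taken from the trial.

```python
# Toy illustration: with competing risks, 1 - Kaplan-Meier (competing event
# treated as censoring) overestimates the cumulative incidence of event A
# compared with the Aalen-Johansen cumulative incidence function (CIF).
# Event codes: 0 = censored, 1 = event A (e.g. RT after progression),
# 2 = competing event (e.g. elective RT). Data are invented.

def km_complement_and_cif(times, events, target=1):
    """Return (1 - KM estimate for `target`, CIF for `target`) at the last time."""
    data = sorted(zip(times, events))
    n = len(data)                 # number currently at risk
    s_all = 1.0                   # all-cause survival (drives the CIF)
    s_km = 1.0                    # naive KM "survival" for the target event
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_target = d_other = d_cens = 0
        while i < len(data) and data[i][0] == t:
            e = data[i][1]
            if e == target:
                d_target += 1
            elif e == 0:
                d_cens += 1
            else:
                d_other += 1
            i += 1
        d_all = d_target + d_other
        if d_all:
            cif += s_all * d_target / n      # Aalen-Johansen increment
            s_all *= 1.0 - d_all / n
            s_km *= 1.0 - d_target / n       # competing events "censored"
        n -= d_all + d_cens
    return 1.0 - s_km, cif

times = [1, 2, 3, 4, 5, 6]
events = [1, 2, 1, 2, 1, 0]
naive, cif = km_complement_and_cif(times, events)
print(naive, cif)   # 1 - KM = 0.6875 exceeds the CIF = 0.5
```

    The gap between the two estimates is exactly the bias the abstract warns about; it grows with the frequency of the competing events.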

  2. FORTE antenna element and release mechanism design

    NASA Technical Reports Server (NTRS)

    Rohweller, David J.; Butler, Thomas A.

    1995-01-01

    The Fast On-Orbit Recording of Transient Events (FORTE) satellite being built by Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL) has as its most prominent feature a large deployable (11 m by 5 m) log periodic antenna to monitor emissions from electrical storms on the Earth. This paper describes the antenna and the design for the long elements and explains the dynamics of their deployment and the damping system employed. It also describes the unique paraffin-actuated reusable tie-down and release mechanism employed in the system.

  3. Qualitative development of a patient-reported outcome symptom measure in diarrhea-predominant irritable bowel syndrome.

    PubMed

    Marquis, P; Lasch, K E; Delgado-Herrera, L; Kothari, S; Lembo, A; Lademacher, C; Spears, G; Nishida, A; Tesler, Waldman L; Piault, E; Rosa, K; Zeiher, B

    2014-06-26

    Despite a documented clinical need, no patient-reported outcome (PRO) symptom measure meeting current regulatory requirements for clinically relevant end points is available for the evaluation of treatment benefit in diarrhea-predominant IBS (IBS-D). Patients (N=113) with IBS-D participated in five study phases: (1) eight concept elicitation focus groups (N=34), from which a 17-item IBS-D Daily Symptom Diary and four-item IBS-D Symptom Event Log (Diary and Event Log) were developed; (2) one-on-one cognitive interviews (N=11) to assess the instrument's comprehensiveness, understandability, appropriateness, and readability; (3) four data triangulation focus groups (N=32) to confirm the concepts elicited; (4) two hybrid (concept elicitation and cognitive interview) focus groups (N=16); and (5) two iterative sets of one-on-one cognitive interviews (N=20) to further clarify the symptoms of IBS-D and debrief a revised seven-item Diary and four-item Event Log. Of the 36 concepts initially identified, 22 were excluded because they were not saturated, not clinically relevant, not critical symptoms of IBS-D, considered upper GI symptoms, or too broad or vaguely defined. The remaining concepts were diarrhea, immediate need (urgency), bloating/pressure, frequency of bowel movements, cramps, abdominal/stomach pain, gas, completely emptied bowels/incomplete evacuation, accidents, bubbling in intestines (bowel sounds), rectal burning, stool consistency, rectal spasm, and pain while wiping. The final instrument included a daily diary with separate items for abdominal and stomach pain and an event log with four items completed after each bowel movement as follows: (1) a record of the bowel movement/event and an assessment of (2) severity of immediacy of need/bowel urgency, (3) incomplete evacuation, and (4) stool consistency (evaluated using the newly developed Astellas Stool Form Scale).
Based on rounds of interviews and clinical input, items considered secondary or nonspecific to IBS-D (rectal burning, bubbling in intestines, spasms, and pain while wiping) were excluded. The IBS-D Symptom Diary and Event Log represent a rigorously developed PRO instrument for the measurement of the IBS-D symptom experience from the perspective of the patient. Its content validity has been supported, and future work should evaluate the instrument's psychometric properties.

  4. Implementing a Rule-Based Contract Compliance Checker

    NASA Astrophysics Data System (ADS)

    Strano, Massimo; Molina-Jimenez, Carlos; Shrivastava, Santosh

    The paper describes the design and implementation of an independent, third-party contract monitoring service called the Contract Compliance Checker (CCC). The CCC is provided with a specification of the contract in force and is capable of observing and logging the relevant business-to-business (B2B) interaction events in order to determine whether the actions of the business partners are consistent with the contract. A contract specification language called EROP (for Events, Rights, Obligations and Prohibitions), based on business rules, has been developed for the CCC; it provides constructs to specify which rights, obligations and prohibitions become active and inactive after the occurrence of events related to the execution of business operations. The system has been designed to work with B2B industry standards such as ebXML and RosettaNet.
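
    The core idea, observed events toggling rights, obligations and prohibitions, can be sketched in a few lines. This is a hypothetical mini-checker, not the EROP language or the CCC's actual API; all event and obligation names below are invented.

```python
# Minimal sketch (NOT the actual EROP language or CCC implementation) of a
# rule-based contract checker: observed B2B events activate and deactivate
# rights/obligations, and an event occurring while a conflicting obligation
# is still active is logged as a violation. All names are illustrative.

class ComplianceChecker:
    def __init__(self):
        self.active = {"rights": set(), "obligations": set(), "prohibitions": set()}
        self.log = []          # timestamped event log would go here in a real CCC
        self.violations = []

    def observe(self, event):
        self.log.append(event)
        if event == "PurchaseOrderAccepted":
            # accepting an order activates a seller obligation and a buyer right
            self.active["obligations"].add("seller:deliver-goods")
            self.active["rights"].add("buyer:cancel-order")
        elif event == "GoodsDelivered":
            # delivery discharges the obligation and activates a new one
            self.active["obligations"].discard("seller:deliver-goods")
            self.active["rights"].discard("buyer:cancel-order")
            self.active["obligations"].add("buyer:pay-invoice")
        elif event == "DeliveryDeadlineExpired":
            if "seller:deliver-goods" in self.active["obligations"]:
                self.violations.append("seller missed delivery obligation")

ccc = ComplianceChecker()
for ev in ["PurchaseOrderAccepted", "DeliveryDeadlineExpired"]:
    ccc.observe(ev)
print(ccc.violations)   # the deadline passed while the obligation was active
```

    A production checker would of course evaluate declarative rules rather than hard-coded branches, but the active-set bookkeeping driven by the event log is the essential mechanism.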

  5. Improved grading system for structural logs for log homes

    Treesearch

    D.W. Green; T.M. Gorman; J.W. Evans; J.F. Murphy

    2004-01-01

    Current grading standards for logs used in log home construction use visual criteria to sort logs into either “wall logs” or structural logs (round and sawn round timbers). The conservative nature of this grading system, and the grouping of stronger and weaker species for marketing purposes, probably results in the specification of logs with larger diameter than would...

  6. Impact of Different Aortic Entry Tear Sites on Early Outcomes and Long-Term Survival in Patients with Stanford A Acute Aortic Dissection.

    PubMed

    Merkle, Julia; Sabashnikov, Anton; Deppe, Antje Christin; Weber, Saskia; Mader, Navid; Choi, Yeong-Hoon; Liakopoulos, Oliver; Kuhn-Régnier, Ferdinand; Wahlers, Thorsten

    2018-06-13

    Stanford A acute aortic dissection (AAD) is a life-threatening emergency. The aim of this study was to compare the impact of three different aortic entry tear sites on early outcomes and long-term survival of patients with Stanford A AAD. From January 2006 to April 2015, a total of 240 consecutive patients with diagnosed Stanford A AAD underwent emergent, isolated surgical aortic repair in our center. Patients were divided into three groups comprising isolated ascending aorta, proximal aortic arch, and distal aortic arch entry tear sites and were followed up for up to 9 years. Thirty-day mortality as well as major cerebrovascular events differed significantly between the three groups (p = 0.007 and p = 0.048, respectively). Overall cumulative short- and long-term survival of all patients revealed significant differences (log-rank p = 0.002), whereas survival free from major cerebrovascular events was similar (log-rank p = 0.780). Subgroup analysis of short- and long-term survival showed significant differences for men (log-rank p = 0.043), women (log-rank p = 0.004), patients over 65 years of age (log-rank p = 0.007), and hypertensive patients (log-rank p = 0.003). Kaplan-Meier survival estimation plots showed significantly poorer survival for the distal aortic arch entry tear site group. The location of the primary entry tear in patients with Stanford A AAD significantly influences early outcomes and short- and long-term survival, whereas survival free from major cerebrovascular events was similar among the three groups. The distal aortic entry tear site showed the poorest outcomes and survival.

  7. Effect of extreme sea surface temperature events on the demography of an age-structured albatross population.

    PubMed

    Pardo, Deborah; Jenouvrier, Stéphanie; Weimerskirch, Henri; Barbraud, Christophe

    2017-06-19

    Climate changes include concurrent changes in environmental mean, variance and extremes, and it is challenging to understand their respective impacts on wild populations, especially when contrasting age-dependent responses to climate occur. We assessed how changes in the mean and standard deviation of sea surface temperature (SST), and in the frequency and magnitude of warm SST extreme climatic events (ECE), influenced the stochastic population growth rate log(λs) and age structure of a black-browed albatross population. For changes in SST around historical levels observed since 1982, changes in standard deviation had a larger (threefold) and negative impact on log(λs) compared with changes in mean. By contrast, the mean had a positive impact on log(λs). The historical SST mean was lower than the optimal SST value at which log(λs) is maximized. Thus, a larger environmental mean increased the occurrence of SST close to this optimum, which buffered the negative effect of ECE. This 'climate safety margin' (i.e. the difference between optimal and historical climatic conditions) and the specific shape of a species' population growth rate response to climate determine how ECE affect the population. For a wider range in SST, both the mean and standard deviation had negative impacts on log(λs), with changes in the mean having a greater effect than the standard deviation. Furthermore, around SST historical levels, increases in either the mean or standard deviation of the SST distribution led to a younger population, with potentially important conservation implications for black-browed albatrosses. This article is part of the themed issue 'Behavioural, ecological and evolutionary responses to extreme climatic events'.

  8. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn (http://alien.cern.ch) (ALICE Environment) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets, and a number of collaborating Web services that implement authentication, job execution, file transport, performance monitoring and event logging. In this paper we present the architecture and components of the system.

  9. 77 FR 45675 - Margy Temponeras, M.D.; Decision and Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-01

    ..., under DEA regulations, ``[i]n the event a person commences business with no controlled substances on... further noted that it was ``[o]f significance, [that] no invoices, DEA Form 222s, or dispensing logs were... requires that a registrant retain its invoices, form 222s, as well as a dispensing log, for at least two...

  10. Wetherbee with the ship's log in the middeck

    NASA Image and Video Library

    2001-03-17

    STS102-E-5234 (17 March 2001) --- On Discovery's mid deck, astronauts William M. (Bill) Shepherd (left) and James D. Wetherbee discuss events of the joint activities among shuttle and station crew members. Wetherbee, STS-102 commander, looks over the ship's log with the outgoing station commander. The image was recorded with a digital still camera.

  11. Changes in storm peak flows after clearcut logging

    Treesearch

    Jack Lewis

    1997-01-01

    Streamflow in a rain-dominated, 473-ha watershed bearing second-growth redwood forest was monitored at 13 locations before and after 50% of the watershed was logged, primarily by clearcutting. Three gauged subwatersheds were maintained as unlogged controls through-out the 11-year study period. The analysis included 526 observations of peak flow from 59 storm events....

  12. Cyber indicators of compromise: a domain ontology for security information and event management

    DTIC Science & Technology

    2017-03-01

    COMPROMISE: A DOMAIN ONTOLOGY FOR SECURITY INFORMATION AND EVENT MANAGEMENT by Marsha D. Rowell March 2017 Thesis Co-Advisors: J. D...to automate this work is Security Information and Event Management (SIEM). In short, SIEM technology works by aggregating log information, and then...Distribution is unlimited.

  13. Survival analysis: Part I — analysis of time-to-event

    PubMed Central

    2018-01-01

    Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
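
    The log-rank test named above compares event rates between two groups across all observed event times. A bare-bones sketch of the two-group log-rank chi-squared statistic (pure Python, invented data, not taken from the review's hypothetical examples):

```python
# Two-group log-rank statistic: at each event time, compare observed events
# in group 1 with the number expected under the null hypothesis of equal
# hazards, and accumulate the hypergeometric variance.
# Event indicator: 1 = event observed, 0 = censored. Data are invented.

def logrank_statistic(times1, events1, times2, events2):
    data = sorted([(t, e, 0) for t, e in zip(times1, events1)] +
                  [(t, e, 1) for t, e in zip(times2, events2)])
    n1, n2 = len(times1), len(times2)
    o_minus_e = 0.0   # sum of (observed - expected) events in group 1
    var = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d = d1 = r1 = r2 = 0   # events (total, group 1) and removals at time t
        while i < len(data) and data[i][0] == t:
            _, e, g = data[i]
            d += e
            if g == 0:
                d1 += e
                r1 += 1
            else:
                r2 += 1
            i += 1
        n = n1 + n2
        if d > 0 and n > 1:
            o_minus_e += d1 - d * n1 / n
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
        n1 -= r1
        n2 -= r2
    return o_minus_e ** 2 / var   # ~ chi-squared with 1 df under H0

stat = logrank_statistic([1, 3], [1, 1], [2, 4], [1, 1])
print(stat)   # 8/13 ~= 0.615 for this tiny interleaved example
```

    The statistic is referred to a chi-squared distribution with one degree of freedom; here the two groups interleave, so the statistic is small and the null would not be rejected.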

  14. Discrete dynamical system modelling for gene regulatory networks of 5-hydroxymethylfurfural tolerance for ethanologenic yeast.

    PubMed

    Song, M; Ouyang, Z; Liu, Z L

    2009-05-01

    Composed of linear difference equations, a discrete dynamical system (DDS) model was designed to reconstruct transcriptional regulation in gene regulatory networks (GRNs) of the ethanologenic yeast Saccharomyces cerevisiae in response to 5-hydroxymethylfurfural (HMF), a bioethanol conversion inhibitor. The modelling aims to identify a system of linear difference equations that represents the temporal interactions among significantly expressed genes. Power stability is imposed on the system model under the normal condition, in the absence of the inhibitor. Non-uniform sampling, typical of time-course experimental designs, is addressed by interpolation in the log-time domain. A statistically significant DDS model of the yeast GRN, derived from time-course gene expression measurements under exposure to HMF, revealed several verified transcriptional regulation events. These events implicate Yap1 and Pdr3, transcription factors consistently known for their regulatory roles from other studies or postulated by independent sequence motif analysis, suggesting their involvement in yeast tolerance and detoxification of the inhibitor.
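
    The "power stability" condition on such a linear difference system x(t+1) = A x(t) amounts to the spectral radius of A being below 1, so a perturbed expression state decays back to baseline. A minimal numerical sketch (the 3-gene matrix is invented, not the paper's fitted model):

```python
# Sketch of the modelling idea: gene expression as a linear difference
# equation system x_{t+1} = A x_t, where A[i][j] is the regulatory influence
# of gene j on gene i at the next time step. "Power stability" holds when
# the spectral radius of A is < 1, so powers of A shrink any initial state.
# The matrix below is an invented 3-gene example, not the fitted yeast GRN.
import numpy as np

A = np.array([[0.5, 0.2, 0.0],
              [0.1, 0.4, 0.3],
              [0.0, 0.2, 0.6]])

rho = max(abs(np.linalg.eigvals(A)))   # spectral radius
print("spectral radius:", rho)         # < 1 -> power stable

x = np.array([1.0, -0.5, 2.0])         # perturbed initial expression state
for _ in range(50):
    x = A @ x                          # iterate the difference equations
print("state after 50 steps:", x)      # decays toward the zero baseline
```

    Under HMF exposure the fitted dynamics differ; imposing stability only under the normal condition is what lets the model distinguish a transient stress response from baseline behaviour.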

  15. Technoeconomic analysis of conventional logging systems operating from stump to landing

    Treesearch

    Raymond L. Sarles; William G. Luppold

    1986-01-01

    Analyzes technical and economic factors for six conventional logging systems suitable for operation in eastern forests. Discusses financial risks and business implications for loggers investing in high-production, state-of-the-art logging systems. Provides logging contractors with information useful as a preliminary guide for selection of equipment and systems....

  16. Generalised Extreme Value Distributions Provide a Natural Hypothesis for the Shape of Seed Mass Distributions

    PubMed Central

    2015-01-01

    Among co-occurring species, values for functionally important plant traits span orders of magnitude, are uni-modal, and generally positively skewed. Such data are usually log-transformed “for normality” but no convincing mechanistic explanation for a log-normal expectation exists. Here we propose a hypothesis for the distribution of seed masses based on generalised extreme value distributions (GEVs), a class of probability distributions used in climatology to characterise the impact of event magnitudes and frequencies; events that impose strong directional selection on biological traits. In tests involving datasets from 34 locations across the globe, GEVs described log10 seed mass distributions as well or better than conventional normalising statistics in 79% of cases, and revealed a systematic tendency for an overabundance of small seed sizes associated with low latitudes. GEVs characterise disturbance events experienced in a location to which individual species’ life histories could respond, providing a natural, biological explanation for trait expression that is lacking from all previous hypotheses attempting to describe trait distributions in multispecies assemblages. We suggest that GEVs could provide a mechanistic explanation for plant trait distributions and potentially link biology and climatology under a single paradigm. PMID:25830773

  17. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  18. Stormwater run-off from an industrial log yard: characterization, contaminant correlation and first-flush phenomenon.

    PubMed

    Kaczala, Fabio; Marques, Marcia; Vinrot, Eva; Hogland, William

    2012-01-01

    The stormwater run-off generated in an industrial log yard during eight run-off events was studied with the main focus on the transport of toxic metals. Associations between water quality constituents and potential surrogates were evaluated by correlation analysis. The first-flush phenomenon was verified by normalized M(V) curves. The results have shown that, whereas some metals such as Zn, Ba, Cd, As and Fe were always detected in these waters, others (Cr, Pb, Cu, Ni, V, Co) were not. Large variations in the water constituents' concentrations were observed, with Fe, Pb and V being the most variable ones. Concentrations of Zn and Cu in the run-off waters exceeded the values established by the Swedish environmental authorities in 100% and 97% of samples, respectively. The correlation analyses indicated TSS as a potential surrogate of Pb, V, Co, Ni, As, Ba, Cr and COD (0.949 > R > 0.808), making it reasonable to state that a treatment system with a focus on TSS removal would also reduce toxic metals in these waters. The first-flush phenomenon was evident for most of the constituents. Significant differences (p < 0.05) in the first-flush magnitude of different run-off events were observed, confirming that hydro-meteorological variables such as dry period, precipitation duration and average intensity play important roles. Metal loads originating from the log yard were mainly composed of Zn, Cu and Ba. Knowledge of the physicochemical characteristics, discharge dynamics and the storm variables involved in the process is a crucial step for the proposal and implementation of a stormwater management programme.
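
    The normalized M(V) curve used to verify first flush plots the cumulative pollutant mass fraction against the cumulative run-off volume fraction; a curve lying above the 45-degree line indicates that the early run-off carries a disproportionate share of the load. A short sketch on an invented hydrograph (the flow and concentration series are illustrative, not the study's measurements):

```python
# Normalized M(V) curve for a single run-off event: cumulative mass fraction
# versus cumulative volume fraction. A point well above the diagonal
# (e.g. >30% of the mass in the first 30% of the volume) signals first flush.
# Flow and concentration time series below are invented for illustration.

def mv_curve(flow, conc):
    """Return (volume fraction, mass fraction) pairs over the event."""
    mass_steps = [q * c for q, c in zip(flow, conc)]   # per-interval mass flux
    v_tot, m_tot = sum(flow), sum(mass_steps)
    curve, v_cum, m_cum = [], 0.0, 0.0
    for v, m in zip(flow, mass_steps):
        v_cum += v
        m_cum += m
        curve.append((v_cum / v_tot, m_cum / m_tot))
    return curve

flow = [2.0, 4.0, 5.0, 4.0, 3.0, 2.0]        # run-off hydrograph (unit time steps)
conc = [120., 90., 40., 20., 10., 5.]        # TSS-like concentration, high early
curve = mv_curve(flow, conc)
mass_at_30pct = max(m for v, m in curve if v <= 0.30)
print(mass_at_30pct)   # ~0.65 of the mass in the first 30% of the volume
```

    For this synthetic event roughly 65% of the load is carried by the first 30% of the volume, i.e. a pronounced first flush; a treatment system sized to intercept the early run-off would capture most of the TSS-bound metals.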

  19. Reproducing the scaling laws for Slow and Fast ruptures

    NASA Astrophysics Data System (ADS)

    Romanet, Pierre; Bhat, Harsha; Madariaga, Raúl

    2017-04-01

    Modelling the long-term behaviour of large, natural fault systems that are geometrically complex is a challenging problem, which is why most research so far has concentrated on modelling the long-term response of a single planar fault. To overcome this limitation, we appeal to a novel algorithm called the Fast Multipole Method, which was developed in the context of modelling gravitational N-body problems. This method decreases the computational complexity of the calculation from O(N²) to O(N log N), N being the number of discretised elements on the fault. We then adapted this method to model the long-term quasi-dynamic response of two faults, with a step-over-like geometry, that are governed by rate-and-state friction laws. We assume the faults have spatially uniform rate-weakening friction. The results show that when stress interaction between the faults is accounted for, a complex spectrum of slip (including slow-slip events, dynamic ruptures and partial ruptures) emerges naturally. The simulated slow-slip and dynamic events follow the scaling laws inferred by Ide et al. (2007), i.e. M ∝ T for slow-slip events and M ∝ T² (in 2D) for dynamic events.

  20. Preliminary Results From the Chicxulub Post-Impact Sediments: XRF and Physical Properties Data

    NASA Astrophysics Data System (ADS)

    Gebhardt, C.; Perez-Cruz, L. L.; Chenot, É.; Christeson, G. L.; Le Ber, E.; Lofi, J.; Nixon, C.; Rae, A.; Expedition 364 Science Party, I. I.

    2017-12-01

    In spring 2016, joint IODP/ICDP Expedition 364 drilled into the peak ring of the Chicxulub crater, offshore the Yucatan Peninsula, Mexico. A continuous core was drilled (Hole M0077A) that recovered a sequence of Paleogene post-impact rocks, suevites, impact-melt rocks and granitic basement between 505.7 m and 1334.7 m below sea floor (bsf). The Chicxulub crater was formed 66 million years ago by an impact, a catastrophic event directly linked to a major mass extinction. For this study, we concentrate on the post-impact sediments (505.7 to 617.3 m bsf; 48 to 66 Ma). The main goal of drilling the post-impact section was to study the pace and mode of the recovery of life in the ocean after the impact, and to analyze the paleoenvironmental changes across the Paleocene and Eocene. The late Paleocene and Eocene are characterized by a series of transient warming events, so-called hyperthermals, that were associated with increased atmospheric pCO2. Here, we present preliminary geochemical and physical properties data from the 112 m of Paleogene sediments. XRF data show high log (Ca/Ti) values between 617 and 598 m bsf (Paleocene and early Eocene), and lower values between 598 and 505 m bsf. In particular, the upper part is characterized by high-frequency fluctuations in log (Ca/Ti) reflecting repeated changes in lithology, presumably caused by Milankovitch cycles. Low log (Ba/Ti) values characterize the lowermost part of the record between 617 and 610 m bsf, followed by a gradual increase to higher values, presumably indicating an increase in primary productivity towards the end of the Paleocene. Values remain at this higher level between 605 and 540 m bsf. Hyperthermals are characterized by strong positive log (Ba/Ti) peaks, likely pointing to highly elevated primary productivity during these short-lived events. Between 540 and 505 m bsf, log (Ba/Ti) values are more variable and occasionally drop to values as low as those encountered in the lowermost part.
Like the log (Ca/Ti) curve, the log (Ba/Ti) curve is superimposed by high-frequency fluctuations, which are also strongly visible in color reflectance measurements.

  1. Response of nesting northern goshawks to logging truck noise in northern Arizona

    Treesearch

    Teryl G. Grubb; Larry L. Pater; Angela E. Gatto; David K. Delaney

    2013-01-01

    We recorded 94 sound-response events at 3 adult-occupied northern goshawk (Accipiter gentilis) nests 78 m, 143 m, and 167 m from the nearest United States Forest Service maintenance level 3, improved gravel road on the Kaibab Plateau in northern Arizona. During 4 test sessions on 7, 8, 10, and 11 June 2010, we recorded 60 experimentally controlled logging trucks; 30 non...

  2. Development of minimum standards for event-based data collection loggers and performance measure definitions for signalized intersections.

    DOT National Transportation Integrated Search

    2017-01-01

    The arterial traffic signal performance measures were not used to their fullest potential in the past. The development of traffic signal controllers with event-based, high-resolution data logging capabilities enabled the advances in derivation and vi...

  3. Discerning Trends in Performance Across Multiple Events

    NASA Technical Reports Server (NTRS)

    Slater, Simon; Hiltz, Mike; Rice, Craig

    2006-01-01

    Mass Data is a computer program that enables rapid, easy discernment of trends in performance data across multiple flights and ground tests. The program can perform Fourier analysis and other functions for frequency analysis and trending of all variables. These functions facilitate identification of past use of diagnosed systems and of anomalies in such systems, and enable rapid assessment of related current problems. Many variables that would usually require extensive manual manipulation of raw downlist data are automatically computed and made available to all users, eliminating the need for what would otherwise be an extensive amount of engineering analysis. Data from flight, ground test, and simulation are preprocessed and stored in one central location for instantaneous access and comparison for diagnostic and trending purposes. Rules are created so that an event log is generated for every flight, making it easy to locate information on similar maneuvers across many flights. The same rules can be created for test sets and simulations, and are searchable, so that information on like events is easily accessible.

  4. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
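
    The time-series side of this equivalence is a log-linear (Poisson) regression of daily counts on the shared exposure. A minimal sketch fitted by Newton's method on invented data (for a binary exposure the maximum-likelihood slope is exactly the log of the ratio of group mean counts, which gives a closed-form check):

```python
# Log-linear (Poisson) time-series regression: log E[y_t] = beta0 + beta1*x_t,
# fitted by Newton-Raphson. With a binary exposure x, the MLE slope equals
# log(mean count on exposed days / mean count on unexposed days).
# Daily counts and exposure indicators below are invented.
import math
import numpy as np

def fit_poisson(X, y, iters=25):
    """Newton-Raphson for a Poisson GLM with log link."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)                 # score vector
        hess = X.T @ (X * mu[:, None])        # Fisher information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

y = np.array([2., 3., 4., 3., 6., 5., 7., 6.])   # daily event counts
x = np.array([0., 0., 0., 0., 1., 1., 1., 1.])   # shared exposure indicator
X = np.column_stack([np.ones_like(x), x])
beta = fit_poisson(X, y)
print(beta)   # intercept -> log 3, slope -> log 2 (rate ratio of 2)
```

    The paper's point is that conditional logistic regression of the case-crossover data recovers the same exposure coefficient when the exposure is shared across subjects, while the Poisson formulation additionally exposes overdispersion and standard log-linear diagnostics.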

  5. Do natural disturbances or the forestry practices that follow them convert forests to early-successional communities?

    PubMed

    Brewer, J Stephen; Bertz, Christine A; Cannon, Jeffery B; Chesser, Jason D; Maynard, Erynn E

    2012-03-01

    Stand-replacing natural disturbances in mature forests are traditionally seen as events that cause forests to revert to early stages of succession and maintain species diversity. In some cases, however, such transitions could be an artifact of salvage logging and may increase biotic homogenization. We present initial (two-year) results of a study of the effects of tornado damage and the combined effects of tornado damage and salvage logging on environmental conditions and ground cover plant communities in mixed oak-pine forests in north central Mississippi. Plots were established in salvage-logged areas, adjacent to plots established before the storm in unlogged areas, spanning a gradient of storm damage intensity. Vegetation change directly attributable to tornado damage was driven primarily by a reduction in canopy cover but was not consistent with a transition to an early stage of succession. Although we observed post-storm increases of several disturbance indicators (ruderals), we also observed significant increases in the abundance of a few species indicative of upland forests. Increases in flowering were just as likely to occur in species indicative of forests as in species indicative of open woodlands. Few species declined as a result of the tornado, resulting in a net increase in species richness. Ruderals were very abundant in salvage-logged areas, which contained significantly higher amounts of bare ground and greater variance in soil penetrability than did damaged areas that were not logged. In contrast to unlogged areas severely damaged by the tornado, most upland forest indicators were not abundant in logged areas. Several of the forest and open-woodland indicators that showed increased flowering in damaged areas were absent or sparse in logged areas. Species richness was lower in salvage-logged areas than in adjacent damaged areas but similar to that in undamaged areas. 
These results suggest that salvage logging prevented positive responses of several forest and open-woodland species to tornado damage. Anthropogenic disturbances such as salvage logging appear to differ fundamentally from stand-level canopy-reducing disturbances in their effects on ground cover vegetation in the forests studied here and are perhaps more appropriately viewed as contributing to biotic homogenization than as events that maintain diversity.

  6. Body Fat Distribution Ratios and Obstructive Sleep Apnea Severity in Youth With Obesity.

    PubMed

    Glicksman, Amy; Hadjiyannakis, Stasia; Barrowman, Nicholas; Walker, Scott; Hoey, Lynda; Katz, Sherri Lynne

    2017-04-15

    Obesity and regional fat distribution, measured by neck fat mass percentage using dual-energy X-ray absorptiometry (DXA), correlate with obstructive sleep apnea (OSA) severity in adults. In obese children, neck-to-waist-circumference ratio predicts OSA. This study examined associations between body fat percentage and distribution and sleep-disordered breathing (SDB) severity in obese youth, measured with DXA. Cross-sectional retrospective study conducted at a tertiary children's hospital. Participants were aged 6 to 18 years with obesity (body mass index [BMI] > 99th percentile [BMI z-score 2.35] or > 95th percentile with comorbidity). They underwent polysomnography and DXA to quantify body fat percentage and distribution ratios (neck-to-abdominal fat percentage [NAF % ratio]). SDB was defined as apnea-hypopnea index (AHI) > 5 and OSA as obstructive AHI (OAHI) > 1 event/h. Relationships of BMI z-score and NAF % ratio to log AHI and log OAHI were evaluated. Thirty individuals participated; 18 male; median age 14.1 years. Twenty-four individuals had BMI z-scores > 2.35. Ten had AHI > 5 events/h. NAF % ratio was significantly associated with log AHI in males and with log OAHI in all, whereas total fat mass percent was not. The association between log OAHI and NAF % ratio was significant in males, but not females. NAF % ratio was significantly associated with log OAHI in those with BMI z-score above 2.35. NAF % ratio was associated with OSA severity in males and youth with BMI > 99th percentile; however, total fat mass percentage was not, suggesting that body fat distribution is associated with OSA risk in youth.

  7. Estimation of the displacements among distant events based on parallel tracking of events in seismic traces under uncertainty

    NASA Astrophysics Data System (ADS)

    Huamán Bustamante, Samuel G.; Cavalcanti Pacheco, Marco A.; Lazo Lazo, Juan G.

    2018-07-01

The method proposed in this paper seeks to estimate displacements of the interfaces between strata associated with seismic reflection events, relative to the interfaces at other reference points. To do so, we search for reflection events at the reference point of a second seismic trace taken from the same 3D survey, close to a well. However, the nature of the seismic data introduces uncertainty into the results. Therefore, we perform an uncertainty analysis using the standard deviation of results from several experiments with cross-correlation of signals. To estimate the displacements of events in depth between two seismic traces, we create a synthetic seismic trace from an empirical wavelet and the sonic log of the well close to the second seismic trace. Then, we relate the events of the seismic traces to the depth of the sonic log. Finally, we test the method with data from the Namorado Field in Brazil. The results show that the accuracy of the estimated event depth depends on the results of the parallel cross-correlation, primarily those from the procedures used in the integration of seismic data with data from the well. The proposed approach can correctly identify several similar events in two seismic traces without requiring all seismic traces between two distant points of interest in order to correlate strata in the subsurface.
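The core of the tracking described above is finding the lag at which one trace best aligns with another. A minimal cross-correlation sketch on synthetic data (the traces, sample count, and shift are illustrative assumptions, not the paper's data):

```python
import numpy as np

def delay_samples(trace_ref, trace_shifted):
    """Delay (in samples) of trace_shifted relative to trace_ref,
    taken from the peak of their full cross-correlation."""
    corr = np.correlate(trace_shifted, trace_ref, mode="full")
    return int(np.argmax(corr)) - (len(trace_ref) - 1)

# Synthetic example: the same event arriving 7 samples later.
rng = np.random.default_rng(1)
ref = rng.normal(size=256)
shifted = np.roll(ref, 7)
print(delay_samples(ref, shifted))  # 7
```

Repeating the fit over perturbed windows and taking the standard deviation of the recovered lags would give the kind of uncertainty estimate the abstract describes.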

  8. Applying the Ce-in-zircon oxygen geobarometer to diverse silicic magmatic systems

    NASA Astrophysics Data System (ADS)

    Claiborne, L. L.; Miller, C. F.

    2012-12-01

Zircon provides information on the age, temperature, and composition of the magma from which it grew. In systems such as Mount St. Helens, where zircon is not coeval with the rest of the crystal cargo, it provides the only accessible record of the extended history of the magmatic system, including cycles of intrusion, crystallization, and rejuvenation beneath an active volcano (Claiborne et al., 2010). The rare earth elements, which are present in measurable quantities in zircon, provide information about the composition of the magma from which zircon grew. Unique among the generally trivalent rare earth elements, cerium can exist as either trivalent or tetravalent, depending on the oxidation state of the magma. The tetravalent ion is highly compatible in zircon, in the site that usually hosts tetravalent zirconium, so the amount of cerium in zircon (relative to what would be expected for purely trivalent Ce) depends on the oxidation state of the magma from which it grew. Trail et al. (2011) proposed a calibration based on experimental data that uses the Ce anomaly in zircon as a direct proxy for magma oxygen fugacity, describing the relationship between Ce in zircon and magma oxygen fugacity as ln(Ce/Ce*)D = (0.1156 ± 0.0050) × ln(fO2) + (13860 ± 708)/T − (6.125 ± 0.484). For systems like Mount St. Helens, where the major minerals (including the Fe-Ti oxides traditionally relied upon for records of magma oxidation state) record only events in the hundreds to thousands of years leading to eruption, this presents a novel approach for understanding more extended histories of magma oxidation over the tens to hundreds of thousands of years of magmatism at a volcanic center. This calibration also promises to help better constrain conditions of crystallization in intrusive portions of volcanic systems, as well as in plutonic bodies.
We apply this new oxygen geobarometer to natural volcanic and plutonic zircons from a variety of tectonic settings, and compare the results to existing indicators of oxidation state for each system, where available. Zircons included in this study are from Mount St. Helens (ΔNNO +1.5 log units; Smith, 1984), the Peach Spring Tuff and Spirit Mountain Batholith (sphene-bearing, silicic, Miocene-aged rocks from the Colorado River Extensional Corridor), Alid Volcano in Eritrea, and rhyolites and granites from Iceland. Median log fO2 values for these systems, calculated from the cerium anomaly in zircons following Trail et al. (2011) using temperatures from Ti-in-zircon thermometry (Ferry and Watson, 2007), are as follows: Alid, -12 bars (ΔNNO +3 log units) at 750 degrees C; Iceland, -11 bars (ΔNNO +3 log units) at 800 degrees C; Mount St. Helens, -8.6 bars (ΔNNO +6 log units) at 750 degrees C; Peach Spring Tuff, -3.4 bars (ΔNNO +10 log units) at 830 degrees C. While ubiquitous sphene in the Spirit Mountain granites suggests relatively high fO2, calculations based on the cerium anomaly in zircon suggest a median log fO2 of >0 at 770 degrees C, which is certainly erroneous. While median values for our natural zircons are, for the most part, above expected fugacities for each system when compared with other indicators, and extreme values for each system are almost certainly erroneous, many are within expected values for terrestrial magmas, and they vary relative to one another as might be expected given the magma types and tectonic settings.
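The Trail et al. (2011) calibration quoted above can be inverted to recover oxygen fugacity from a measured Ce anomaly and a Ti-in-zircon temperature. A minimal sketch using only the central calibration values (the published uncertainties are omitted, and the sample numbers are illustrative):

```python
import math

def ln_ce_anomaly(f_o2: float, T: float) -> float:
    """Trail et al. (2011) calibration: ln(Ce/Ce*)D as a function of
    oxygen fugacity f_o2 (bars) and temperature T (kelvin).
    Central values only; uncertainties on each coefficient omitted."""
    return 0.1156 * math.log(f_o2) + 13860.0 / T - 6.125

def log10_f_o2(ln_anomaly: float, T: float) -> float:
    """Invert the calibration: log10(fO2) from a measured ln(Ce/Ce*)
    anomaly at temperature T (kelvin)."""
    ln_f_o2 = (ln_anomaly - 13860.0 / T + 6.125) / 0.1156
    return ln_f_o2 / math.log(10)

# Round-trip sanity check at 750 degrees C:
T = 750 + 273.15
anomaly = ln_ce_anomaly(10 ** -12, T)
print(round(log10_f_o2(anomaly, T), 3))  # recovers -12.0
```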

  9. Study on Radiation Condition in DAMPE Orbit by Analyzing the Engineering Data of BGO Calorimeter

    NASA Astrophysics Data System (ADS)

    Feng, Changqing; Liu, Shubin; Zhang, Yunlong; Ma, Siyuan

    2016-07-01

The DAMPE (DArk Matter Particle Explorer) is a scientific satellite that was successfully launched into a 500 km sun-synchronous orbit on December 17th, 2015, from the Jiuquan Satellite Launch Center of China. The major scientific objectives of the DAMPE mission are primary cosmic-ray physics, gamma-ray astronomy, and the search for dark matter particles, pursued by observing high-energy primary cosmic rays, especially positrons/electrons and gamma rays, in an energy range from 5 GeV to 10 TeV. The BGO calorimeter is a critical sub-detector of the DAMPE payload, responsible for measuring the energy of cosmic particles, distinguishing positrons/electrons and gamma rays from the hadron background, and providing trigger information. It uses 308 BGO (bismuth germanate oxide) crystal logs, each 2.5 cm × 2.5 cm × 60 cm, to form a total-absorption electromagnetic calorimeter. The BGO logs are stacked in 14 layers, with each layer consisting of 22 crystal logs, and each log is viewed from both ends by two Hamamatsu R5610A PMTs (photomultiplier tubes). To achieve a large dynamic range, each PMT base incorporates a three-dynode (2, 5, 8) pick-off, resulting in 616 PMTs and 1848 signal channels. The readout electronics system, consisting of 16 FEE (Front End Electronics) modules, was developed around a flash-based FPGA (Field Programmable Gate Array) chip and low-power, 32-channel VA160 and VATA160 ASICs (Application Specific Integrated Circuits), which precisely measure the charge of the PMT signals and provide "hit" signals as well. The hit signals are sent to the trigger module of the PDPU (Payload Data Process Unit), and the hit rate of each layer is recorded in real time by counters and packed into the engineering data, directly reflecting the flux of particles that enter or pass through the detectors.
To mitigate the SEU (Single Event Upset) effect in the space radiation environment, protective measures such as TMR (Triple Modular Redundancy) and CRC (Cyclic Redundancy Check) were adopted for critical registers in the FPGA logic. To mitigate the SEL (Single Event Latch-up) effect in the ASIC chips, a protective solution that monitors the current of the VA160/VATA160 chips is applied. All SEU and SEL events are recorded by counters and transmitted to the ground station in the form of engineering data. The hit rates and the SEU and SEL counters in the engineering data can be used to evaluate the radiation condition and its variations in the DAMPE orbit. This paper introduces preliminary results based on the engineering data from the first six months after launch.
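The TMR scheme mentioned above protects a register by keeping three copies and taking a bitwise majority vote on readout, so a single upset in one copy cannot corrupt the result. A minimal sketch of the voting logic (a software illustration only; the flight design implements this in FPGA hardware):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant register copies:
    each output bit is set iff at least two of the three copies agree."""
    return (a & b) | (b & c) | (a & c)

# A single-event upset flips one bit in one copy; the vote masks it.
value = 0b10110010
upset = value ^ 0b00001000   # bit 3 flipped by an SEU
print(bin(tmr_vote(value, upset, value)))  # 0b10110010, value restored
```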

  10. Well network installation and hydrogeologic data collection, Assateague Island National Seashore, Worcester County, Maryland, 2010

    USGS Publications Warehouse

    Banks, William S.L.; Masterson, John P.; Johnson, Carole D.

    2012-01-01

    The U.S. Geological Survey, as part of its Climate and Land Use Change Research and Development Program, is conducting a multi-year investigation to assess potential impacts on the natural resources of Assateague Island National Seashore, Maryland that may result from changes in the hydrologic system in response to projected sea-level rise. As part of this effort, 26 monitoring wells were installed in pairs along five east-west trending transects. Each of the five transects has between two and four pairs of wells, consisting of a shallow well and a deeper well. The shallow well typically was installed several feet below the water table—usually in freshwater about 10 feet below land surface (ft bls)—to measure water-level changes in the shallow groundwater system. The deeper well was installed below the anticipated depth to the freshwater-saltwater interface—usually in saltwater about 45 to 55 ft bls—for the purpose of borehole geophysical logging to characterize local differences in lithology and salinity and to monitor tidal influences on groundwater. Four of the 13 shallow wells and 5 of the 13 deeper wells were instrumented with water-level recorders that collected water-level data at 15-minute intervals from August 12 through September 28, 2010. Data collected from these instrumented wells were compared with tide data collected north of Assateague Island at the Ocean City Inlet tide gage, and precipitation data collected by National Park Service staff on Assateague Island. These data indicate that precipitation events coupled with changes in ambient sea level had the largest effect on groundwater levels in all monitoring wells near the Atlantic Ocean and Chincoteague and Sinepuxent Bays, whereas precipitation events alone had the greatest impact on shallow groundwater levels near the center of the island. 
Daily and bi-monthly tidal cycles appeared to have minimal influence on groundwater levels throughout the island and the water-level changes that were observed appeared to vary among well sites, indicating that changes in lithology and salinity also may affect the response of water levels in the shallow and deeper groundwater systems throughout the island. Borehole geophysical logs were collected at each of the 13 deeper wells along the 5 transects. Electromagnetic induction logs were collected to identify changes in lithology; determine the approximate location of the freshwater-saltwater interface; and characterize the distribution of fresh and brackish water in the shallow aquifer, and the geometry of the fresh groundwater lens beneath the island. Natural gamma logs were collected to provide information on the geologic framework of the island including the presence and thickness of finer-grained deposits found in the subsurface throughout the island during previous investigations. Results of this investigation show the need for collection of continuous water-level data in both the shallow and deeper parts of the flow system and electromagnetic induction and natural gamma geophysical logging data to better understand the response of this groundwater system to changes in precipitation and tidal forcing. Hydrologic data collected as part of this investigation will serve as the foundation for the development of numerical flow models to assess the potential effects of climate change on the coastal groundwater system of Assateague Island.

  11. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system must be developed before aerial logging becomes effective and accepted in the logging industry. This paper presents such a system, designed on simple principles with realistic cost and ecological benefits.

  12. 3-D Vp/Vs Ratio Distribution in the Geothermal Reservoir at Basel, Switzerland, from Microseismic Data

    NASA Astrophysics Data System (ADS)

    Kummerow, J.; Reshetnikov, A.; Häring, M.; Asanuma, H.

    2012-12-01

Thousands of microseismic events occurred during and after the stimulation of the 4.5 km deep Basel 1 well at the Deep Heat Mining Project in Basel, Switzerland, in December 2006. The located seismicity extends about 1 km vertically and about 1 km in the NNW-SSE direction, consistent with the orientation of the maximum horizontal stress. In this study, we analyze 2100 events with magnitudes Mw > 0.0 that were recorded by six borehole seismometers between December 2, 2006, and June 7, 2007. We first identify event multiplets based on waveform similarity and apply an automatic, iterative arrival-time optimization to calculate high-precision P and S time picks for the multiplet events. Local estimates of the Vp/Vs ratio in the stimulated Basel geothermal reservoir are then obtained from the slope of the demeaned differential S versus P arrival times. The average value of Vp/Vs = 1.70 is close to the characteristic reservoir value of 1.72, which was determined independently from sonic log measurements. Also, in the vicinity of the borehole, the depth distribution of Vp/Vs correlates well with the low-pass filtered sonic log data: Vp/Vs values are less than 1.70 at the top of the seismicity cloud at <3.9 km depth, close to average at 4.0-4.4 km depth, and exceed 1.75 at greater depths (4.4-4.6 km), consistent with the sonic log data. Furthermore, we observe a correlation of anomalous Vp/Vs values with zones of enhanced seismic reflectivity that were resolved by microseismic reflection imaging. Away from the borehole, increased Vp/Vs ratios also seem to correlate with domains of high event density, possibly indicating fluid migration paths.
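The Vp/Vs estimate described above amounts to a least-squares slope fit of demeaned differential S picks against demeaned differential P picks. A sketch on synthetic picks (the pick values, noise level, and true ratio are illustrative assumptions, not the Basel data):

```python
import numpy as np

# Synthetic differential arrival times (seconds), for illustration only:
# S differentials scale with P differentials by the true Vp/Vs ratio.
rng = np.random.default_rng(0)
dt_p = rng.uniform(-0.5, 0.5, 200)               # differential P times
dt_s = 1.70 * dt_p + rng.normal(0, 0.005, 200)   # differential S times + pick noise

# Demean both series and fit the slope dt_s = (Vp/Vs) * dt_p by least squares.
p = dt_p - dt_p.mean()
s = dt_s - dt_s.mean()
vp_vs = (p @ s) / (p @ p)
print(round(vp_vs, 2))  # close to the assumed 1.70
```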

  13. Study of Magnitudes, Seismicity and Earthquake Detectability Using a Global Network

    DTIC Science & Technology

    1984-06-01

The stations are further classified into three groups, including (A) stations reporting a P-detection with an associated log(A/T) value. Detections, nondetections, and reported log(A/T) values for the j'th event are modeled given that its true magnitude is μj. As a first-order approximation, only the subset for which log(A/T) was reported is used.

  14. High-throughput determination of octanol/water partition coefficients using a shake-flask method and novel two-phase solvent system.

    PubMed

    Morikawa, Go; Suzuka, Chihiro; Shoji, Atsushi; Shibusawa, Yoichi; Yanagida, Akio

    2016-01-05

A high-throughput method for determining the octanol/water partition coefficient (P(o/w)) of a large variety of compounds exhibiting a wide range in hydrophobicity was established. The method combines a simple shake-flask method with a novel two-phase solvent system comprising an acetonitrile-phosphate buffer (0.1 M, pH 7.4)-1-octanol (25:25:4, v/v/v; AN system). The AN system partition coefficients (K(AN)) of 51 standard compounds for which log P(o/w) (at pH 7.4; log D) values had been reported were determined by single two-phase partitioning in test tubes, followed by measurement of the solute concentration in both phases using an automatic flow injection-ultraviolet detection system. The log K(AN) values were closely related to the reported log D values, and the relationship could be expressed by the following linear regression equation: log D = 2.8630 log K(AN) - 0.1497 (n = 51). The relationship reveals that log D values (+8 to -8) for a large variety of highly hydrophobic and/or hydrophilic compounds can be estimated indirectly from the narrow range of log K(AN) values (+3 to -3) determined using the present method. Furthermore, log K(AN) values for highly polar compounds for which no log D values have been reported, such as amino acids, peptides, proteins, nucleosides, and nucleotides, can be estimated using the present method. The wide-ranging log D values (+5.9 to -7.5) of these molecules were estimated for the first time from their log K(AN) values and the above regression equation. Copyright © 2015 Elsevier B.V. All rights reserved.
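The regression quoted above maps a measured AN-system partition coefficient onto an estimated log D. A minimal sketch of that conversion (the sample K(AN) value is illustrative):

```python
import math

def log_d_from_kan(k_an: float) -> float:
    """Estimate log D (octanol/water, pH 7.4) from the AN-system
    partition coefficient K(AN), via the paper's regression:
    log D = 2.8630 * log K(AN) - 0.1497 (n = 51)."""
    return 2.8630 * math.log10(k_an) - 0.1497

# A K(AN) of 10**3 (the top of the measurable range) maps to log D ~ +8.4,
# illustrating how the narrow K(AN) range spans the wide log D range.
print(round(log_d_from_kan(1000.0), 3))  # 8.439
```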

  15. Testing the Quick Seismic Event Locator and Magnitude Calculator (SSL_Calc) by Marsite Project Data Base

    NASA Astrophysics Data System (ADS)

    Tunc, Suleyman; Tunc, Berna; Caka, Deniz; Baris, Serif

    2016-04-01

Quickly locating seismic events and calculating their size is one of the most important and challenging issues, especially in real-time seismology. In this study, we developed a Matlab application, called SSL_Calc, to locate seismic events and calculate their magnitudes (local magnitude and empirical moment magnitude) using a single station. This newly developed software has been tested on all stations of the Marsite project "New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite-MARsite". The SSL_Calc algorithm is suitable for both velocity and acceleration sensors. Data have to be in GCF (Güralp Compressed Format). Online or offline data can be selected in the SCREAM software (belonging to Guralp Systems Limited) and transferred to SSL_Calc. To locate an event, P and S wave picks have to be marked manually in the SSL_Calc window. During magnitude calculation, the instrument response is removed and the record is converted to real displacement in millimeters. The displacement data are then converted to Wood-Anderson seismometer output using the parameters Z=[0;0]; P=[-6.28+4.71j; -6.28-4.71j]; A0=[2080]. For local magnitude calculation, the maximum displacement amplitude (A) and distance (dist) are used in formula (1) for distances up to 200 km and formula (2) beyond 200 km: ML=log10(A)-(-1.118-0.0647*dist+0.00071*dist^2-3.39E-6*dist^3+5.71E-9*dist^4) (1); ML=log10(A)+(2.1173+0.0082*dist-0.0000059628*dist^2) (2). Following the local magnitude calculation, the code calculates two empirical moment magnitudes using formulas (3), from Akkar et al. (2010), and (4), from Ulusay et al. (2004): Mw=0.953*ML+0.422 (3); Mw=0.7768*ML+1.5921 (4). SSL_Calc is easy to implement and user friendly, and it offers a practical solution for individual users to locate events and calculate ML and Mw.
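The distance-dependent ML formulas and the two empirical Mw conversions quoted above can be transcribed directly (a minimal sketch; A is the maximum Wood-Anderson displacement in mm, dist is in km, and the sample inputs are illustrative):

```python
import math

def local_magnitude(A: float, dist: float) -> float:
    """ML from maximum Wood-Anderson displacement A (mm) and
    distance dist (km), per formulas (1) and (2) of the abstract."""
    if dist <= 200.0:
        return math.log10(A) - (-1.118 - 0.0647 * dist
                                + 0.00071 * dist ** 2
                                - 3.39e-6 * dist ** 3
                                + 5.71e-9 * dist ** 4)
    return math.log10(A) + (2.1173 + 0.0082 * dist
                            - 0.0000059628 * dist ** 2)

def moment_magnitudes(ml: float) -> tuple:
    """Empirical Mw per Akkar et al. (2010) and Ulusay et al. (2004)."""
    return 0.953 * ml + 0.422, 0.7768 * ml + 1.5921

ml = local_magnitude(1.0, 100.0)
print(round(ml, 3), [round(m, 3) for m in moment_magnitudes(ml)])
```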

  16. Parental perceptions of the learner driver log book system in two Australian states.

    PubMed

    Bates, Lyndel; Watson, Barry; King, Mark Johann

    2014-01-01

    Though many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learner drivers to complete a log book to record this practice and then present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book that records their practice, thereby confirming to the licensing authority that they have met the mandated hours of practice requirement. These log books facilitate the management and enforcement of minimum supervised hours of driving requirements. Parents of learner drivers in 2 Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child's learner log book and the effectiveness of the log book system. The study indicates that the large majority of parents believe that their child's learner log book is accurate. However, they generally report that the log book system is only moderately effective as a system to measure the number of hours of supervised practice a learner driver has completed. The results of this study suggest the presence of a paradox, with many parents possibly believing that others are not as diligent in the use of log books as they are or that the system is too open to misuse. Given that many parents report that their child's log book is accurate, this study has important implications for the development and ongoing monitoring of hours of practice requirements in graduated driver licensing systems.

  17. Surface Soil Changes Following Selective Logging in an Eastern Amazon Forest

    NASA Technical Reports Server (NTRS)

    Olander, Lydia P.; Bustamante, Mercedes M.; Asner, Gregory P.; Telles, Everaldo; Prado, Zayra; Camargo, Plinio B.

    2005-01-01

In the Brazilian Amazon, selective logging is second only to forest conversion in its extent. Conversion to pasture or agriculture tends to reduce soil nutrients and site productivity over time unless fertilizers are added. Logging removes nutrients in bole wood, enough that repeated logging could deplete essential nutrients over time. After a single logging event, nutrient losses are likely to be too small to observe in the large soil nutrient pools, but disturbances associated with logging also alter soil properties. Selective logging, particularly reduced-impact logging, results in consistent patterns of disturbance that may be associated with particular changes in soil properties. Soil bulk density, pH, carbon (C), nitrogen (N), phosphorus (P), calcium (Ca), magnesium (Mg), potassium (K), iron (Fe), aluminum (Al), delta(sup 13)C, delta(sup 15)N, and P fractionations were measured on the soils of four different types of logging-related disturbances: roads, decks, skids, and treefall gaps. Litter biomass and percent bare ground were also determined in these areas. To evaluate the importance of fresh foliage inputs from downed tree crowns in treefall gaps, foliar nutrients for mature forest trees were also determined and compared to those of fresh litterfall. The immediate impacts of logging on soil properties, and how these might link to the longer-term estimated nutrient losses and the observed changes in soils, were studied.

  18. A Flight/Ground/Test Event Logging Facility

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel

    1999-01-01

    The onboard control software for spacecraft such as Mars Pathfinder and Cassini is composed of many subsystems including executive control, navigation, attitude control, imaging, data management, and telecommunications. The software in all of these subsystems needs to be instrumented for several purposes: to report required telemetry data, to report warning and error events, to verify internal behavior during system testing, and to provide ground operators with detailed data when investigating in-flight anomalies. Events can range in importance from purely informational events to major errors. It is desirable to provide a uniform mechanism for reporting such events and controlling their subsequent processing. Since radiation-hardened flight processors are several years behind the speed and memory of their commercial cousins, and since most subsystems require real-time control, and since downlink rates to earth can be very low from deep space, there are limits to how much of the data can be saved and transmitted. Some kinds of events are more important than others and should therefore be preferentially retained when memory is low. Some faults can cause an event to recur at a high rate, but this must not be allowed to consume the memory pool. Some event occurrences may be of low importance when reported but suddenly become more important when a subsequent error event gets reported. Some events may be so low-level that they need not be saved and reported unless specifically requested by ground operators.
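The requirements sketched in this abstract (severity levels, a bounded memory pool, and rate limiting so a recurring fault cannot consume that pool) can be illustrated with a toy event log. This is a hypothetical sketch, not the actual flight facility; the class, capacities, and policy are assumptions chosen for illustration:

```python
class EventLog:
    """Toy event-logging facility: severity-tagged events, per-type
    rate limiting, and preferential retention of important events
    when the bounded pool overflows. Hypothetical sketch only."""

    def __init__(self, capacity: int = 100, max_per_type: int = 10):
        self.capacity = capacity          # bounded memory pool
        self.max_per_type = max_per_type  # rate limit per event type
        self.events = []                  # (severity, type, message)
        self.counts = {}

    def report(self, severity: int, event_type: str, message: str) -> bool:
        n = self.counts.get(event_type, 0)
        if n >= self.max_per_type:        # recurring fault: drop it so it
            return False                  # cannot consume the memory pool
        self.counts[event_type] = n + 1
        self.events.append((severity, event_type, message))
        if len(self.events) > self.capacity:
            # When memory is low, discard the least important stored event.
            victim = min(range(len(self.events)),
                         key=lambda i: self.events[i][0])
            del self.events[victim]
        return True
```

A fuller design would also cover the abstract's other requirement, promoting earlier low-importance events when a related error arrives later.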

  19. Transient Volcano Deformation Event Detection over Variable Spatial Scales in Alaska

    NASA Astrophysics Data System (ADS)

    Li, J. D.; Rude, C. M.; Gowanlock, M.; Herring, T.; Pankratius, V.

    2016-12-01

Transient deformation events driven by volcanic activity can be monitored using increasingly dense networks of continuous Global Positioning System (GPS) ground stations. The wide spatial extent of GPS networks, the large number of GPS stations, and the spatially and temporally varying scale of deformation events result in the mixing of signals from multiple sources. Typical analysis then necessitates manual identification of times and regions of volcanic activity for further study and the careful tuning of algorithmic parameters to extract possible transient events. Here we present a computer-aided discovery system that facilitates the discovery of potential transient deformation events at volcanoes by providing a framework for selecting varying spatial regions of interest and for tuning the analysis parameters. This site specification step in the framework reduces the spatial mixing of signals from different volcanic sources before applying filters to remove interfering signals originating from other geophysical processes. We analyze GPS data recorded by the Plate Boundary Observatory network and volcanic activity logs from the Alaska Volcano Observatory to search for and characterize transient inflation events in Alaska. We find 3 transient inflation events between 2008 and 2015 at the Akutan, Westdahl, and Shishaldin volcanoes in the Aleutian Islands. The inflation event detected in the first half of 2008 at Akutan is validated by other studies, while the inflation events observed in early 2011 at Westdahl and in early 2013 at Shishaldin are previously unreported. Our analysis framework also incorporates modelling of the transient inflation events and enables a comparison of different magma chamber inversion models. Here, we also estimate the magma sources that best describe the deformation observed by the GPS stations at Akutan, Westdahl, and Shishaldin. We acknowledge support from NASA AIST-NNX15AG84G (PI: V. Pankratius).

  20. Use phase signals to promote lifetime extension for Windows PCs.

    PubMed

    Hickey, Stewart; Fitzpatrick, Colin; O'Connell, Maurice; Johnson, Michael

    2009-04-01

This paper proposes a signaling methodology for personal computers. Signaling may be viewed as an ecodesign strategy that can positively influence the consumer to consumer (C2C) market process. A number of parameters are identified that can provide the basis for signal implementation. These include operating time, operating temperature, operating voltage, power cycle counts, hard disk drive (HDD) self-monitoring, analysis, and reporting technology (SMART) attributes, and operating system (OS) event information. All these parameters are currently attainable or derivable via embedded technologies in modern desktop systems. A case study detailing a technical implementation of how the development of signals can be achieved in personal computers that incorporate Microsoft Windows operating systems is presented. Collation of lifetime temperature data from a system processor is demonstrated as a possible means of characterizing a usage profile for a desktop system. In addition, event log data is utilized for devising signals indicative of OS quality. The provision of lifetime usage data in the form of intuitive signals indicative of both hardware and software quality can, in conjunction with consumer education, facilitate an optimal remarketing strategy for used systems. This implementation requires no additional hardware.

  1. Effects of Obstructive Sleep Apnea and Obesity on Cardiac Remodeling: The Wisconsin Sleep Cohort Study.

    PubMed

    Korcarz, Claudia E; Peppard, Paul E; Young, Terry B; Chapman, Carrie B; Hla, K Mae; Barnet, Jodi H; Hagen, Erika; Stein, James H

    2016-06-01

    To characterize the prospective associations of obstructive sleep apnea (OSA) with future echocardiographic measures of adverse cardiac remodeling. This was a prospective long-term observational study. Participants had overnight polysomnography followed by transthoracic echocardiography a mean (standard deviation) of 18.0 (3.7) y later. OSA was characterized by the apnea-hypopnea index (AHI, events/hour). Echocardiography was used to assess left ventricular (LV) systolic and diastolic function and mass, left atrial volume and pressure, cardiac output, systemic vascular resistance, and right ventricular (RV) systolic function, size, and hemodynamics. Multivariate regression models estimated associations between log10(AHI+1) and future echocardiographic findings. A secondary analysis looked at oxygen desaturation indices and future echocardiographic findings. At entry, the 601 participants were mean (standard deviation) 47 (8) y old (47% female). After adjustment for age, sex, and body mass index, baseline log10(AHI+1) was associated significantly with future reduced LV ejection fraction and tricuspid annular plane systolic excursion (TAPSE) ≤ 15 mm. After further adjustment for cardiovascular risk factors, participants with higher baseline log10(AHI+1) had lower future LV ejection fraction (β = -1.35 [standard error = 0.6]/log10(AHI+1), P = 0.03) and higher odds of TAPSE ≤ 15 mm (odds ratio = 6.3/log10(AHI+1), 95% confidence interval = 1.3-30.5, P = 0.02). SaO2 desaturation indices were associated independently with LV mass, LV wall thickness, and RV area (all P < 0.03). OSA is associated independently with decreasing LV systolic function and with reduced RV function. Echocardiographic measures of adverse cardiac remodeling are strongly associated with OSA but are confounded by obesity. Hypoxia may be a stimulus for hypertrophy in individuals with OSA. © 2016 Associated Professional Sleep Societies, LLC.

  2. Testing and comparison of three frequency-based magnitude estimating parameters for earthquake early warning based events in the Yunnan region, China in 2014

    NASA Astrophysics Data System (ADS)

    Zhang, Jianjing; Li, Hongjie

    2018-06-01

To mitigate potential seismic disasters in the Yunnan region, China, suitable magnitude-estimation scaling laws for an earthquake early warning system (EEWS) are in high demand. In this paper, the records from the main and after-shocks of the Yingjiang earthquake (MW 5.9), the Ludian earthquake (MW 6.2) and the Jinggu earthquake (MW 6.1), which occurred in Yunnan in 2014, were used to develop three estimators of earthquake magnitude: the maximum of the predominant period (τp^max), the characteristic period (τc) and the log-average period (τlog). The correlations between these three frequency-based parameters and catalog magnitudes were developed, compared and evaluated against previous studies. The amplitude and period of seismic waves might be amplified in the Ludian mountain-canyon area by multiple reflections and resonance, leading to excessive values of the calculated parameters, which are consistent with Sichuan's scaling. As a result, τlog was best correlated with magnitude and τc had the highest slope of the regression equation, while τp^max performed worst, with large scatter and little sensitivity to changes in magnitude. No evident saturation occurred for M 6.1 and M 6.2 in this study. Even though τc and τlog performed similarly and both reflect the size of the earthquake well, τlog has slightly smaller prediction errors for small earthquakes (M ≤ 4.5), which was also observed in previous research. Our work offers insight into the feasibility of an EEWS in Yunnan, China, and shows that it is necessary to build up an appropriate scaling law suitable for the warning region.

  3. Lithology identification of aquifers from geophysical well logs and fuzzy logic analysis: Shui-Lin Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Hsieh, Bieng-Zih; Lewis, Charles; Lin, Zsay-Shing

    2005-04-01

The purpose of this study is to construct a fuzzy lithology system from well logs to identify the formation lithology of a groundwater aquifer system. This is needed to better apply conventional well-logging interpretation in hydrogeologic studies, because the well-log responses of aquifers sometimes differ from those of conventional oil and gas reservoirs. The input variables for this system are the gamma-ray log reading, the separation between the spherically focused resistivity and the deep very-enhanced resistivity curves, and the borehole compensated sonic log reading. The output variable is groundwater formation lithology. All linguistic variables are based on five linguistic terms with a trapezoidal membership function. In this study, 50 data sets are clustered into 40 training sets and 10 testing sets for constructing the fuzzy lithology system and validating the ability of system prediction, respectively. The rule-based database containing 12 fuzzy lithology rules is developed from the training data sets, and the rule strength is weighted. A Mamdani inference system and the bisector-of-area defuzzification method are used for fuzzy inference and defuzzification. The success of the training performance and the prediction ability were both 90%, with calculated correlations for training and testing of 0.925 and 0.928, respectively. Well logs and core data from a clastic aquifer (depths 100-198 m) in the Shui-Lin area of west-central Taiwan are used for testing the system's construction. Comparison of results from core analysis, well logging and the fuzzy lithology system indicates that even though the well-logging method can easily define a permeable sand formation, distinguishing between silts and sands and determining grain-size variation in sands is more subjective. These shortcomings can be improved by a fuzzy lithology system that is able to yield more objective decisions than some conventional methods of log interpretation.
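Each linguistic term above is represented by a trapezoidal membership function. A minimal sketch of the membership calculation (the breakpoints are hypothetical, not the paper's calibrated values):

```python
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership degree: 0 outside [a, d], rising linearly
    on [a, b], 1 on the plateau [b, c], falling linearly on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Example: a hypothetical "medium" gamma-ray term (API units assumed).
print(trapezoid(50.0, 40.0, 60.0, 80.0, 100.0))  # 0.5, on the rising edge
print(trapezoid(70.0, 40.0, 60.0, 80.0, 100.0))  # 1.0, on the plateau
```

In a Mamdani system such as the one described, each rule's strength comes from combining membership degrees like these across the input logs, and the bisector-of-area method turns the aggregated output set back into a crisp lithology decision.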

  4. Composition and Temperature Dependence of Shear Viscosity of Hydrocarbon Mixtures

    DTIC Science & Technology

    1980-07-01

    Excerpt from the report's table and figure list: Table IX, VTF equation parameters for shear viscosities using a constant B parameter (HNN-XTHDCPD binary system); Table X, results of fits to master viscosity equations; Fig. 1, log η versus 10³/T(K) for five C10 hydrocarbons; Fig. 2a, log η versus 10³/T(K) for HNN; Fig. 2b, log η versus 10³/T(K) for XTHDCPD; Fig. 3, isotherm of log η versus composition for the CO-MO binary system; Fig. 4, isotherm of log η versus X_NBC for the NBC-DMO binary system; Fig. 5, isotherm of log η versus X_HNN for the HNN-XTHDCPD binary system.

  5. 6. Log calving barn. Interior view showing log post-and-beam support ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT

  6. Elevated plasma angiotensin converting enzyme 2 activity is an independent predictor of major adverse cardiac events in patients with obstructive coronary artery disease.

    PubMed

    Ramchand, Jay; Patel, Sheila K; Srivastava, Piyush M; Farouque, Omar; Burrell, Louise M

    2018-01-01

    Angiotensin converting enzyme 2 (ACE2) is an endogenous regulator of the renin angiotensin system. Increased circulating ACE2 predicts adverse outcomes in patients with heart failure (HF), but it is unknown if elevated plasma ACE2 activity predicts major adverse cardiovascular events (MACE) in patients with obstructive coronary artery disease (CAD). We prospectively recruited patients with obstructive CAD (defined as ≥50% stenosis of the left main coronary artery and/or ≥70% stenosis in ≥ 1 other major epicardial vessel on invasive coronary angiography) and measured plasma ACE2 activity. Patients were followed up to determine if circulating ACE2 activity levels predicted the primary endpoint of MACE (cardiovascular mortality, HF or myocardial infarction). We recruited 79 patients with obstructive coronary artery disease. The median (IQR) plasma ACE2 activity was 29.3 pmol/ml/min [21.2-41.2]. Over a median follow-up of 10.5 years [9.6-10.8 years], MACE occurred in 46% of patients (36 events). On Kaplan-Meier analysis, above-median plasma ACE2 activity was associated with MACE (log-rank test, p = 0.035) and HF hospitalisation (p = 0.01). After Cox multivariable adjustment, log ACE2 activity remained an independent predictor of MACE (hazard ratio (HR) 2.4, 95% confidence interval (CI) 1.24-4.72, p = 0.009) and HF hospitalisation (HR: 4.03, 95% CI: 1.42-11.5, p = 0.009). Plasma ACE2 activity independently increased the hazard of adverse long-term cardiovascular outcomes in patients with obstructive CAD.
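    The Kaplan-Meier analysis reported here rests on the textbook product-limit estimator, which can be sketched directly. The follow-up times and event indicators below are synthetic, not the study's data:

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit (Kaplan-Meier) survival estimate.

    time:  follow-up time for each subject
    event: 1 if the endpoint occurred, 0 if censored
    Returns {event time: S(t)}, the estimated survival probability."""
    time, event = np.asarray(time), np.asarray(event)
    surv, s = {}, 1.0
    for t in np.unique(time[event == 1]):        # distinct event times, sorted
        at_risk = np.sum(time >= t)              # subjects still under observation
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk              # multiply in this step's survival
        surv[float(t)] = s
    return surv

# Synthetic cohort: events at t=1, 2, 4; one subject censored at t=3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

    Censored subjects leave the risk set without contributing a death, which is what separates this from a naive empirical survival fraction.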

  7. A three-dimensional optimal sawing system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang; R. Edward. Thomas

    2011-01-01

    A three-dimensional (3D) log sawing optimization system was developed to perform 3D log generation, opening face determination, sawing simulation, and lumber grading. Superficial characteristics of logs such as length, large-end and small-end diameters, and external defects were collected from local sawmills. Internal log defect positions and shapes were predicted...

  8. Challenges in converting among log scaling methods.

    Treesearch

    Henry Spelter

    2003-01-01

    The traditional method of measuring log volume in North America is the board foot log scale, which uses simple assumptions about how much of a log's volume is recoverable. This underestimates the true recovery potential and leads to difficulties in comparing volumes measured with the traditional board foot system and those measured with the cubic scaling systems...
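    The conversion difficulty described above can be made concrete with two standard log rules: the Doyle board-foot rule and the Smalian cubic formula. Because the board-foot/cubic ratio shifts with log size, no single conversion factor fits all logs. The example diameters and lengths are arbitrary:

```python
import math

def doyle_board_feet(small_end_dia_in, length_ft):
    """Doyle rule: a traditional board-foot scale whose simple recovery
    assumptions penalize small logs heavily."""
    return ((small_end_dia_in - 4) / 4.0) ** 2 * length_ft

def smalian_cubic_feet(small_end_dia_in, large_end_dia_in, length_ft):
    """Smalian formula: cubic volume from the average of the two end areas."""
    area_small = math.pi * (small_end_dia_in / 24.0) ** 2   # inches -> radius in ft
    area_large = math.pi * (large_end_dia_in / 24.0) ** 2
    return (area_small + area_large) / 2.0 * length_ft

# The board-foot/cubic ratio drifts upward with log diameter, so a factor
# calibrated on small logs misstates the volume of large ones.
small_log_ratio = doyle_board_feet(10, 16) / smalian_cubic_feet(10, 12, 16)
large_log_ratio = doyle_board_feet(24, 16) / smalian_cubic_feet(24, 26, 16)
```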

  9. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on the forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst≥850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42,2.41] times per century; a 100-yr magnetic storm is identified as having −Dst≥880 nT (greater than Carrington), but with a wide 95% confidence interval of [490,1187] nT.
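    A maximum-likelihood log-normal fit with an exceedance forecast, in the spirit of the method described, can be sketched on synthetic data. The distribution parameters and the assumed storm-observation rate are made up, not the paper's −Dst values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic storm maxima (nT), log-normally distributed with made-up parameters
maxima = rng.lognormal(mean=np.log(120), sigma=0.8, size=500)

# Maximum-likelihood fit of a two-parameter log-normal (location fixed at 0)
shape, loc, scale = stats.lognorm.fit(maxima, floc=0)

# Probability that a single storm maximum exceeds a Carrington-class threshold
p_exceed = stats.lognorm.sf(850.0, shape, loc, scale)

# With an assumed number of storms per century, the expected exceedance rate
storms_per_century = 500          # hypothetical observation rate
rate_per_century = storms_per_century * p_exceed
```

    Bootstrap confidence limits, as in the paper, would come from refitting resampled versions of `maxima` and collecting the spread of `rate_per_century`.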

  10. Comparison study of global and local approaches describing critical phenomena on the Polish stock exchange market

    NASA Astrophysics Data System (ADS)

    Czarnecki, Łukasz; Grech, Dariusz; Pamuła, Grzegorz

    2008-12-01

    We confront global and local methods to analyze the financial crash-like events on the Polish financial market from the critical phenomena point of view. These methods are based on the analysis of log-periodicity and the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), describing the largest developing financial market in Europe, is analyzed in a daily time horizon. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the corresponding power-law-divergent price model. Predictions coming from the log-periodicity scenario are verified for all main crashes that took place in WIG history. It is argued that crash predictions within the log-periodicity model strongly depend on the amount of data taken to make a fit and therefore are likely to contain huge inaccuracies. Turning to local fractal description, we calculate the so-called local (time dependent) Hurst exponent H for the WIG time series and we find a dependence between the behavior of the local fractal properties of the WIG time series and the appearance of crashes on the financial market. The latter method seems to work better than the global approach, for developing as well as for developed markets. The current situation on the market, particularly related to the Fed intervention in September’07 and the situation on the market immediately after this intervention, is also analyzed from the fractional Brownian motion point of view.
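    One common way to estimate the Hurst exponent used in such local fractal analyses is the rescaled-range (R/S) method, sketched here for a single window of synthetic data. The window sizes and the white-noise test series are illustrative; the paper's exact estimator is not specified in the abstract:

```python
import numpy as np

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the Hurst exponent for one window."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = [s for s in (8, 16, 32, 64, 128) if s <= n]
    rs = []
    for s in sizes:
        chunks = x[: (n // s) * s].reshape(-1, s)       # non-overlapping chunks
        dev = chunks - chunks.mean(axis=1, keepdims=True)
        z = dev.cumsum(axis=1)                          # cumulative deviation
        r = z.max(axis=1) - z.min(axis=1)               # range
        sd = chunks.std(axis=1)
        ok = sd > 0
        rs.append(np.mean(r[ok] / sd[ok]))              # mean rescaled range
    # Slope of log(R/S) versus log(window size) estimates H
    h, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return h

rng = np.random.default_rng(1)
h_white = hurst_rs(rng.standard_normal(4096))   # increments of Brownian motion
```

    A local (time-dependent) H, as in the paper, comes from sliding this estimator over a moving window of returns; persistent regimes show H above 0.5 and anti-persistent ones below (the small-sample R/S estimate is known to be biased slightly high).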

  11. Forest farming of shiitake mushrooms: an integrated evaluation of management practices.

    PubMed

    Bruhn, J N; Mihail, J D; Pickens, J B

    2009-12-01

    Two outdoor shiitake (Lentinula edodes) cultivation experiments, established in Missouri USA in 1999 and 2000, produced mushrooms in 2000-2005. We examined shiitake production in response to substrate species, inoculum form, inoculum strain, and inoculation timing, using total mushroom weight per log as the primary response variable with log characteristics as covariates. The significantly greater mushroom weight produced by sugar maple logs compared with white or northern red oak was attributable to the higher proportion of undiscolored wood volume in the maple logs, rather than to bark thickness or log diameter. The "wide temperature range" shiitake strain produced significantly greater yield compared with the "warm" or "cold" weather strains. Both the wide-range and warm-weather strains were stimulated to fruit by significant rain events, while the cold-weather strain was responsive to temperature. Inoculation with sawdust spawn gave significantly greater yield than colonized wooden dowels or pre-packaged "thimble" plug inoculum. The second and third full years following inoculation were the most productive.

  12. DIY Soundcard Based Temperature Logging System. Part II: Applications

    ERIC Educational Resources Information Center

    Nunn, John

    2016-01-01

    This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes. [For "DIY Soundcard Based Temperature Logging System. Part I: Design," see EJ1114124.]

  13. Logging disturbance shifts net primary productivity and its allocation in Bornean tropical forests.

    PubMed

    Riutta, Terhi; Malhi, Yadvinder; Kho, Lip Khoon; Marthews, Toby R; Huaraca Huasco, Walter; Khoo, MinSheng; Tan, Sylvester; Turner, Edgar; Reynolds, Glen; Both, Sabine; Burslem, David F R P; Teh, Yit Arn; Vairappan, Charles S; Majalap, Noreen; Ewers, Robert M

    2018-01-24

    Tropical forests play a major role in the carbon cycle of the terrestrial biosphere. Recent field studies have provided detailed descriptions of the carbon cycle of mature tropical forests, but logged or secondary forests have received much less attention. Here, we report the first measures of total net primary productivity (NPP) and its allocation along a disturbance gradient from old-growth forests to moderately and heavily logged forests in Malaysian Borneo. We measured the main NPP components (woody, fine root and canopy NPP) in old-growth (n = 6) and logged (n = 5) 1 ha forest plots. Overall, the total NPP did not differ between old-growth and logged forest (13.5 ± 0.5 and 15.7 ± 1.5 Mg C ha⁻¹ year⁻¹, respectively). However, logged forests allocated a significantly higher fraction to woody NPP at the expense of canopy NPP (42% and 48% into woody and canopy NPP, respectively, in old-growth forest vs 66% and 23% in logged forest). When controlling for local stand structure, NPP in logged forest stands was 41% higher, and woody NPP was 150% higher, than in old-growth stands with similar basal area, but this was offset by structure effects (higher gap frequency and absence of large trees in logged forest). This pattern was not driven by species turnover: the average woody NPP of all species groups within logged forest (pioneers, nonpioneers, species unique to logged plots and species shared with old-growth plots) was similar. Hence, below a threshold of very heavy disturbance, logged forests can exhibit higher NPP and higher allocation to wood; such shifts in carbon cycling persist for decades after the logging event. Given that the majority of the tropical forest biome has experienced some degree of logging, our results demonstrate that logging can cause substantial shifts in carbon production and allocation in tropical forests. © 2018 John Wiley & Sons Ltd.

  14. Well logging evaporative thermal protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamers, M.D.; Martelli, V.P.

    1981-02-03

    An evaporative thermal protection system for use in hostile environment well logging applications, the system including a downhole thermal protection cartridge disposed within a well logging sonde or tool to keep a payload such as sensors and support electronics cool, the cartridge carrying either an active evaporative system for refrigeration or a passive evaporative system, both exhausting to the surface through an armored flexible fluidic communication mechanical cable.

  15. Survival and Transfer of Murine Norovirus within a Hydroponic System during Kale and Mustard Microgreen Harvesting.

    PubMed

    Wang, Qing; Kniel, Kalmia E

    2016-01-15

    Hydroponically grown microgreens are gaining in popularity, but there is a lack of information pertaining to their microbiological safety. The potential risks associated with virus contamination of crops within a hydroponic system have not been studied to date. Here a human norovirus (huNoV) surrogate (murine norovirus [MNV]) was evaluated for its ability to become internalized from roots to edible tissues of microgreens. Subsequently, virus survival in recirculated water without adequate disinfection was assessed. Kale and mustard seeds were grown on hydroponic pads (for 7 days with harvest at days 8 to 12), edible tissues (10 g) were cut 1 cm above the pads, and corresponding pieces (4 cm by 4 cm) of pads containing only roots were collected separately. Samples were collected from a newly contaminated system (recirculated water inoculated with ∼3 log PFU/ml MNV on day 8) and from a previously contaminated system. (A contaminated system without adequate disinfection or further inoculation was used for production of another set of microgreens.) Viral titers and RNA copies were quantified by plaque assay and real-time reverse transcription (RT)-PCR. The behaviors of MNV in kale and mustard microgreens were similar (P > 0.05). MNV was detected in edible tissues and roots after 2 h postinoculation, and the levels were generally stable during the first 12 h. Relatively low levels (∼2.5 to ∼1.5 log PFU/sample of both edible tissues and roots) of infectious viruses were found with a decreasing trend over time from harvest days 8 to 12. However, the levels of viral RNA present were higher and consistently stable (∼4.0 to ∼5.5 log copies/sample). Recirculated water maintained relatively high levels of infectious MNV over the period of harvest, from 3.54 to 2.73 log PFU/ml. 
Importantly, cross-contamination occurred easily; MNV remained infectious in previously contaminated hydroponic systems for up to 12 days (2.26 to 1.00 PFU/ml), and MNV was detected in both edible tissues and roots. Here we see that viruses can be recirculated in water, even after an initial contamination event is removed, taken up through the roots of microgreens, and transferred to edible tissues. The ease of product contamination shown here reinforces the need for proper sanitation. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  16. Survival and Transfer of Murine Norovirus within a Hydroponic System during Kale and Mustard Microgreen Harvesting

    PubMed Central

    Wang, Qing

    2015-01-01

    Hydroponically grown microgreens are gaining in popularity, but there is a lack of information pertaining to their microbiological safety. The potential risks associated with virus contamination of crops within a hydroponic system have not been studied to date. Here a human norovirus (huNoV) surrogate (murine norovirus [MNV]) was evaluated for its ability to become internalized from roots to edible tissues of microgreens. Subsequently, virus survival in recirculated water without adequate disinfection was assessed. Kale and mustard seeds were grown on hydroponic pads (for 7 days with harvest at days 8 to 12), edible tissues (10 g) were cut 1 cm above the pads, and corresponding pieces (4 cm by 4 cm) of pads containing only roots were collected separately. Samples were collected from a newly contaminated system (recirculated water inoculated with ∼3 log PFU/ml MNV on day 8) and from a previously contaminated system. (A contaminated system without adequate disinfection or further inoculation was used for production of another set of microgreens.) Viral titers and RNA copies were quantified by plaque assay and real-time reverse transcription (RT)-PCR. The behaviors of MNV in kale and mustard microgreens were similar (P > 0.05). MNV was detected in edible tissues and roots after 2 h postinoculation, and the levels were generally stable during the first 12 h. Relatively low levels (∼2.5 to ∼1.5 log PFU/sample of both edible tissues and roots) of infectious viruses were found with a decreasing trend over time from harvest days 8 to 12. However, the levels of viral RNA present were higher and consistently stable (∼4.0 to ∼5.5 log copies/sample). Recirculated water maintained relatively high levels of infectious MNV over the period of harvest, from 3.54 to 2.73 log PFU/ml. 
Importantly, cross-contamination occurred easily; MNV remained infectious in previously contaminated hydroponic systems for up to 12 days (2.26 to 1.00 PFU/ml), and MNV was detected in both edible tissues and roots. Here we see that viruses can be recirculated in water, even after an initial contamination event is removed, taken up through the roots of microgreens, and transferred to edible tissues. The ease of product contamination shown here reinforces the need for proper sanitation. PMID:26567309

  17. Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US

    Treesearch

    Wenshu Lin; Jingxin Wang; Edward Thomas

    2011-01-01

    A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...

  18. 40 CFR 141.35 - Reporting for unregulated contaminant monitoring results.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... logs or construction drawings indicating that the representative well(s) is/are at a representative... after the first sampling event, as specified in Table 2 of § 141.40(a)(4)(i)(B), no notification to EPA... before and one month after the month in which the second sampling event is scheduled (i.e., it is not...

  19. McStas event logger: Definition and applications

    NASA Astrophysics Data System (ADS)

    Bergbäck Knudsen, Erik; Bryndt Klinkby, Esben; Kjær Willendrup, Peter

    2014-02-01

    Functionality is added to the McStas neutron ray-tracing code, which allows individual neutron states before and after a scattering to be temporarily stored and analysed. This logging mechanism has multiple uses, including studies of longitudinal intensity loss in neutron guides and guide coating design optimisations. Furthermore, the logging method enables the cold/thermal neutron induced gamma background along the guide to be calculated from the un-reflected neutrons, using a recently developed MCNPX-McStas interface.
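    The before/after state logging described here can be sketched in miniature. This is a hypothetical Python stand-in for the McStas mechanism, with made-up component names and reflectivities, not the toolkit's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class NeutronState:
    position: float   # distance along the guide (m)
    weight: float     # statistical weight (intensity)

@dataclass
class ScatterLog:
    """Store the particle state before and after each scattering event so
    losses can be analysed per component afterwards."""
    records: list = field(default_factory=list)

    def log(self, component, before, after):
        self.records.append((component, before, after))

    def lost_weight(self, component):
        # Un-reflected intensity at one component: weight removed per event,
        # usable e.g. as a source term for gamma-background estimates
        return sum(b.weight - a.weight
                   for c, b, a in self.records if c == component)

# Hypothetical ray: two reflections in "guide1", each with reflectivity 0.9
scatter_log = ScatterLog()
state = NeutronState(position=0.0, weight=1.0)
for pos in (2.0, 4.0):
    before = NeutronState(state.position, state.weight)
    state = NeutronState(pos, state.weight * 0.9)
    scatter_log.log("guide1", before, state)
```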

  20. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  1. An empirical method for determining average soil infiltration rates and runoff, Powder River structural basin, Wyoming

    USGS Publications Warehouse

    Rankl, James G.

    1982-01-01

    This report describes a method to estimate infiltration rates of soils for use in estimating runoff from small basins. Average rainfall intensity is plotted against storm duration on log-log paper. All rainfall events are designated as having either runoff or nonrunoff. A power-decay-type curve is visually fitted to separate the two types of rainfall events. This separation curve is an incipient-ponding curve and its equation describes infiltration parameters for a soil. For basins with more than one soil complex, only the incipient-ponding curve for the soil complex with the lowest infiltration rate can be defined using the separation technique. Incipient-ponding curves for soils with infiltration rates greater than the lowest curve are defined by ranking the soils according to their relative permeabilities and optimizing the curve position. A comparison of results for six basins produced computed total runoff for all events used ranging from 16.6 percent less to 2.3 percent more than measured total runoff. (USGS)
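    The separation-curve idea (plot average intensity against duration on log-log paper, then fit a power-decay curve between runoff and non-runoff events) can be sketched numerically. The synthetic storms, the assumed incipient-ponding curve i = 0.8·d^(-0.6), and the fixed slope all stand in for the report's visually fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)
dur = rng.uniform(0.2, 10.0, 200)                   # storm duration, hours
ponding = 0.8 * dur ** -0.6                         # assumed incipient-ponding curve
intensity = ponding * rng.uniform(0.3, 3.0, 200)    # average intensity
runoff = intensity > ponding                        # label each rainfall event

# The separation curve i = a * d**b is a straight line in log-log space.
# With the slope b read off the plot (as in the report's visual fit), place
# the intercept midway between the two classes of events.
logd, logi = np.log(dur), np.log(intensity)
b = -0.6
resid = logi - b * logd
log_a = (resid[runoff].min() + resid[~runoff].max()) / 2.0
a = np.exp(log_a)
```

    The fitted curve cleanly separates runoff from non-runoff events; for a basin with several soil complexes, only the lowest such curve is directly identifiable this way, as the report notes.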

  2. Event-based Sensing for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both daytime and nighttime (terminator) conditions without modification to the camera or optics. The event-based sensor’s ability to image stars and satellites during daytime hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor’s asynchronous output has an intrinsically low data rate. 
In addition to low-bandwidth communications requirements, their low weight, low power and high speed make them ideally suited to meeting the demanding challenges posed by space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground and space-based SSA tasks.
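    The per-pixel event generation described above (an event whenever the log intensity moves a threshold step away from the last event level) can be sketched for a single pixel. The function, sample format, and threshold value are illustrative, not taken from any specific sensor:

```python
import math

def generate_events(samples, threshold=0.2):
    """One pixel's event stream from (time, intensity) samples: emit
    (time, polarity) each time the log intensity crosses a full threshold
    step relative to the level at the previous event."""
    events = []
    ref = math.log(samples[0][1])          # log intensity at the last event
    for t, intensity in samples[1:]:
        delta = math.log(intensity) - ref
        while abs(delta) >= threshold:     # large changes emit several events
            polarity = 1 if delta > 0 else -1
            ref += polarity * threshold
            events.append((t, polarity))
            delta = math.log(intensity) - ref
    return events

# A doubling of brightness (log change ~0.69) crosses a 0.2 threshold 3 times
on_events = generate_events([(0.0, 1.0), (1.0, 2.0)])
```

    Because the trigger is a change in *log* intensity, the same fractional brightness change produces the same events in bright and dim scenes, which is the source of the per-pixel high dynamic range noted above.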

  3. Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data

    USGS Publications Warehouse

    King, K.C.

    1979-01-01

    The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well.  The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map).  Lithologic data are summarized from the sample descriptions of Smith and others (1976).  Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool.  Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
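    The sonic-log-to-synthetic-seismogram chain (interval velocities → impedance → reflection coefficients → convolution with a wavelet) can be sketched as follows. The velocities, constant density, wavelet frequency, and arrival samples are all illustrative assumptions, not the COST No. B-2 data:

```python
import numpy as np

# Hypothetical 4-layer model: interval velocities derived from a sonic log
velocity = np.array([2000.0, 2600.0, 3200.0, 2400.0])   # m/s
density = np.full_like(velocity, 2300.0)                # kg/m3, assumed constant
impedance = density * velocity

# Normal-incidence reflection coefficient at each layer interface
rc = np.diff(impedance) / (impedance[1:] + impedance[:-1])

# Place each coefficient at its two-way-time sample and convolve with a
# Ricker wavelet to form the synthetic trace (dt, frequency, and arrival
# samples are illustrative choices)
dt, freq = 0.002, 30.0
t = np.arange(-0.05, 0.05, dt)
ricker = (1 - 2 * (np.pi * freq * t) ** 2) * np.exp(-((np.pi * freq * t) ** 2))
reflectivity = np.zeros(100)
reflectivity[[20, 45, 70]] = rc
synthetic = np.convolve(reflectivity, ricker, mode="same")
```

    Peaks and troughs in `synthetic` then line up with velocity contrasts, which is what allows seismic events to be tied to lithologic changes in the well.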

  4. Cross-sectional and longitudinal study of association between circulating thiobarbituric acid-reacting substance levels and clinicobiochemical parameters in 1,178 middle-aged Japanese men - the Amagasaki Visceral Fat Study

    PubMed Central

    2011-01-01

    Background Circulating thiobarbituric acid-reacting substance (TBARS) levels, a marker of systemic oxidative stress, are predictive of cardiovascular events. However, they have not been evaluated in Japanese subjects, especially with regard to the factors that contribute to the changes in circulating TBARS levels. We investigated the cross-sectional and longitudinal relationships between circulating TBARS levels and various clinicobiochemical parameters in middle-aged men. Methods In this population-based study (The Amagasaki Visceral Fat Study), 1,178 Japanese male urban workers who had undergone health check-ups in 2006, 2007 and 2008 and were not on medications for metabolic disorders during the follow-up period, were enrolled. Serum TBARS levels were measured by the method of Yagi. The estimated visceral fat area (eVFA) by bioelectrical impedance was measured annually. After health check-ups, subjects received health education with lifestyle modification by medical personnel. Results The number of cardiovascular risk factors (hypertension, hyperglycemia, low HDL-C, hypertriglyceridemia, hyperuricemia, hyper-LDL-C and impaired renal function) increased with increasing log-eVFA (p < 0.0001) and log-TBARS (p < 0.0001). The combination of TBARS and eVFA had a multiplicative effect on risk factor accumulation (F value = 79.1, p = 0.0065). Stepwise multiple regression analysis identified log-eVFA, as well as age, log-body mass index (BMI), LDL-C, log-adiponectin, γ-glutamyl transpeptidase (γ-GTP) and uric acid as significant determinants of log-TBARS. Stepwise multiple regression analysis identified one-year changes in eVFA as well as BMI, γ-GTP and estimated glomerular filtration rate (eGFR) as significant determinants of one-year change in TBARS, and biennial changes in eVFA as well as BMI, γ-GTP and eGFR as significant determinants of biennial change in TBARS. 
Conclusions The present study showed a significant cross-sectional and longitudinal correlation between TBARS and eVFA, as well as BMI, γ-GTP and eGFR. Visceral fat reduction may be independently associated with the improvement in systemic ROS in middle-aged Japanese men. Trial Registration The Amagasaki Visceral Fat Study UMIN000002391. PMID:22108213

  5. MAIL LOG, program theory, volume 1. [Scout project automatic data system]

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.

  6. USB Storage Device Forensics for Windows 10.

    PubMed

    Arshad, Ayesha; Iqbal, Waseem; Abbas, Haider

    2018-05-01

    Significantly increased use of USB devices due to their user-friendliness and large storage capacities poses various threats for many users/companies in terms of data theft that becomes easier due to their efficient mobility. Investigations for such data theft activities would require gathering critical digital information capable of recovering digital forensics artifacts like date, time, and device information. This research gathers three sets of registry and logs data: first, before insertion; second, during insertion; and the third, after removal of a USB device. These sets are analyzed to gather evidentiary information from Registry and Windows Event log that helps in tracking a USB device. This research furthers the prior research on earlier versions of Microsoft Windows and compares it with latest Windows 10 system. Comparison of Windows 8 and Windows 10 does not show much difference except for new subkey under USB Key in registry. However, comparison of Windows 7 with latest version indicates significant variances. © 2017 American Academy of Forensic Sciences.

  7. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O'Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  8. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... suits upon such claims. (b) Logs may be retained on microfilm, microfiche or other data-storage systems... of logs, stored on data-storage systems, to full-size copies, is required of licensees if requested... converting to a data-storage system pursuant to the requirements of § 73.1800 (c) and (d), (§ 73.1800...

  9. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... suits upon such claims. (b) Logs may be retained on microfilm, microfiche or other data-storage systems... of logs, stored on data-storage systems, to full-size copies, is required of licensees if requested... converting to a data-storage system pursuant to the requirements of § 73.1800 (c) and (d), (§ 73.1800...

  10. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... suits upon such claims. (b) Logs may be retained on microfilm, microfiche or other data-storage systems... of logs, stored on data-storage systems, to full-size copies, is required of licensees if requested... converting to a data-storage system pursuant to the requirements of § 73.1800 (c) and (d), (§ 73.1800...

  11. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... suits upon such claims. (b) Logs may be retained on microfilm, microfiche or other data-storage systems... of logs, stored on data-storage systems, to full-size copies, is required of licensees if requested... converting to a data-storage system pursuant to the requirements of § 73.1800 (c) and (d), (§ 73.1800...

  12. 47 CFR 73.1840 - Retention of logs.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... suits upon such claims. (b) Logs may be retained on microfilm, microfiche or other data-storage systems... of logs, stored on data-storage systems, to full-size copies, is required of licensees if requested... converting to a data-storage system pursuant to the requirements of § 73.1800 (c) and (d), (§ 73.1800...

  13. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  14. Real-Time Multimission Event Notification System for Mars Relay

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.

    2013-01-01

    As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.
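    The push-when-online, queue-when-offline behavior described above can be sketched with a toy notifier. The class and method names are hypothetical illustrations of the pattern, not the MaROS API:

```python
from collections import defaultdict, deque

class EventNotifier:
    """Push events to logged-in users immediately; queue them for offline
    users and drain the queue when they next log in."""
    def __init__(self):
        self.online = {}                    # user -> delivery callback
        self.queues = defaultdict(deque)    # user -> pending events

    def login(self, user, callback):
        self.online[user] = callback
        while self.queues[user]:            # deliver events queued while away
            callback(self.queues[user].popleft())

    def logout(self, user):
        self.online.pop(user, None)

    def notify(self, user, event):
        if user in self.online:
            self.online[user](event)        # in-line push to the live session
        else:
            self.queues[user].append(event) # held for later viewing
```

    A real deployment would push over a persistent connection and persist the queues, but the moment-by-moment semantics are the same.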

  15. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed specifically to assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
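
    LogScope's actual specification language is not reproduced in the abstract. Purely as an illustration of the general idea, checking a log against a requirement that carries data parameters, here is a minimal sketch in which every "start" event must eventually be followed by an "end" event with the same captured parameters:

```python
import re

def check_log(log, start_pattern, end_pattern):
    """Report 'start' events that are never followed by a matching 'end'
    event with the same captured parameters (a simplified form of the
    temporal, parameterized rules a log monitor evaluates)."""
    pending = set()
    for line in log:
        m = re.match(start_pattern, line)
        if m:
            pending.add(m.groups())
            continue
        m = re.match(end_pattern, line)
        if m:
            pending.discard(m.groups())
    return pending   # unmatched starts = requirement violations
```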

  16. Global Scale Solar Disturbances

    NASA Astrophysics Data System (ADS)

    Title, A. M.; Schrijver, C. J.; DeRosa, M. L.

    2013-12-01

    The combination of the STEREO and SDO missions has allowed, for the first time, imagery of the entire Sun. This, coupled with the high cadence, broad thermal coverage, and large dynamic range of the Atmospheric Imaging Assembly on SDO, has allowed the discovery of impulsive solar disturbances that can significantly affect a hemisphere or more of the solar volume. Such events are often, but not always, associated with M- and X-class flares; GOES C- and even B-class flares are also associated with these large-scale disturbances. Key to the recognition of the large-scale disturbances was the creation of log-difference movies: by taking the log of images before differencing, events in the corona become much more evident. Because such events cover so large a portion of the solar volume, their passage can affect the dynamics of the entire corona as it adjusts to and recovers from them. In some cases this may lead to another flare or filament ejection, but in general direct causal evidence of 'sympathetic' behavior is lacking. However, evidence is accumulating that these large-scale events create an environment that encourages other solar instabilities to occur. Understanding the source of these events and how the energy that drives them is built up, stored, and suddenly released is critical to understanding the origins of space weather. Example events and comments on their relevance will be presented.
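
    The log-difference technique mentioned above is simple to sketch. Assuming the image frames arrive as NumPy arrays, a minimal version (the clipping floor is an assumption added to keep the logarithm defined):

```python
import numpy as np

def log_difference(frame_a, frame_b, floor=1.0):
    """Log-difference of two images: take log10 before differencing so that
    faint large-scale disturbances are not swamped by bright compact sources."""
    a = np.log10(np.maximum(frame_a, floor))
    b = np.log10(np.maximum(frame_b, floor))
    return b - a
```

    A tenfold brightening thus contributes the same +1 to the difference image whether it occurs in a dim or a bright region, which is why faint coronal disturbances stand out.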

  17. Design and Evaluation of Log-To-Dimension Manufacturing Systems Using System Simulation

    Treesearch

    Wenjie Lin; D. Earl Kline; Philip A. Araman; Janice K. Wiedenbeck

    1995-01-01

    In a recent study of alternative dimension manufacturing systems that produce green hardwood dimension directly from logs, it was observed that for Grade 2 and 3 red oak logs, up to 78 and 76 percent of the log scale volume could be converted into clear dimension parts. The potential high yields suggest that this processing system can be a promising technique for...

  18. Designing efficient logging systems for northern hardwoods using equipment production capabilities and costs.

    Treesearch

    R.B. Gardner

    1966-01-01

    Describes a typical logging system used in the Lake and Northeastern States, discusses each step in the operation, and presents a simple method for designing an efficient logging system for such an operation. Points out that a system should always be built around the key piece of equipment, which is usually the skidder. Specific equipment types and their production...

  19. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
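
    g-PRIME's exact detector is not specified in the abstract. As an illustrative sketch only, one common interpretation of threshold-and-window event detection is: find upward threshold crossings, then accept an event only if the peak of the excursion falls inside an amplitude window:

```python
def detect_events(signal, threshold, window):
    """One interpretation of threshold-and-window event detection:
    find upward threshold crossings, then keep an event only if the peak
    of the following excursion falls inside the amplitude window (lo, hi).
    Returns (start_index, peak_amplitude) pairs."""
    lo, hi = window
    events = []
    i, n = 0, len(signal)
    while i < n:
        if signal[i] > threshold and (i == 0 or signal[i - 1] <= threshold):
            j = i
            while j < n and signal[j] > threshold:
                j += 1                      # walk to the end of the excursion
            peak = max(signal[i:j])
            if lo <= peak <= hi:
                events.append((i, peak))    # accept: peak inside the window
            i = j
        else:
            i += 1
    return events
```

    The amplitude window rejects both sub-threshold noise and over-large artifacts (e.g., stimulus transients), which is why the combination is more selective than a bare threshold.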

  20. Global and Local Approaches Describing Critical Phenomena on the Developing and Developed Financial Markets

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz

    We define and confront global and local methods to analyze financial crash-like events on financial markets from the critical-phenomena point of view. These methods are based respectively on the analysis of log-periodicity and on the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The log-periodicity analysis is made in a daily time horizon, for the whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), connected with the largest developing financial market in Europe. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the power-law-divergent price model usually discussed in log-periodic scenarios for developed markets. Predictions coming from the log-periodicity scenario are verified for all main crashes that took place in WIG history. It is argued that crash predictions within the log-periodicity model strongly depend on the amount of data taken to make a fit and therefore are likely to contain huge inaccuracies. Next, this global analysis is confronted with the local fractal description. To do so, we provide a calculation of the so-called local (time-dependent) Hurst exponent H_loc for the WIG time series and for main US stock market indices like the DJIA and S&P 500. We point out the dependence between the behavior of the local fractal properties of financial time series and the appearance of crashes on financial markets. We conclude that the local fractal method seems to work better than the global approach, both for developing and developed markets. The very recent situation on the market, particularly related to the Fed intervention in September 2007 and the situation immediately afterwards, is also analyzed within the fractal approach. It is shown in this context how the financial market evolves through different phases of fractional Brownian motion.
Finally, the current situation on the American market is analyzed in fractal language, to show how far we still are from the end of the recession and from the beginning of a new boom on the US financial market or on other world-leading stocks.
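
    A common way to estimate a local (time-dependent) Hurst exponent is rescaled-range (R/S) analysis over a sliding window. The sketch below illustrates the idea only; the window sizes and step are arbitrary choices, not those of the paper:

```python
import numpy as np

def rescaled_range_hurst(series):
    """Estimate the Hurst exponent of a series via rescaled-range (R/S)
    analysis: the slope of log(R/S) versus log(window size)."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    log_s, log_rs = [], []
    for s in (8, 16, 32, 64, 128):
        if s > n:
            continue
        rs = []
        for start in range(0, n - s + 1, s):
            w = series[start:start + s]
            dev = np.cumsum(w - w.mean())       # cumulative deviation from mean
            r = dev.max() - dev.min()           # range R
            sd = w.std()                        # standard deviation S
            if sd > 0:
                rs.append(r / sd)
        if rs:
            log_s.append(np.log(s))
            log_rs.append(np.log(np.mean(rs)))
    slope, _ = np.polyfit(log_s, log_rs, 1)
    return slope

def local_hurst(series, window=128, step=16):
    """Time-dependent Hurst exponent: the R/S estimate over a sliding window,
    as used to track local fractal properties of a financial time series."""
    return [rescaled_range_hurst(series[i:i + window])
            for i in range(0, len(series) - window + 1, step)]
```

    Values near 0.5 indicate uncorrelated (Brownian) increments; persistent drops of the local exponent below 0.5 are the kind of signature the fractal approach associates with the vicinity of a crash.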

  1. Evaluation and performance of a newly developed patient-reported outcome instrument for diarrhea-predominant irritable bowel syndrome in a clinical study population

    PubMed Central

    Delgado-Herrera, Leticia; Lasch, Kathryn; Zeiher, Bernhardt; Lembo, Anthony J.; Drossman, Douglas A.; Banderas, Benjamin; Rosa, Kathleen; Lademacher, Christopher; Arbuckle, Rob

    2017-01-01

    Background: To evaluate the psychometric properties of the newly developed seven-item Irritable Bowel Syndrome – Diarrhea predominant (IBS-D) Daily Symptom Diary and four-item Event Log using phase II clinical trial safety and efficacy data in patients with IBS-D. This instrument measures diarrhea (stool frequency and stool consistency), abdominal pain related to IBS-D (stomach pain, abdominal pain, abdominal cramps), immediate need to have a bowel movement (immediate need and accident occurrence), bloating, pressure, gas, and incomplete evacuation. Methods: Psychometric properties and responsiveness of the instrument were evaluated in a clinical trial population [ClinicalTrials.gov identifier: NCT01494233]. Results: A total of 434 patients were included in the analyses. Significant differences were found among severity groups (p < 0.01) defined by IBS Patient Global Impression of Severity (PGI-S) and IBS Patient Global Impression of Change (PGI-C). Severity scores for each Diary and Event Log item score and five-item, four-item, and three-item summary scores were calculated. Between-group differences in changes over time were significant for all summary scores in groups stratified by changes in PGI-S (p < 0.05), two of six Diary items, and three of four Event Log items; a one-grade change in PGI-S was considered a meaningful difference with mean change scores on all Diary items −0.13 to −0.86 [standard deviation (SD) 0.79–1.39]. Similarly, for patients who reported being ‘slightly improved’ (considered a clinically meaningful difference) on the PGI-C, mean change scores on Diary items ranged from −0.45 to −1.55 (SD 0.69–1.39). All estimates of clinically important change for each item and all summary scores were small and should be considered preliminary. These results are aligned with the previous standalone psychometric study regarding reliability and validity tests. 
Conclusions: These analyses provide evidence of the psychometric properties of the IBS-D Daily Symptom Diary and Event Log in a clinical trial population. PMID:28932269

  2. Development and Demonstration of a Security Core Component

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turke, Andy

    In recent years, the convergence of a number of trends has resulted in cyber security becoming a much greater concern for electric utilities. A short list of these trends includes: · Industrial Control Systems (ICSs) have evolved from depending on proprietary hardware and operating software toward using standard off-the-shelf hardware and operating software. This has meant that these ICSs can no longer depend on “security through obscurity.” · Similarly, these same systems have evolved toward using standard communications protocols, further reducing their ability to rely upon obscurity. · The rise of the Internet and the accompanying demand for more data about virtually everything has resulted in formerly isolated ICSs becoming at least partially accessible via Internet-connected networks. · “Cyber crime” has become commonplace, whether it be for industrial espionage, reconnaissance for a possible cyber attack, theft, or because some individual or group “has something to prove.” Electric utility system operators are experts at running the power grid. The reality is, especially at small and mid-sized utilities, these SCADA operators will by default be “on the front line” if and when a cyber attack occurs against their systems. These people are not computer software, networking, or cyber security experts, so they are ill-equipped to deal with a cyber security incident. Cyber Security Manager (CSM) was conceived, designed, and built so that it can be configured to know what a utility’s SCADA/EMS/DMS system looks like under normal conditions. To do this, CSM monitors log messages from any device that uses the syslog standard. It can also monitor a variety of statistics from the computers that make up the SCADA/EMS/DMS: outputs from host-based security tools, intrusion detection systems, SCADA alarms, and real-time SCADA values, and even results from a SIEM (Security Information and Event Management) system.
When the system deviates from “normal,” CSM can alert the operator, in language that they understand, that an incident may be occurring, provide actionable intelligence, and inform them what actions to take. These alarms may be viewed on CSM’s built-in user interface, sent to a SCADA alarm list, or communicated via email, phone, pager, or SMS message. In recognition of the fact that “real world” training for cyber security events is impractical, CSM has a built-in Operator Training Simulator capability. This can be used standalone to create simulated event scenarios for training purposes. It may also be used in conjunction with the recipient’s SCADA/EMS/DMS Operator Training Simulator. In addition to providing cyber security situational awareness for electric utility operators, CSM also provides tools for analysts and support personnel; in fact, the majority of user interface displays are designed for use in analyzing current and past security events. CSM keeps security-related information in long-term storage, as well as writing any decisions it makes to a (syslog) log for use in forensic or other post-event analysis.
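
    CSM's internal logic is not described beyond "learn normal, flag deviations." Purely as an illustration of that idea applied to syslog messages, consider this minimal sketch; the template normalization is a simplification invented for the example, not CSM's method:

```python
from collections import Counter

class BaselineMonitor:
    """Learn which syslog message templates are 'normal', then flag deviations."""

    def __init__(self):
        self.baseline = Counter()

    def learn(self, messages):
        # Training phase: record message templates seen under normal conditions.
        for msg in messages:
            self.baseline[self.template(msg)] += 1

    def is_normal(self, msg):
        # Monitoring phase: a template never seen during training is an anomaly.
        return self.template(msg) in self.baseline

    @staticmethod
    def template(msg):
        # Crude normalization: strip digits so "login from 10.0.0.5" and
        # "login from 10.0.0.9" map to the same template.
        return "".join(c for c in msg if not c.isdigit())
```

    Real systems use far richer features (message rates, host statistics, SCADA values), but the train-then-compare structure is the same.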

  3. Integrating PCLIPS into ULowell's Lincoln Logs: Factory of the future

    NASA Technical Reports Server (NTRS)

    Mcgee, Brenda J.; Miller, Mark D.; Krolak, Patrick; Barr, Stanley J.

    1990-01-01

    We are attempting to show how independent but cooperating expert systems, executing within a parallel production system (PCLIPS), can operate and control a completely automated, fault tolerant prototype of a factory of the future (The Lincoln Logs Factory of the Future). The factory consists of a CAD system for designing the Lincoln Log Houses, two workcells, and a materials handling system. A workcell consists of two robots, part feeders, and a frame mounted vision system.

  4. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  5. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors or information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data. The security of Web servers can thus be enhanced and the damage of illegal access avoided. First, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Second, those patterns are provided to system administrators so they can modify their code and enhance their Web site security. The following aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that a firewall and intrusion detection system cannot find. Another is to propose an operation module for the Web site to enhance its security. In the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.

  6. Controlled Vocabularies and Ontologies for Oceanographic Data: The R2R Eventlogger Project

    NASA Astrophysics Data System (ADS)

    Coburn, E.; Maffei, A. R.; Chandler, C. L.; Raymond, L. M.

    2012-12-01

    Research vessels coordinated by the United States University-National Oceanographic Laboratory System (US-UNOLS) collect data which is considered an important oceanographic resource. The NSF-funded Rolling Deck to Repository (R2R) project aims to improve access to this data and diminish the barriers to use. One aspect of the R2R project has been to develop a shipboard scientific event logging system, Eventlogger, that incorporates best practice guidelines, controlled vocabularies, a cruise metadata schema, and a scientific event log. This will facilitate the eventual ingestion of datasets into oceanographic data repositories for subsequent integration and synthesis by investigators. One important aspect of this system is the careful use of controlled vocabularies and ontologies. Existing ontologies, where available, will be used and others will be developed. The use of internationally-informed, consensus-driven controlled vocabularies will make datasets more interoperable and discoverable. The R2R Eventlogger project is led by Woods Hole Oceanographic Institution (WHOI), and the management of the controlled vocabularies and mapping of these vocabularies to authoritative community vocabularies are led by the Data Librarian in the Marine Biological Laboratory/Woods Hole Oceanographic Institution (MBLWHOI) Library. The first target vocabulary is oceanographic instruments. Management of this vocabulary has thus far consisted of reconciling local community terms with the more widely used SeaDataNet Device Vocabulary terms. Rather than forcing adoption of the existing terms, data managers at the NSF-funded Biological and Chemical Oceanography Data Management Office (BCO-DMO) map the local terms, as given by investigators, to the existing terms, since the local terms often provide important information and meaning.
New terms (often custom or modified instruments) are submitted for review to the SeaDataNet community listserv for discussion and eventual incorporation into the Device Vocabulary. These vocabularies and their mappings are an important part of the Eventlogger system. Before a research cruise, investigators configure the instruments they intend to use for science activities. The instruments available for selection are pulled directly from the instrument vocabulary. The promotion and use of controlled vocabularies and ontologies will pave the way for linked data. By mapping local terms to agreed-upon authoritative terms, links are created whereby related datasets can be discovered and utilized. The Library is a natural home for the management of standards. Librarians have an established history of working with controlled vocabularies and metadata, and libraries serve as centers for information discovery. Eventlogger is currently being tested across the UNOLS fleet. A large submission of suggested instrument terms to the SeaDataNet community listserv is in progress. References: Maffei, Andrew R., Cynthia L. Chandler, Janet Fredericks, Nan Galbraith, Laura Stolp. Rolling Deck to Repository (R2R): A Controlled Vocabulary and Ontology Development Effort for Oceanographic Research Cruise Event Logging. EGU2011-12341. Poster presented at the 2011 EGU Meeting.

  7. A rule-based approach for the correlation of alarms to support Disaster and Emergency Management

    NASA Astrophysics Data System (ADS)

    Gloria, M.; Minei, G.; Lersi, V.; Pasquariello, D.; Monti, C.; Saitto, A.

    2009-04-01

    Key words: Simple Event Correlator, Agent Platform, Ontology, Semantic Web, Distributed Systems, Emergency Management. The importance of recognizing an emergency's typology in order to control critical situations for the security of citizens has always been acknowledged. It follows that this aspect is very important for the proper management of a hazardous event. In this work we present a solution for the recognition of an emergency's typology adopted by an Italian research project called CI6 (Centro Integrato per Servizi di Emergenza Innovativi). In our approach, CI6 receives alarms from citizens or from people involved in the response (for example, police or operators of 112). CI6 represents any alarm by a set of information, including a text that describes it, obtained when the user reports the danger, and a pair of coordinates for its location. The system analyzes the text and automatically infers the type of emergency by means of a set of parsing rules and inference rules applied by an independent module: a correlator of events based on their logs, called Simple Event Correlator (SEC). SEC, integrated into CI6's platform, is an open-source and platform-independent event correlation tool. SEC accepts input both from files and from standard input, making it flexible: it can be attached to any application that is able to write its output to a file stream. The SEC configuration is stored in text files as rules, each rule specifying an event-matching condition, an action list, and optionally a Boolean expression whose truth value decides whether the rule can be applied at a given moment. SEC can produce output events by executing user-specified shell scripts or programs, by writing messages to files, and by various other means.
SEC has been successfully applied in various domains such as network management, system monitoring, data security, intrusion detection, and log file monitoring and analysis; it has been used with or integrated into many applications, such as CiscoWorks, HP OpenView NNM and Operations, and BMC Patrol. Analysis of the text of an alarm can detect keywords that allow the particular event to be classified. The inference rules were developed through an analysis of news about real emergencies found by web searches. We have seen that a kind of emergency is often characterized by several keywords. Keywords are not uniquely associated with a specific emergency, but can be shared by different types of emergencies (e.g., the keyword "landslide" can be associated with both the emergency "landslide" and the emergency "flood"). However, the identification of two or more keywords associated with a particular type of emergency identifies, in most cases, the correct type of emergency. So, for example, if a text contains words such as "water", "flood", "overflowing", or "landslide", or other words belonging to the set of defined keywords or sharing a root with them, the system decides that this alarm belongs to a specific typology, in this case the flood typology. The system retains this information, so if a new alarm is reported that belongs to one of the typologies already identified, it proceeds to compare coordinates. The comparison between the centers of the alarms determines whether they describe an area inscribed in an ideal circle centered on the first alarm, with a radius defined by the typology mentioned above. If so, CI6 creates an emergency centered on the center of that area, with a typology equal to that of the alarms. It follows that an emergency is represented by at least two alarms. The system thus suggests to the manager (CI6's user) the possibility that several alarms may concern the same event, and classifies that event.
It is important to stress that CI6 is a decision-support system; hence this service is limited to providing advice to the user to facilitate his task, leaving him the decision to accept it or not. REFERENCES: SEC (Simple Event Correlator), http://kodu.neti.ee/~risto/sec/; M. Gloria, V. Lersi, G. Minei, D. Pasquariello, C. Monti, A. Saitto, "A Semantic WEB Services Platform to support Disaster and Emergency Management", 4th Biennial Meeting of the International Environmental Modelling and Software Society (iEMSs), 2008.
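
    The keyword-counting and coordinate-circle logic described above can be sketched as follows. The keyword table and the distance approximation are hypothetical illustrations in the spirit of the CI6 rules, not the project's actual rule set:

```python
import math

# Hypothetical keyword table; as the abstract notes, one keyword may map to
# more than one emergency type.
KEYWORDS = {
    "water": {"flood"},
    "flood": {"flood"},
    "overflowing": {"flood"},
    "landslide": {"flood", "landslide"},
    "smoke": {"fire"},
    "flames": {"fire"},
}

def classify(text):
    """Count keyword hits per emergency type; two or more hits for the same
    type is taken to identify it (as the abstract describes)."""
    hits = {}
    for word in text.lower().split():
        for etype in KEYWORDS.get(word, ()):
            hits[etype] = hits.get(etype, 0) + 1
    matches = [t for t, n in hits.items() if n >= 2]
    return max(matches, key=lambda t: hits[t]) if matches else None

def same_event(center_a, center_b, radius_km):
    """Group two alarms into one emergency if the second falls inside the
    circle centered on the first (flat-earth approximation, fine for the
    short ranges involved). Coordinates are (lat, lon) in degrees."""
    dx = (center_a[0] - center_b[0]) * 111.0   # ~km per degree of latitude
    dy = (center_a[1] - center_b[1]) * 111.0 * math.cos(math.radians(center_a[0]))
    return math.hypot(dx, dy) <= radius_km
```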

  8. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and to help system administrators improve system performance. Web logs also provide invaluable help in creating adaptive web sites and in analyzing network traffic. This paper presents the design and implementation of a Web usage mining agent for digging into web log files.
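
    The first step of any such agent is parsing the server's access log. A minimal sketch for the Common Log Format follows; the helper names are illustrative, not from the paper:

```python
import re

# Common Log Format, e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
CLF = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

def parse_log(lines):
    """Parse access-log lines into dicts, skipping malformed lines;
    the raw material of a web usage mining pipeline."""
    records = []
    for line in lines:
        m = CLF.match(line)
        if m:
            records.append(m.groupdict())
    return records

def top_paths(records, n=10):
    """Most requested paths: one simple usage pattern mined from the log."""
    counts = {}
    for r in records:
        counts[r["path"]] = counts.get(r["path"], 0) + 1
    return sorted(counts.items(), key=lambda kv: -kv[1])[:n]
```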

  9. Event-Based Processing of Neutron Scattering Data

    DOE PAGES

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...

    2015-09-16

    Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not the least of which is to filter the events by correlating them with logs of the sample environment and other ancillary equipment. This paper will describe techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final errors, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.
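
    The filtering idea, correlating each neutron event with the most recent sample-environment reading, can be sketched as follows. This is a simplified illustration of the technique, not the authors' implementation:

```python
import bisect

def filter_events(events, env_log, lo, hi):
    """Keep neutron events recorded while a sample-environment value
    (e.g. temperature) was within [lo, hi].

    events:  (time, payload) pairs.
    env_log: (time, reading) pairs, sorted by time; each event is matched
             to the most recent reading at or before its timestamp."""
    times = [t for t, _ in env_log]
    kept = []
    for t, payload in events:
        i = bisect.bisect_right(times, t) - 1   # latest log entry <= t
        if i >= 0 and lo <= env_log[i][1] <= hi:
            kept.append((t, payload))
    return kept
```

    Because the filter operates on individual events rather than pre-binned histograms, the same raw data can be re-filtered for any environment condition after the experiment, which is the flexibility the abstract highlights.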

  10. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. © 1971 Plenum Publishing Corporation.
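
    The reduction of written lithologic descriptions to numbers might be sketched as follows. The code table is a hypothetical illustration; the paper's actual coding scheme is not given in the abstract:

```python
# Hypothetical lithology code table, in the spirit of reducing a written
# alphanumeric log to a numeric log a program can query or analyze.
LITHOLOGY_CODES = {
    "shale": 1,
    "sandstone": 2,
    "limestone": 3,
    "dolomite": 4,
}

def encode_log(entries):
    """Convert (depth, description) pairs to (depth, numeric code) pairs;
    unknown lithologies get code 0 so they can be flagged for review."""
    coded = []
    for depth, description in entries:
        code = 0
        for name, number in LITHOLOGY_CODES.items():
            if name in description.lower():
                code = number
                break
        coded.append((depth, code))
    return coded
```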

  11. Log Truck-Weighing System

    NASA Technical Reports Server (NTRS)

    1977-01-01

    ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.

  12. Selective logging: does the imprint remain on tree structure and composition after 45 years?

    PubMed

    Osazuwa-Peters, Oyomoare L; Chapman, Colin A; Zanne, Amy E

    2015-01-01

    Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest. The effects of selective logging are exerted via influences on tree demography rather than functional trait composition.

  13. Selective logging: does the imprint remain on tree structure and composition after 45 years?

    PubMed Central

    Osazuwa-Peters, Oyomoare L.; Chapman, Colin A.; Zanne, Amy E.

    2015-01-01

    Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest. The effects of selective logging are exerted via influences on tree demography rather than functional trait composition. PMID:27293697

  14. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  15. Shortnose sturgeon in the Gulf of Maine: Use of spawning habitat in the Kennebec System and response to dam removal

    USGS Publications Warehouse

    Wippelhauser, Gail S.; Zydlewski, Gayle B.; Kieffer, Micah; Sulikowski, James; Kinnison, Michael T.

    2015-01-01

    Evidence has become available in this century indicating that populations of the endangered Shortnose Sturgeon Acipenser brevirostrum migrate outside their natal river systems, but the full extent and functional basis of these migrations are not well understood. Between 2007 and 2013, 40 Shortnose Sturgeon captured and tagged in four Gulf of Maine river systems migrated long distances in coastal waters to reach the Kennebec System where their movements were logged by an acoustic receiver array. Twenty-one (20%) of 104 Shortnose Sturgeon tagged in the Penobscot River, two (50%) of four tagged in the Kennebec System, one (50%) of two tagged in the Saco River, and 16 (37%) of 43 tagged in the Merrimack River moved to a previously identified spawning site or historical spawning habitat in the Kennebec System in spring. Most (65%) moved in early spring from the tagging location directly to a spawning site in the Kennebec System, whereas the rest moved primarily in the fall from the tagging location to a wintering site in that system and moved to a spawning site the following spring. Spawning was inferred from the location, behavior, and sexual status of the fish and from season, water temperature, and discharge, and was confirmed by the capture of larvae in some years. Tagged fish went to a known spawning area in the upper Kennebec Estuary (16 events) or the Androscoggin Estuary (14 events), an historical spawning habitat in the restored Kennebec River (8 events), or two spawning areas in a single year (7 events). We have provided the first evidence indicating that Shortnose Sturgeon spawn in the restored Kennebec River in an historical habitat that became accessible in 1999 when Edwards Dam was removed, 162 years after it was constructed. These results highlight the importance of the Kennebec System to Shortnose Sturgeon throughout the Gulf of Maine.

  16. Prognostic Factors for Survival in Patients with Gastric Cancer using a Random Survival Forest

    PubMed

    Adham, Davoud; Abbasgholizadeh, Nategh; Abazari, Malek

    2017-01-01

Background: Gastric cancer is the fifth most common cancer and the third leading cause of cancer-related death, with about 1 million new cases and 700,000 deaths in 2012. The aim of this investigation was to identify important factors for outcome using a random survival forest (RSF) approach. Materials and Methods: Data were collected from 128 gastric cancer patients through a historical cohort study in Hamedan, Iran, from 2007 to 2013. The event under consideration was death due to gastric cancer. The random survival forest model in R software was applied to determine the key factors affecting survival. Four split criteria were used to determine the importance of the variables in the model: log-rank, conservation of events, log-rank score, and randomization. Efficiency of the model was confirmed in terms of Harrell's concordance index. Results: The mean age at diagnosis was 63 ± 12.57 years, and the mean and median survival times were 15.2 (95%CI: 13.3, 17.0) and 12.3 (95%CI: 11.0, 13.4) months, respectively. The one-year, two-year, and three-year survival rates were 51%, 13%, and 5%, respectively. Each RSF approach showed a slightly different ranking order. Very important covariates in nearly all four RSF approaches were metastatic status, age at diagnosis, and tumor size. The error rate of each RSF approach was in the range of 0.29-0.32; the best error rate was obtained by the log-rank splitting rule, and the second, third, and fourth ranks were log-rank score, conservation of events, and the random splitting rule, respectively. Conclusion: The low survival rate of gastric cancer patients is an indication of the absence of a screening program for early diagnosis of the disease. Timely diagnosis in early phases increases survival and decreases mortality.
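The log-rank splitting rule mentioned above chooses, at each tree node, the candidate split that maximizes the two-sample log-rank statistic between the daughter nodes. A minimal sketch of that statistic (not the paper's R implementation):

```python
import numpy as np

def logrank_statistic(time, event, group):
    """Two-sample log-rank statistic, the quantity a random survival
    forest's log-rank splitting rule maximizes over candidate splits.
    `event` is 1 for a death, 0 for censoring; `group` is 0/1 for the
    two daughter nodes."""
    time, event, group = map(np.asarray, (time, event, group))
    num, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):       # distinct event times
        at_risk = time >= t
        n, n1 = at_risk.sum(), (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()  # events at t, both groups
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        num += d1 - d * n1 / n                  # observed minus expected
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / np.sqrt(var)

# Group 1 dies uniformly later than group 0, so the split separates the
# survival curves well and the statistic is large in magnitude.
t = [1, 2, 3, 4, 10, 11, 12, 13]
e = [1] * 8
g = [0, 0, 0, 0, 1, 1, 1, 1]
stat = logrank_statistic(t, e, g)
```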

  17. Variability in Accreditation Council for Graduate Medical Education Resident Case Log System practices among orthopaedic surgery residents.

    PubMed

    Salazar, Dane; Schiff, Adam; Mitchell, Erika; Hopkinson, William

    2014-02-05

    The Accreditation Council for Graduate Medical Education (ACGME) Resident Case Log System is designed to be a reflection of residents' operative volume and an objective measure of their surgical experience. All operative procedures and manipulations in the operating room, Emergency Department, and outpatient clinic are to be logged into the Resident Case Log System. Discrepancies in the log volumes between residents and residency programs often prompt scrutiny. However, it remains unclear if such disparities truly represent differences in operative experiences or if they are reflections of inconsistent logging practices. The purpose of this study was to investigate individual recording practices among orthopaedic surgery residents prior to August 1, 2011. Orthopaedic surgery residents received a questionnaire on case log practices that was distributed through the Council of Orthopaedic Residency Directors list server. Respondents were asked to respond anonymously about recording practices in different clinical settings as well as types of cases routinely logged. Hypothetical scenarios of common orthopaedic procedures were presented to investigate the differences in the Current Procedural Terminology codes utilized. Two hundred and ninety-eight orthopaedic surgery residents completed the questionnaire; 37% were fifth-year residents, 22% were fourth-year residents, 18% were third-year residents, 15% were second-year residents, and 8% were first-year residents. Fifty-six percent of respondents reported routinely logging procedures performed in the Emergency Department or urgent care setting. Twenty-two percent of participants routinely logged procedures in the clinic or outpatient setting, 20% logged joint injections, and only 13% logged casts or splints applied in the office setting. There was substantial variability in the Current Procedural Terminology codes selected for the seven clinical scenarios. 
There has been a lack of standardization in case-logging practices among orthopaedic surgery residents prior to August 1, 2011. ACGME case log data prior to this date may not be a reliable measure of residents' procedural experience.

  18. A simulation-based approach for evaluating logging residue handling systems.

    Treesearch

    B. Bruce Bare; Benjamin A. Jayne; Brian F. Anholt

    1976-01-01

    Describes a computer simulation model for evaluating logging residue handling systems. The flow of resources is traced through a prespecified combination of operations including yarding, chipping, sorting, loading, transporting, and unloading. The model was used to evaluate the feasibility of converting logging residues to chips that could be used, for example, to...

  19. Robust Spatial Autoregressive Modeling for Hardwood Log Inspection

    Treesearch

    Dongping Zhu; A.A. Beex

    1994-01-01

    We explore the application of a stochastic texture modeling method toward a machine vision system for log inspection in the forest products industry. This machine vision system uses computerized tomography (CT) imaging to locate and identify internal defects in hardwood logs. The application of CT to such industrial vision problems requires efficient and robust image...

  20. On the bad metallicity and phase diagrams of Fe1+δX (X =Te, Se, S, solid solutions): an electrical resistivity study

    NASA Astrophysics Data System (ADS)

    El Massalami, M.; Deguchi, K.; Machida, T.; Takeya, H.; Takano, Y.

    2014-12-01

Based on a systematic analysis of the thermal evolution of the resistivities of the Fe-based chalcogenides Fe1+δTe1-xXx (X = Se, S), it is inferred that their often-observed nonmetallic resistivities are related to the presence of two resistive channels: one is a high-temperature thermally-activated process while the other is a low-temperature log-in-T process. On lowering the temperature, there are often two metal-to-nonmetal crossover events: one from the high-T thermally-activated nonmetallic regime into a metal-like phase and the other from the log-in-T regime into a second metal-like phase. Based on these events, together with the magnetic and superconducting transitions, a phase diagram is constructed for each series. We discuss the origin of both processes as well as the associated crossover events. We also discuss how these resistive processes are influenced by pressure, intercalation, disorder, doping, or sample condition and, in turn, how these modifications shape the associated phase diagrams.

  1. Influence of logging on the effects of wildfire in Siberia

    NASA Astrophysics Data System (ADS)

    Kukavskaya, E. A.; Buryak, L. V.; Ivanova, G. A.; Conard, S. G.; Kalenskaya, O. P.; Zhila, S. V.; McRae, D. J.

    2013-12-01

    The Russian boreal zone supports a huge terrestrial carbon pool. Moreover, it is a tremendous reservoir of wood products concentrated mainly in Siberia. The main natural disturbance in these forests is wildfire, which modifies the carbon budget and has potentially important climate feedbacks. In addition, both legal and illegal logging increase landscape complexity and affect burning conditions and fuel consumption. We investigated 100 individual sites with different histories of logging and fire on a total of 23 study areas in three different regions of Siberia to evaluate the impacts of fire and logging on fuel loads, carbon emissions, and tree regeneration in pine and larch forests. We found large variations of fire and logging effects among regions depending on growing conditions and type of logging activity. Logged areas in the Angara region had the highest surface and ground fuel loads (up to 135 t ha-1), mainly due to logging debris. This resulted in high carbon emissions where fires occurred on logged sites (up to 41 tC ha-1). The Shushenskoe/Minusinsk and Zabaikal regions are characterized by better slash removal and a smaller amount of carbon emitted to the atmosphere during fires. Illegal logging, which is widespread in the Zabaikal region, resulted in an increase in fire hazard and higher carbon emissions than legal logging. The highest fuel loads (on average 108 t ha-1) and carbon emissions (18-28 tC ha-1) in the Zabaikal region are on repeatedly burned unlogged sites where trees fell on the ground following the first fire event. Partial logging in the Shushenskoe/Minusinsk region has insufficient impact on stand density, tree mortality, and other forest conditions to substantially increase fire hazard or affect carbon stocks. Repeated fires on logged sites resulted in insufficient tree regeneration and transformation of forest to grasslands. 
We conclude that negative impacts of fire and logging on air quality, the carbon cycle, and ecosystem sustainability could be decreased by better slash removal in the Angara region, removal of trees killed by fire in the Zabaikal region, and tree planting after fires in drier conditions where natural regeneration is hampered by soil overheating and grass proliferation.

  2. Role of Large Clinical Datasets From Physiologic Monitors in Improving the Safety of Clinical Alarm Systems and Methodological Considerations: A Case From Philips Monitors.

    PubMed

    Sowan, Azizeh Khaled; Reed, Charles Calhoun; Staggers, Nancy

    2016-09-30

Large datasets of the audit log of modern physiologic monitoring devices have rarely been used for predictive modeling, capturing unsafe practices, or guiding initiatives on alarm systems safety. This paper (1) describes a large clinical dataset using the audit log of the physiologic monitors, (2) discusses benefits and challenges of using the audit log in identifying the most important alarm signals and improving the safety of clinical alarm systems, and (3) provides suggestions for presenting alarm data and improving the audit log of the physiologic monitors. At a 20-bed transplant cardiac intensive care unit, alarm data recorded via the audit log of bedside monitors were retrieved from the server of the central station monitor. Benefits of the audit log are many. They include easily retrievable data at no cost, complete alarm records, easy capture of inconsistent and unsafe practices, and easy identification of bedside monitors missed during unit-wide adjustments to alarm settings. Challenges in analyzing the audit log relate to the time-consuming processes of data cleaning and analysis, and to the limited storage and retrieval capabilities of the monitors. The audit log is a function of the current capabilities of the physiologic monitoring systems, the monitors' configuration, and clinicians' alarm management practices. Despite current challenges in data retrieval and analysis, large digitized clinical datasets hold great promise in performance, safety, and quality improvement. Vendors, clinicians, researchers, and professional organizations should work closely to identify the most useful format and type of clinical data to expand medical devices' log capacity.
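As an illustration of the kind of analysis such an audit log enables, the sketch below tallies alarm signals per bed from comma-separated records. The log-line format shown is an assumption for the example; real central-station exports are vendor-specific.

```python
from collections import Counter

# Hypothetical audit-log lines (timestamp, bed, signal, severity); the
# fields are assumptions for illustration, not a real vendor format.
raw_log = """\
2016-03-01 08:12:03,BED07,SPO2 LOW,crisis
2016-03-01 08:12:41,BED07,SPO2 LOW,crisis
2016-03-01 08:13:10,BED03,HR HIGH,warning
2016-03-01 08:14:55,BED07,LEADS OFF,technical
"""

def alarm_counts(text):
    """Tally alarm signals per bed from comma-separated audit records,
    the kind of summary used to identify the most frequent alarms."""
    counts = Counter()
    for line in text.strip().splitlines():
        timestamp, bed, signal, severity = line.split(",")
        counts[(bed, signal)] += 1
    return counts

counts = alarm_counts(raw_log)
```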

  3. Peak Source Power Associated with Positive Narrow Bipolar Lightning Pulses

    NASA Astrophysics Data System (ADS)

    Bandara, S. A.; Marshall, T. C.; Karunarathne, S.; Karunarathne, N. D.; Siedlecki, R. D., II; Stolzenburg, M.

    2017-12-01

During the summer of 2016, we deployed a lightning sensor array in and around Oxford, Mississippi, USA. The array system comprised seven lightning sensing stations in a network approximately covering an area of 30 km × 30 km. Each station is equipped with four sensors: a Fast antenna (10 ms decay time), a Slow antenna (1.0 s decay time), a field-derivative sensor (dE/dt), and a Log-RF antenna (bandwidth 187-192 MHz). We have observed 319 Positive NBPs and herein we report on comparisons of the NBP properties measured from the Fast antenna data with the Log-RF antenna data. These properties include 10-90% rise time, full width at half maximum, zero cross time, and range-normalized amplitude at 100 km. NBPs were categorized according to the fine structure of the electric field wave shapes into Types A-D, as in Karunarathne et al. [2015]. The source powers of NBPs in each category were determined using single station Log-RF data. Furthermore, we also categorized the NBPs in three other groups: initial event of an IC flash, isolated, and not-isolated (according to their spatiotemporal relationship with other lightning activity). We compared the source powers within each category. Karunarathne, S., T. C. Marshall, M. Stolzenburg, and N. Karunarathna (2015), Observations of positive narrow bipolar pulses, J. Geophys. Res. Atmos., 120, doi:10.1002/2015JD023150.

  4. Integrating Multiple Subsurface Exploration Technologies in Slope Hydrogeologic Investigation: A Case Study in Taiwan

    NASA Astrophysics Data System (ADS)

    Lo, H.-C.; Hsu, S.-M.; Jeng, D.-I.; Ku, C.-Y.

    2009-04-01

Taiwan is an island located at a tectonically active collision zone between the Eurasian Plate and the Pacific Plate. The island is also in the subtropical climate region, with frequent typhoon events that are accompanied by intense rainfall within a short period of time. These seismic and climatic elements frequently trigger, directly or indirectly, natural disasters such as landslides, with casualties and property damage. Prompted by the need to minimize the detrimental effects of such natural disasters, the Taiwan government has initiated and funded a series of investigations and studies aimed at better understanding their causes, which may lead to the formulation of more effective disaster contingency plans and possibly a forecast system. The hydrogeology of a landslide site can help unveil the detention condition of storm water entering the aquifer system of the slope as well as its groundwater condition, which, in turn, plays a critical role in slope stability. In this study, a hydrogeologic investigation employing a series of subsurface exploration technologies was conducted at an active landslide site in the vicinity of Hwa Yuan Village in northern Taiwan. The site, which covers an area of approximately 0.14 km2 (35 acres) and generally ranges between 25 and 36 degrees in slope, was initially investigated with ground resistivity image profiling (RIP) and electrical logging in order to determine the lithology and possibly the water-bearing capacity of the geologic units beneath the slope surface. Subsequently, both acoustic and optical borehole logging were applied to identify potentially significant fracture features at depth and their hydrogeologic implications. In addition, flowmeter logging and hydraulic packer tests were conducted to further characterize the hydrogeologic system of the site and quantitatively determine the hydraulic properties of major hydrogeologic units. 
According to the ground resistivity profiles combined with rock core data, the geologic units can be primarily categorized into colluvium and weathered rock at depths of 4-23 m and 23-80 m, respectively. An approximately 20 m shear zone at depths of 45-65 m was found based on the detection outcome of low electrical resistance. Also, according to the borehole electrical logging, the layer of sandstone was identified in the interval of 48-59 m and 68.5-74 m and showed low water-bearing capacity. In addition, the electrical logging identified the layer of shale was in the interval of 59-68.5 m, which possessed a high water-bearing capacity. The velocity profile along the borehole was obtained from the flowmeter logging. A relatively high velocity zone (1.36~2.23 m/min) was measured in the interval of sandstone and relatively low velocity zone (0.12~0.78 m/min) was measured in the interval of shale, which is similar to those found in electrical logging. Moreover, 198 discontinuity planes were identified from the borehole image logging. The orientations of all discontinuities were calculated and compiled to draw a stereographic projection diagram. Judging from the discontinuity clusters on the stereographic projection diagram, a plane failure may possibly occur based on Hoek and Brown's criteria. This is a good demonstration that slope failure geometry and type can be determined by stereographic projection diagram analysis. The borehole images also clearly showed the structures of discontinuities at depth. They not only helped to characterize the results of the above investigation technologies but also provided useful indication in selecting specific geologic intervals for packer tests. The packer tests were conducted and the intervals were isolated based on the results of borehole and flowmeter logging. 
They indicated that the hydraulic conductivities of the shale and sandstone intervals are respectively 1.37×10-8 m/sec and 2.68×10-5 to 3.76×10-5 m/sec, which are in good accordance with the hydraulic characteristics inferred by flowmeter logging. The aforementioned investigation results, including the geology units and water-bearing capacity categorized by RIP and electrical logging, velocity and hydraulic conductivity obtained from flowmeter logging and packer test, and discontinuity structures recorded by borehole image logging, were used to clarify the complexity of the subsurface environment and to establish the hydrogeologic conceptual model of the landslide site.

  5. Percutaneous mitral valve repair with the MitraClip system according to the predicted risk by the logistic EuroSCORE: preliminary results from the German Transcatheter Mitral Valve Interventions (TRAMI) Registry.

    PubMed

    Wiebe, Jens; Franke, Jennifer; Lubos, Edith; Boekstegers, Peter; Schillinger, Wolfgang; Ouarrak, Taoufik; May, Andreas E; Eggebrecht, Holger; Kuck, Karl-Heinz; Baldus, Stephan; Senges, Jochen; Sievert, Horst

    2014-10-01

To evaluate in-hospital and short-term outcomes of percutaneous mitral valve repair according to patients' logistic EuroSCORE (logEuroSCORE) in a multicenter registry. The logEuroSCORE is an established tool to predict the risk of mortality during cardiac surgery. In high-risk patients, percutaneous mitral valve repair with the MitraClip system represents a less-invasive alternative. Data from 1002 patients who underwent percutaneous mitral valve repair with the MitraClip system were analyzed in the German Transcatheter Mitral Valve Interventions (TRAMI) Registry. A logEuroSCORE (mortality risk in %) ≥ 20 was considered high risk. Of all patients, 557 (55.6%) had a logEuroSCORE ≥ 20. Implantation of the MitraClip was successful in 95.5% (942/986) of patients. Moderate residual mitral valve regurgitation was detected more often in patients with a logEuroSCORE ≥ 20 (23.8% vs. 17.1%, P < 0.05). In patients with a logEuroSCORE ≥ 20, the procedural complication rate was 8.9% (vs. 6.4%, n.s.) and the in-hospital MACCE rate 4.9% (vs. 1.4%, P < 0.01). The in-hospital mortality rate in patients with a logEuroSCORE ≥ 20 and a logEuroSCORE < 20 was 4.3% and 1.1%, respectively (P ≤ 0.01). Conclusion: Percutaneous mitral valve repair with the MitraClip system is feasible in patients with a logEuroSCORE ≥ 20, with procedural results similar to those in patients with lower predicted risk. Although mortality was four times higher than in patients with a logEuroSCORE < 20, mortality in high-risk patients was lower than predicted. In those with a logEuroSCORE ≥ 20, moderate residual mitral valve regurgitation was more frequent. © 2014 Wiley Periodicals, Inc.
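The logEuroSCORE stratification above rests on a logistic risk model: predicted mortality is exp(x)/(1 + exp(x)), where x is an intercept plus a weighted sum of patient risk factors. A sketch with illustrative coefficients (not the published logistic EuroSCORE weights):

```python
import math

def logistic_risk(intercept, coefs, covariates):
    """Predicted mortality from a logistic risk model:
    risk = exp(x) / (1 + exp(x)), with x = intercept + sum(beta_i * x_i).
    Coefficients here are illustrative, not the published weights."""
    x = intercept + sum(b * v for b, v in zip(coefs, covariates))
    return math.exp(x) / (1 + math.exp(x))

# Hypothetical high-risk patient (age, LV dysfunction flag, redo-surgery
# flag); a predicted risk >= 0.20 would fall in the registry's
# logEuroSCORE >= 20 stratum.
risk = logistic_risk(-4.8, [0.07, 1.0, 0.9], [78, 1, 1])
high_risk = risk >= 0.20
```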

  6. A depth video sensor-based life-logging human activity recognition system for elderly care in smart indoor environments.

    PubMed

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-07-02

Recent advancements in depth video sensor technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors which produce depth or distance information. In this paper, a depth-based life logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced which are further used for activity recognition and generating their life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital.
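The per-activity Hidden Markov Models mentioned above are used at recognition time to decode which activity best explains an observed feature sequence. A minimal Viterbi decoder for a discrete-emission HMM (a simplification of the paper's system, with toy parameters):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete-emission HMM, the
    decoding step an HMM-based activity recognizer relies on.
    pi: initial state probs, A: transition matrix, B[state, symbol]:
    emission probs."""
    logp = np.log(pi) + np.log(B[:, obs[0]])   # log-probs avoid underflow
    back = []
    for o in obs[1:]:
        scores = logp[:, None] + np.log(A)     # scores[i, j]: state i -> j
        back.append(scores.argmax(axis=0))     # best predecessor per state
        logp = scores.max(axis=0) + np.log(B[:, o])
    path = [int(logp.argmax())]
    for bp in reversed(back):                  # trace the best path back
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Two toy activities with sticky transitions and distinctive emissions;
# the observation stream switches symbols midway.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.8, 0.2], [0.2, 0.8]])
states = viterbi([0, 0, 0, 1, 1, 1], pi, A, B)
```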

  7. A Depth Video Sensor-Based Life-Logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments

    PubMed Central

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-01-01

Recent advancements in depth video sensor technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors which produce depth or distance information. In this paper, a depth-based life logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced which are further used for activity recognition and generating their life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital. PMID:24991942

  8. Log-Log Convexity of Type-Token Growth in Zipf's Systems

    NASA Astrophysics Data System (ADS)

    Font-Clos, Francesc; Corral, Álvaro

    2015-06-01

    It is traditionally assumed that Zipf's law implies the power-law growth of the number of different elements with the total number of elements in a system—the so-called Heaps' law. We show that a careful definition of Zipf's law leads to the violation of Heaps' law in random systems, with growth curves that have a convex shape in log-log scale. These curves fulfill universal data collapse that only depends on the value of Zipf's exponent. We observe that real books behave very much in the same way as random systems, despite the presence of burstiness in word occurrence. We advance an explanation for this unexpected correspondence.
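The type-token growth curves discussed above can be reproduced in a few lines: draw tokens from a Zipf-like rank-frequency distribution and track how many distinct types have appeared. A sketch (the vocabulary size and exponent are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def type_token_curve(n_tokens, vocab=50_000, alpha=1.2):
    """Draw tokens from a Zipf-like rank-frequency distribution
    (p_r proportional to r**-alpha) and record the number of distinct
    types seen after each power-of-two number of tokens."""
    ranks = np.arange(1, vocab + 1)
    p = ranks ** -alpha
    p /= p.sum()
    tokens = rng.choice(ranks, size=n_tokens, p=p)
    seen, curve = set(), []
    for i, tok in enumerate(tokens, 1):
        seen.add(int(tok))
        if i & (i - 1) == 0:                # i is a power of two
            curve.append((i, len(seen)))    # (tokens so far, types so far)
    return curve

curve = type_token_curve(4096)
```

Plotting the curve on log-log axes exposes the convex shape the paper reports, in contrast to the straight line Heaps' law would predict.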

  9. Performance analysis of MIMO wireless optical communication system with Q-ary PPM over correlated log-normal fading channel

    NASA Astrophysics Data System (ADS)

    Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua

    2018-06-01

The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over a spatially correlated log-normal fading channel is analyzed in terms of un-coded bit error rate and ergodic channel capacity. The analysis is based on Wilkinson's method, which approximates the distribution of a sum of correlated log-normal random variables by a single log-normal random variable. The analytical and simulation results corroborate that increasing the correlation coefficients among sub-channels degrades system performance. Moreover, receiver diversity is more effective at resisting the channel fading caused by spatial correlation.

  10. 1987 Nuclear Science Symposium, 34th, and 1987 Symposium on Nuclear Power Systems, 19th, San Francisco, CA, Oct. 21-23, 1987, Proceedings

    NASA Astrophysics Data System (ADS)

    Armantrout, Guy A.

    1988-02-01

The present conference considers topics in radiation detectors, advanced electronic circuits, data acquisition systems, radiation detector systems, high-energy and nuclear physics radiation detection, spaceborne instrumentation, health physics and environmental radiation detection, nuclear medicine, nuclear well logging, and nuclear reactor instrumentation. Attention is given to the response of scintillators to heavy ions, phonon-mediated particle detection, ballistic deficits in pulse-shaping amplifiers, fast analog ICs for particle physics, logic cell arrays, the CERN host interface, high performance data buses, a novel scintillating glass for high-energy physics applications, background events in microchannel plates, a tritium accelerator mass spectrometer, a novel positron tomograph, advancements in PET, cylindrical positron tomography, nuclear techniques in subsurface geology, REE borehole neutron activation, and a continuous tritium monitor for aqueous process streams.

  11. Macroscopic response to microscopic intrinsic noise in three-dimensional Fisher fronts.

    PubMed

    Nesic, S; Cuerno, R; Moro, E

    2014-10-31

    We study the dynamics of three-dimensional Fisher fronts in the presence of density fluctuations. To this end we simulate the Fisher equation subject to stochastic internal noise, and study how the front moves and roughens as a function of the number of particles in the system, N. Our results suggest that the macroscopic behavior of the system is driven by the microscopic dynamics at its leading edge where number fluctuations are dominated by rare events. Contrary to naive expectations, the strength of front fluctuations decays extremely slowly as 1/logN, inducing large-scale fluctuations which we find belong to the one-dimensional Kardar-Parisi-Zhang universality class of kinetically rough interfaces. Hence, we find that there is no weak-noise regime for Fisher fronts, even for realistic numbers of particles in macroscopic systems.

  12. Ubiquitous Learning Project Using Life-Logging Technology in Japan

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Hou, Bin; Li, Mengmeng; Uosaki, Noriko; Mouri, Kosuke; Liu, Songran

    2014-01-01

    A Ubiquitous Learning Log (ULL) is defined as a digital record of what a learner has learned in daily life using ubiquitous computing technologies. In this paper, a project which developed a system called SCROLL (System for Capturing and Reusing Of Learning Log) is presented. The aim of developing SCROLL is to help learners record, organize,…

  13. Healthcare Blockchain System Using Smart Contracts for Secure Automated Remote Patient Monitoring.

    PubMed

    Griggs, Kristen N; Ossipova, Olya; Kohlios, Christopher P; Baccarini, Alessandro N; Howson, Emily A; Hayajneh, Thaier

    2018-06-06

    As Internet of Things (IoT) devices and other remote patient monitoring systems increase in popularity, security concerns about the transfer and logging of data transactions arise. In order to handle the protected health information (PHI) generated by these devices, we propose utilizing blockchain-based smart contracts to facilitate secure analysis and management of medical sensors. Using a private blockchain based on the Ethereum protocol, we created a system where the sensors communicate with a smart device that calls smart contracts and writes records of all events on the blockchain. This smart contract system would support real-time patient monitoring and medical interventions by sending notifications to patients and medical professionals, while also maintaining a secure record of who has initiated these activities. This would resolve many security vulnerabilities associated with remote patient monitoring and automate the delivery of notifications to all involved parties in a HIPAA compliant manner.
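The tamper-evidence that writing event records to a blockchain provides can be illustrated with a hash-chained log: each record commits to the hash of its predecessor, so altering any earlier record invalidates every later link. This sketch shows only that chaining idea, not the paper's Ethereum smart-contract system:

```python
import hashlib
import json

def append_event(chain, event):
    """Append a monitoring event to a hash-chained log; each record
    stores the hash of its predecessor, so later edits break the chain."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev, "event": event}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every link; return False if any record was altered."""
    prev = "0" * 64
    for rec in chain:
        body = {"prev": rec["prev"], "event": rec["event"]}
        good = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != good:
            return False
        prev = rec["hash"]
    return True

# Log two hypothetical sensor events, then tamper with the first one.
log = []
append_event(log, {"sensor": "hr", "bpm": 148, "alert": True})
append_event(log, {"sensor": "hr", "bpm": 96, "alert": False})
ok_before = verify(log)
log[0]["event"]["bpm"] = 60   # retroactive edit
ok_after = verify(log)
```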

  14. MapMySmoke: feasibility of a new quit cigarette smoking mobile phone application using integrated geo-positioning technology, and motivational messaging within a primary care setting.

    PubMed

    Schick, Robert S; Kelsey, Thomas W; Marston, John; Samson, Kay; Humphris, Gerald W

    2018-01-01

Approximately 11,000 people die in Scotland each year as a result of smoking-related causes. Quitting smoking is relatively easy; maintaining a quit attempt is a very difficult task, with success rates for unaided quit attempts stubbornly remaining in the single digits. Pharmaceutical treatment can improve these rates by lowering the overall reward factor of nicotine. However, these and related nicotine replacement therapies do not operate on, or address, the spatial and contextual aspects of smoking behaviour. With the ubiquity of smartphones that can log spatial, quantitative and qualitative data related to smoking behaviour, there exists a person-centred clinical opportunity to support smokers attempting to quit by first understanding their smoking behaviour and subsequently sending them dynamic messages to encourage health behaviour change within a situational context. We have built a smartphone app, MapMySmoke, that works on Android and iOS platforms. The deployment of this app within a clinical National Health Service (NHS) setting has two distinct phases: (1) a 2-week logging phase where pre-quit patients log all of their smoking and craving events; and (2) a post-quit phase where users receive dynamic support messages and can continue to log craving events, and should they occur, relapse events. Following the initial logging phase, patients consult with their general practitioner (GP) or healthcare provider to review their smoking patterns and to outline a precise, individualised quit attempt plan. Our feasibility study consists of assessment of an initial app version during and after use by eight patients recruited from an NHS Fife GP practice. In addition to evaluation of the app as a potential smoking cessation aid, we have assessed the user experience, technological requirements and security of the data flow. In an initial feasibility study, we have deployed the app for a small number of patients within one GP practice in NHS Fife. 
We recruited eight patients within one surgery, four of whom actively logged information about their smoking behaviour. Initial feedback was very positive, and users indicated a willingness to log their craving and smoking events. In addition, two out of three patients who completed follow-up interviews noted that the app helped them reduce the number of cigarettes they smoked per day, while the third indicated that it had helped them quit. The study highlighted the use of push notifications as a potential technology for maintaining quit attempts, and the security of data collection was audited. These initial results influenced the design of a planned second, larger study comprising 100 patients, the primary objectives of which are to use statistical modelling to identify times and places of probable switches into smoking states, and to target these times with dynamic health behaviour messaging. While the health benefits of quitting smoking are unequivocal, such behaviour change is very difficult to achieve. Many factors are likely to contribute to maintaining smoking behaviour, yet the precise role of cues derived from the spatial environment remains unclear. The rise of smartphones, therefore, allows clinicians the opportunity to better understand the spatial aspects of smoking behaviour and affords them the opportunity to push targeted individualised health support messages at vulnerable times and places. ClinicalTrial.gov, NCT02932917.

  15. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Treesearch

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  16. Financial and Economic Analysis of Reduced Impact Logging

    Treesearch

    Tom Holmes

    2016-01-01

    Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...

  17. Influence of Averaging Method on the Evaluation of a Coastal Ocean Color Event on the U.S. Northeast Coast

    NASA Technical Reports Server (NTRS)

    Acker, James G.; Uz, Stephanie Schollaert; Shen, Suhung; Leptoukh, Gregory G.

    2010-01-01

    Application of appropriate spatial averaging techniques is crucial to correct evaluation of ocean color radiometric data, due to the common log-normal or mixed log-normal distribution of these data. Averaging method is particularly crucial for data acquired in coastal regions. The effect of averaging method was markedly demonstrated for a precipitation-driven event on the U.S. Northeast coast in October-November 2005, which resulted in export of high concentrations of riverine colored dissolved organic matter (CDOM) to New York and New Jersey coastal waters over a period of several days. Use of the arithmetic mean averaging method created an inaccurate representation of the magnitude of this event in SeaWiFS global mapped chl a data, causing it to be visualized as a very large chl a anomaly. The apparent chl a anomaly was enhanced by the known incomplete discrimination of CDOM and phytoplankton chlorophyll in SeaWiFS data; other data sources enable an improved characterization. Analysis using the geometric mean averaging method did not indicate this event to be statistically anomalous. Our results demonstrate the necessity of providing the geometric mean averaging method for ocean color radiometric data in the Goddard Earth Sciences DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni).
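    The averaging issue described above can be sketched numerically. The values below are illustrative, not SeaWiFS data; for log-normally distributed fields, the geometric mean resists the outlier-driven inflation that makes an arithmetic mean read as a large anomaly:

    ```python
    import math

    # Hypothetical chlorophyll-a values (mg/m^3) for a patch of coastal ocean:
    # mostly background water plus two event-affected pixels with inflated
    # values, mimicking the mixed log-normal distributions described above.
    chl = [0.5, 0.6, 0.4, 0.5, 0.7, 0.5, 12.0, 15.0]

    # Arithmetic mean: heavily pulled upward by the two high pixels.
    arithmetic_mean = sum(chl) / len(chl)

    # Geometric mean: exponentiate the mean of the logs, the appropriate
    # central tendency for log-normally distributed ocean color data.
    geometric_mean = math.exp(sum(math.log(c) for c in chl) / len(chl))
    ```

    With these illustrative numbers the arithmetic mean is roughly three times the geometric mean, which is the kind of apparent anomaly the abstract attributes to averaging method.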

  18. Sonic Kayaks: Environmental monitoring and experimental music by citizens.

    PubMed

    Griffiths, Amber G F; Kemp, Kirsty M; Matthews, Kaffe; Garrett, Joanne K; Griffiths, David J

    2017-11-01

    The Sonic Kayak is a musical instrument used to investigate nature and developed during open hacklab events. The kayaks are rigged with underwater environmental sensors, which allow paddlers to hear real-time water temperature sonifications and underwater sounds, generating live music from the marine world. Sensor data is also logged every second with location, time and date, which allows for fine-scale mapping of water temperatures and underwater noise that was previously unattainable using standard research equipment. The system can be used as a citizen science data collection device, research equipment for professional scientists, or a sound art installation in its own right.

  19. Sonic Kayaks: Environmental monitoring and experimental music by citizens

    PubMed Central

    Kemp, Kirsty M.; Matthews, Kaffe; Garrett, Joanne K.; Griffiths, David J.

    2017-01-01

    The Sonic Kayak is a musical instrument used to investigate nature and developed during open hacklab events. The kayaks are rigged with underwater environmental sensors, which allow paddlers to hear real-time water temperature sonifications and underwater sounds, generating live music from the marine world. Sensor data is also logged every second with location, time and date, which allows for fine-scale mapping of water temperatures and underwater noise that was previously unattainable using standard research equipment. The system can be used as a citizen science data collection device, research equipment for professional scientists, or a sound art installation in its own right. PMID:29190283

  20. Gulf of Mexico Gas Hydrate Joint Industry Project Leg II logging-while-drilling data acquisition and analysis

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.

    2012-01-01

    One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from the use of electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells, to the routine use of wireline and advanced logging-while-drilling tools to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, including downhole logs obtained from all seven wells drilled during this expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.
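    Resistivity-based saturation estimates of the kind compiled in such reports are commonly made with an Archie-type relation. The sketch below is a generic quick-look form, not the report's actual workflow, and the saturation exponent and resistivity values are illustrative assumptions:

    ```python
    def hydrate_saturation(rt_measured, r0_water, n=2.0):
        """Quick-look gas hydrate saturation from resistivity (Archie form).

        rt_measured : formation resistivity from the LWD log (ohm-m)
        r0_water    : resistivity the interval would show if fully water
                      saturated (ohm-m), e.g. from a nearby water-bearing sand
        n           : saturation exponent (n=2.0 here is only illustrative)

        Water saturation Sw = (r0/rt)**(1/n); hydrate is assumed to fill the
        remaining pore space, so S_h = 1 - Sw.
        """
        sw = (r0_water / rt_measured) ** (1.0 / n)
        return 1.0 - sw

    # A hydrate-bearing sand reading 40 ohm-m where a water-saturated sand
    # would read 10 ohm-m implies Sw = 0.5, i.e. ~50% hydrate saturation.
    sh = hydrate_saturation(rt_measured=40.0, r0_water=10.0)
    ```

    This isotropic form matches the sand-reservoir case the abstract describes; fractured (anisotropic) systems need the more advanced models it mentions.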

  1. Interpreting the power spectrum of Dansgaard-Oeschger events via stochastic dynamical systems

    NASA Astrophysics Data System (ADS)

    Mitsui, Takahito; Lenoir, Guillaume; Crucifix, Michel

    2017-04-01

    Dansgaard-Oeschger (DO) events are abrupt climate shifts, which are particularly pronounced in the North Atlantic region during glacial periods [Dansgaard et al. 1993]. The signals are most clearly found in δ18O or log [Ca2+] records of Greenland ice cores. The power spectrum S(f) of DO events has attracted attention over two decades, with debates on the apparent 1.5-kyr periodicity [Grootes & Stuiver 1997; Schultz et al. 2002; Ditlevsen et al. 2007] and on the scaling property over several time scales [Schmitt, Lovejoy, & Schertzer 1995; Rypdal & Rypdal 2016]. The scaling property is written most simply as S(f) ~ f^-β, β ≈ 1.4. However, the dynamics underlying the periodicity and the scaling property are still not clear. Pioneering works on modelling the spectrum of DO events were done by Cessi (1994) and Ditlevsen (1999), but their model-data comparisons of the spectra are rather qualitative. Here, we show that simple stochastic dynamical systems can generate power spectra statistically consistent with the observed spectra over a wide range of frequencies, from the orbital to the Nyquist frequency (= 1/40 yr^-1). We characterize the scaling property of the spectrum by defining a local scaling exponent β_loc. For the NGRIP log [Ca2+] record, the local scaling exponent β_loc increases from ~1 to ~2 as the frequency increases from ~1/5000 yr^-1 to ~1/500 yr^-1, and β_loc decreases toward zero as the frequency increases from ~1/500 yr^-1 to the Nyquist frequency. For the δ18O record, the local scaling exponent β_loc increases from ~1 to ~1.5 as the frequency increases from ~1/5000 yr^-1 to ~1/1000 yr^-1, and β_loc decreases toward zero as the frequency increases from ~1/1000 yr^-1 to the Nyquist frequency. This systematic breaking of a single scaling is reproduced by the simple stochastic models.
In particular, the models suggest that the flattening of the spectra starting at multi-centennial scales and ending at the Nyquist frequency results from both non-dynamical (or non-system) noise and the 20-yr binning of the ice core records. The modelling part of this research is partially based on the following work: Takahito Mitsui and Michel Crucifix, Influence of external forcings on abrupt millennial-scale climate changes: a statistical modelling study, Climate Dynamics (first online). doi:10.1007/s00382-016-3235-z
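The local scaling exponent β_loc used above can be estimated as the negative slope of log S(f) against log f within a frequency band. A minimal sketch (the function name and the synthetic spectrum are our own, not the authors' code):

```python
import numpy as np

def local_scaling_exponent(freq, power, f_lo, f_hi):
    """Estimate beta_loc over the band [f_lo, f_hi] as the negative
    least-squares slope of log10 S(f) versus log10 f, so that a pure
    power law S(f) ~ f**(-beta) returns beta exactly."""
    band = (freq >= f_lo) & (freq <= f_hi)
    slope, _intercept = np.polyfit(np.log10(freq[band]),
                                   np.log10(power[band]), 1)
    return -slope

# Sanity check on a synthetic f^-1.4 spectrum over an illustrative band.
f = np.logspace(-4.0, -1.0, 200)   # frequencies in yr^-1
S = f ** -1.4
beta = local_scaling_exponent(f, S, 1e-4, 1e-2)
```

Sliding the band [f_lo, f_hi] across frequency gives the frequency-dependent β_loc profiles the abstract reports for the log [Ca2+] and δ18O records.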

  2. Therapy reduction in patients with Down syndrome and myeloid leukemia: the international ML-DS 2006 trial.

    PubMed

    Uffmann, Madita; Rasche, Mareike; Zimmermann, Martin; von Neuhoff, Christine; Creutzig, Ursula; Dworzak, Michael; Scheffers, Lenie; Hasle, Henrik; Zwaan, C Michel; Reinhardt, Dirk; Klusmann, Jan-Henning

    2017-06-22

    Children with myeloid leukemia associated with Down syndrome (ML-DS) have superior outcome compared with non-DS patients, but suffer from higher constitutional cytotoxic drug susceptibility. We analyzed the outcome of 170 pediatric patients with ML-DS enrolled in the prospective, multicenter, open-label, nonrandomized ML-DS 2006 trial by the Nordic Society for Pediatric Hematology and Oncology (NOPHO), Dutch Childhood Oncology Group (DCOG), and Acute Myeloid Leukemia-Berlin-Frankfurt-Münster (AML-BFM) study group. Compared with the historical control arm (reduced-intensity protocol for ML-DS patients from the AML-BFM 98 trial), treatment intensity was reduced by lowering the cumulative dose of etoposide (950 to 450 mg/m^2) and intrathecal central nervous system prophylaxis while omitting maintenance therapy. Still, 5-year overall survival (89% ± 3% vs 90% ± 4%; P log-rank = .64), event-free survival (EFS; 87% ± 3% vs 89% ± 4%; P log-rank = .71), and cumulative incidence of relapse/nonresponse (CIR/NR; 6% ± 3% vs 6% ± 2%; P Gray = .03) did not significantly differ between the ML-DS 2006 trial and the historical control arm. Poor early treatment response (5-year EFS, 58% ± 16% vs 88% ± 3%; P log-rank = .0008) and gain of chromosome 8 (CIR/NR, 16% ± 7% vs 3% ± 2%, P Gray = .02; 5-year EFS, 73% ± 8% vs 91% ± 4%, P log-rank = .018) were identified as independent prognostic factors predicting a worse EFS. Five of 7 relapsed patients (71%) with cytogenetic data had trisomy 8. Our study reveals prognostic markers for children with ML-DS and illustrates that reducing therapy did not impair excellent outcome. The trial was registered at EudraCT as #2007-006219-2. © 2017 by The American Society of Hematology.

  3. Comparison between moving and stationary transmitter systems in induction logging

    NASA Astrophysics Data System (ADS)

    Poddar, M.; Caleb Dhanasekaran, P.; Prabhakar Rao, K.

    1985-09-01

    In a general treatment of the theory of induction logging, an exact integral representation has been obtained for the mutual impedance between a vertical dipole transmitter and a coaxial dipole receiver in a three layered earth. Based on this representation, a computer model has been devised using the traditional Slingram system of induction logging and the comparatively new Turam system, ignoring borehole effects. The model results indicate that due to its much larger response, the Turam system is in general preferable to the Slingram in mineral and groundwater investigations where formation conductivity much less than 1 S/m is generally encountered. However, if the surrounding media are conductive (more than 0.1 S/m), the Turam system suffers from large amplitude attenuation and phase rotation of the primary field caused by the conductive surrounding, and is less useful than the Slingram system which does not so suffer, unless the target bed is shallow. Because it is a more complex function of system parameters than the corresponding Slingram log, a Turam log can be conveniently interpreted only by the modern inverse method using a fast algorithm for the forward solution and a high speed digital computer.

  4. Association of Thoracic Aorta Calcium Score With Left Ventricular Hypertrophy and Clinical Outcomes in Patients With Severe Aortic Stenosis After Aortic Valve Replacement.

    PubMed

    Cho, In-Jeong; Chang, Hyuk-Jae; Heo, Ran; Kim, In-Cheol; Sung, Ji Min; Chang, Byung-Chul; Shim, Chi Young; Hong, Geu-Ru; Chung, Namsik

    2017-01-01

    Substantial aortic calcification is known to be associated with aortic stiffening and subsequent left ventricular (LV) hypertrophy. This study examined whether the thoracic aorta calcium score (TACS) is related to LV hypertrophy and whether it leads to an adverse prognosis in patients with severe aortic stenosis (AS) after aortic valve replacement (AVR). We retrospectively reviewed 47 patients (mean age, 64 ± 11 years) with isolated severe AS who underwent noncontrast computed tomography of the entire thoracic aorta and who received AVR. TACS was quantified using the volume method, with values log-transformed (log[TACS+1]). Transthoracic echocardiography was performed before and 1 year after the operation. Preoperative LV mass index (LVMI) displayed significant positive correlations with male gender (r = 0.430, p = 0.010) and log[TACS+1] (r = 0.556, p = 0.003). In multivariate linear regression analysis, only log[TACS+1] was independently associated with LVMI, even after adjusting for age, gender, transaortic mean pressure gradient, and coronary or valve calcium score. Independent determinants for postoperative LVMI included log[TACS+1] and preoperative LVMI after 1 year of follow-up echocardiography, adjusting for age, gender, indexed effective orifice area, and coronary or valve calcium score. During a median follow-up period of 54 months after AVR, there were 10 events (21%), which included 4 deaths from all causes, 3 strokes, 2 inpatient admissions for heart failure, and 1 myocardial infarction. The event-free survival rate was significantly lower for patients with TACS of 2,257 mm^3 or higher compared with those whose TACS was lower than 2,257 mm^3 (log-rank p < 0.001). High TACS was associated with increased LVMI among patients with severe AS. Further, high TACS usefully predicted less regression of LVMI and poor clinical outcomes after AVR.
TACS may serve as a useful proxy for predicting LV remodeling and adverse prognosis in patients with severe AS undergoing AVR. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  5. Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

    NASA Astrophysics Data System (ADS)

    Baltay, A.; Hanks, T. C.; Vernon, F.

    2016-12-01

    We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M ∝ (2/3) log M0, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent. For moderate events, 3 < M < 7, M and ML are coincident; for earthquakes smaller than M3, ML ∝ log M0 [Hanks and Boore, 1984]. This is a consequence of the saturation of the apparent corner frequency fc as it becomes greater than the largest observable frequency, fmax; in this regime, stress drop no longer controls ground motion. This implies that ML and M differ by a factor of 1.5 for these small events. While this idea is not new, its implications are important as more small-magnitude data are incorporated into earthquake hazard research. With a large dataset of M < 3 earthquakes recorded on the ANZA network, we demonstrate striking consequences of the difference between M and ML. ML scales as the log of peak ground motions (e.g., PGA or PGV) for these small earthquakes, which yields log PGA ∝ log M0 [Boore, 1986]. We plot nearly 15,000 records of PGA and PGV at close stations, adjusted for site conditions and for geometrical spreading to 10 km. The slope of the log of ground motion is 1.0*ML, or 1.5*M, confirming the relationship, and that fc >> fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M. Use of ML necessitates b=2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further.
This is of particular import when estimating the rate of large earthquakes when one has limited data on their recurrence, as is the case for induced earthquakes in the central US.
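The two magnitude scalings invoked above can be made concrete. The Hanks and Kanamori (1979) definition fixes M against seismic moment, while in the small-magnitude regime ML grows with the full log M0; the factor-of-1.5 difference and the resulting b-value conversion follow directly (the moment value below is illustrative):

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Hanks & Kanamori (1979): M = (2/3) * log10(M0) - 10.7, M0 in dyne-cm."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# A tenfold increase in moment raises M by 2/3 of a unit ...
m0 = 1.0e20                                              # dyne-cm (illustrative)
dM = moment_magnitude(10.0 * m0) - moment_magnitude(m0)  # = 2/3

# ... but in the small-magnitude regime, where ML ~ log10(M0), the same
# tenfold increase raises ML by a full unit: the factor-of-1.5 difference.
dML = 1.0

# b-value consequence: magnitudes measured in ML are stretched by
# dML/dM = 1.5 relative to M, so b = 1 in M maps to b = 2/3 in ML.
b_in_M = 1.0
b_in_ML = b_in_M * dM / dML                              # = 2/3
```

The same stretching is why log PGA tracks 1.0*ML but 1.5*M in the abstract's ground-motion plots.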

  6. Impact of hypertension on early outcomes and long-term survival of patients undergoing aortic repair with Stanford A dissection.

    PubMed

    Merkle, Julia; Sabashnikov, Anton; Deppe, Antje-Christin; Zeriouh, Mohamed; Eghbalzadeh, Kaveh; Weber, Carolyn; Rahmanian, Parwis; Kuhn, Elmar; Madershahian, Navid; Kroener, Axel; Choi, Yeong-Hoon; Kuhn-Régnier, Ferdinand; Liakopoulos, Oliver; Wahlers, Thorsten

    2018-04-01

    Stanford A acute aortic dissection (AAD) is a life-threatening emergency, typically occurring in hypertensive patients, requiring immediate surgical repair. The aim of this study was to evaluate early outcomes and long-term survival of hypertensive patients in comparison to normotensive patients suffering from Stanford A AAD. In our center, 240 patients with Stanford A AAD underwent aortic surgical repair from January 2006 to April 2015. After statistical and logistic regression analysis, Kaplan-Meier survival estimation was performed, with up to 9-year follow-up. The proportion of hypertensive patients suffering from Stanford A AAD was 75.4% (n=181). There were only a few statistically significant differences in terms of basic demographics, comorbidities, preoperative baseline and clinical characteristics of hypertensive patients in comparison to normotensive patients. Hypertensive patients were significantly older (p=0.008), more frequently received hemi-arch repair (p=0.028) and selective brain perfusion (p=0.001). Our study showed similar statistical results in terms of 30-day mortality (p=0.196), long-term overall cumulative survival of patients (Log-Rank p=0.506) and survival of patients free from cerebrovascular events (Log-Rank p=0.186). Furthermore, subgroup analysis for long-term survival in terms of men (Log-Rank p=0.853), women (Log-Rank p=0.227), patients under and above 65 years of age (Log-Rank p=0.188 and Log-Rank p=0.602, respectively) and patients undergoing one of the three types of aortic repair surgery showed similar results for normotensive and hypertensive patient groups. Subgroup analysis for long-term survival of patients free from cerebrovascular events for women, patients under 65 years of age and patients undergoing aortic arch repair showed significant differences between the two groups in favor of hypertensive patients.
Hypertensive patients suffering from Stanford A AAD were older, more frequently received hemi-arch replacement and were not associated with increased risk of 30-day mortality and poorer long-term survival compared to normotensive patients.

  7. Branched-chain amino acids may improve recovery from a vegetative or minimally conscious state in patients with traumatic brain injury: a pilot study.

    PubMed

    Aquilani, Roberto; Boselli, Mirella; Boschi, Federica; Viglio, Simona; Iadarola, Paolo; Dossena, Maurizia; Pastoris, Ornella; Verri, Manuela

    2008-09-01

    To investigate whether supplementation with branched-chain amino acids (BCAAs) may improve recovery of patients with a posttraumatic vegetative or minimally conscious state. Patients were randomly assigned to 15 days of intravenous BCAA supplementation (n=22; 19.6g/d) or an isonitrogenous placebo (n=19). Tertiary care rehabilitation setting. Patients (N=41; 29 men, 12 women; mean age, 49.5+/-21 y) with a posttraumatic vegetative or minimally conscious state, 47+/-24 days after the index traumatic event. Supplementation with BCAAs. Disability Rating Scale (DRS) as log(10)DRS. Fifteen days after admission to the rehabilitation department, the log(10)DRS score improved significantly only in patients who had received BCAAs (log(10)DRS score, 1.365+/-0.08 to 1.294+/-0.05; P<.001), while the log(10)DRS score in the placebo recipients remained virtually unchanged (log(10)DRS score, 1.373+/-0.03 to 1.37+/-0.03; P not significant). The difference in improvement of log(10)DRS score between the 2 groups was highly significant (P<.000). Moreover, 68.2% (n=15) of treated patients achieved a log(10)DRS point score of .477 or higher (3 as geometric mean) that allowed them to exit the vegetative or minimally conscious state. Supplemented BCAAs may improve the recovery from a vegetative or minimally conscious state in patients with posttraumatic vegetative or minimally conscious state.

  8. Microbial Removals by a Novel Biofilter Water Treatment System

    PubMed Central

    Wendt, Christopher; Ives, Rebecca; Hoyt, Anne L.; Conrad, Ken E.; Longstaff, Stephanie; Kuennen, Roy W.; Rose, Joan B.

    2015-01-01

    Two point-of-use drinking water treatment systems designed using a carbon filter and foam material as a possible alternative to traditional biosand systems were evaluated for removal of bacteria, protozoa, and viruses. Two configurations were tested: the foam material was positioned vertically around the carbon filter in the sleeve unit or horizontally in the disk unit. The filtration systems were challenged with Cryptosporidium parvum, Raoultella terrigena, and bacteriophages P22 and MS2 before and after biofilm development to determine the average log reduction (ALR) for each organism and the role of the biofilm. There was no significant difference in performance between the two designs, and both designs showed significant levels of removal (at least 4 log10 reduction in viruses, 6 log10 for protozoa, and 8 log10 for bacteria). Removal levels met or exceeded Environmental Protection Agency (EPA) standards for microbial purifiers. Exploratory test results suggested that mature biofilm formation contributed 1–2 log10 reductions. Future work is recommended to determine field viability. PMID:25758649
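    The average log reduction (ALR) metric used above is simply the mean of per-trial log10 reductions. A minimal sketch with hypothetical challenge concentrations (not the study's data):

    ```python
    import math

    def log10_reduction(influent, effluent):
        """Log reduction value for one challenge test: log10(influent/effluent).
        An 8-log reduction means only 1 organism in 10**8 passes the filter."""
        return math.log10(influent / effluent)

    # Hypothetical (influent, effluent) counts per mL for three challenge runs.
    trials = [(1.0e9, 10.0), (2.0e9, 20.0), (5.0e8, 5.0)]

    # Average log reduction across the challenge runs.
    alr = sum(log10_reduction(i, e) for i, e in trials) / len(trials)
    ```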

  9. Biotransformation in Double-Phase Systems: Physiological Responses of Pseudomonas putida DOT-T1E to a Double Phase Made of Aliphatic Alcohols and Biosynthesis of Substituted Catechols

    PubMed Central

    Rojas, Antonia; Duque, Estrella; Schmid, Andreas; Hurtado, Ana; Ramos, Juan-Luis; Segura, Ana

    2004-01-01

    Pseudomonas putida strain DOT-T1E is highly tolerant to organic solvents, with a logPow (the logarithm of the partition coefficient of a solvent in a two-phase water-octanol system) of ≥2.5. Solvent-tolerant microorganisms can be exploited to develop double-phase (organic solvent and water) biotransformation systems in which toxic substrates or products are kept in the organic phase. We tested P. putida DOT-T1E tolerance to different aliphatic alcohols with a logPow value between 2 and 4, such as decanol, nonanol, and octanol, which are potentially useful in biotransformations in double-phase systems in which compounds with a logPow around 1.5 are produced. P. putida DOT-T1E responds to aliphatic alcohols as the second phase through cis-to-trans isomerization of unsaturated cis fatty acids and through efflux of these aliphatic alcohols via a series of pumps that also extrude aromatic hydrocarbons. These defense mechanisms allow P. putida DOT-T1E to survive well in the presence of high concentrations of the aliphatic alcohols, and growth with nonanol or decanol occurred at a high rate, whereas in the presence of an octanol double phase growth was compromised. Our results support the conclusion that the logPow of aliphatic alcohols correlates with their toxic effects, as octanol (logPow = 2.9) has more negative effects on P. putida cells than 1-nonanol (logPow = 3.4) or 1-decanol (logPow = 4). A P. putida DOT-T1E derivative bearing plasmid pWW0-xylE::Km transforms m-xylene (logPow = 3.2) into 3-methylcatechol (logPow = 1.8). The amount of 3-methylcatechol produced in an aliphatic alcohol/water bioreactor was 10- to 20-fold higher than in an aqueous medium, demonstrating the usefulness of double-phase systems for this particular biotransformation. PMID:15184168

  10. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log-logistic, log-normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
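    The core numerical step in such models, integrating a normally distributed random effect out of the likelihood, is what Gauss-Hermite quadrature handles. A minimal nonadaptive sketch (not the paper's Stata implementation; the test integrand is a standard closed-form check):

    ```python
    import numpy as np

    def gauss_hermite_expectation(g, sigma, n_nodes=15):
        """Approximate E[g(b)] for b ~ N(0, sigma^2) by Gauss-Hermite quadrature.

        The substitution b = sqrt(2) * sigma * x maps the Gaussian density onto
        the Hermite weight exp(-x^2):
            E[g(b)] ~= (1/sqrt(pi)) * sum_i w_i * g(sqrt(2) * sigma * x_i)
        """
        x, w = np.polynomial.hermite.hermgauss(n_nodes)
        return float(np.sum(w * g(np.sqrt(2.0) * sigma * x)) / np.sqrt(np.pi))

    # Check against a closed form: E[exp(b)] = exp(sigma^2 / 2) for
    # b ~ N(0, sigma^2), the same kind of integral that appears when a
    # log-normal frailty multiplies a cluster's hazard.
    sigma = 0.7
    approx = gauss_hermite_expectation(np.exp, sigma)
    exact = float(np.exp(sigma ** 2 / 2.0))
    ```

    Adaptive quadrature, as used in the paper, additionally recenters and rescales the nodes around each cluster's posterior mode, which keeps the node count low when random effects are far from zero.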

  11. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington), but with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
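    The maximum-likelihood fit of a log-normal has a closed form: fit a normal to the logs of the data. The sketch below uses made-up -Dst values purely to show the mechanics, not the paper's 1957-2012 dataset, its weighting, or its bootstrap:

    ```python
    import math

    def lognormal_mle(samples):
        """Closed-form ML estimates (mu, sigma) of the normal distribution of
        log(x) underlying a log-normal sample (MLE uses 1/n, not 1/(n-1))."""
        logs = [math.log(x) for x in samples]
        n = len(logs)
        mu = sum(logs) / n
        sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
        return mu, sigma

    def exceedance_prob(threshold, mu, sigma):
        """P(X > threshold) for log-normal X, via the normal survival function."""
        z = (math.log(threshold) - mu) / sigma
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Hypothetical storm-maximum -Dst values (nT), for illustration only.
    maxima = [120.0, 150.0, 95.0, 210.0, 330.0, 105.0, 180.0, 560.0]
    mu, sigma = lognormal_mle(maxima)
    p_tail = exceedance_prob(850.0, mu, sigma)   # per-storm tail probability
    ```

    Multiplying the per-storm tail probability by the storm occurrence rate gives the per-century event rate the paper reports, and bootstrapping the fit gives its confidence intervals.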

  12. Simple display system of mechanical properties of cells and their dispersion.

    PubMed

    Shimizu, Yuji; Kihara, Takanori; Haghparast, Seyed Mohammad Ali; Yuba, Shunsuke; Miyake, Jun

    2012-01-01

    The mechanical properties of cells are unique indicators of their states and functions. However, it is difficult to appreciate the magnitudes of these mechanical properties, due to the small size of cells and the broad distribution of their mechanical properties. Here, we developed a simple virtual reality system for presenting the mechanical properties of cells and their dispersion using a haptic device and a PC. This system simulates atomic force microscopy (AFM) nanoindentation experiments on floating cells in virtual environments. An operator can virtually position the AFM spherical probe over a round cell with the haptic handle on the PC monitor and feel the force interaction. The Young's modulus of mesenchymal stem cells and HEK293 cells in the floating state was measured by AFM. The distribution of the Young's modulus of these cells was broad, and the distribution complied with a log-normal pattern. To represent the mechanical properties together with the cell variance, we used a log-normally distributed random number determined by the mode and variance values of the Young's modulus of these cells. The represented Young's modulus was determined for each touching event of the probe surface and the cell object, and the force generated by the haptic device was calculated using a Hertz model corresponding to the indentation depth and the fixed Young's modulus value. Using this system, we can feel the mechanical properties and their dispersion in each cell type in real time. This system will help us not only recognize the degrees of mechanical properties of diverse cells but also share them with others.
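    The per-contact computation described above pairs a Hertz contact model with a log-normal modulus draw. The sketch below assumes a rigid spherical probe on an elastic half-space with a Poisson ratio of 0.5; all parameter values are illustrative, not the system's actual settings:

    ```python
    import math
    import random

    def hertz_force(delta_m, youngs_pa, radius_m, poisson=0.5):
        """Hertz force for a rigid sphere indenting an elastic half-space:
            F = (4/3) * E / (1 - nu^2) * sqrt(R) * delta**(3/2)
        poisson = 0.5 (incompressible) is a common assumption for cells."""
        if delta_m <= 0.0:
            return 0.0          # probe and cell not in contact
        e_eff = youngs_pa / (1.0 - poisson ** 2)
        return (4.0 / 3.0) * e_eff * math.sqrt(radius_m) * delta_m ** 1.5

    def sample_modulus(mode_pa, sigma_log):
        """Draw a modulus from a log-normal specified by its mode: for a
        log-normal, mode = exp(mu - sigma^2), hence mu = log(mode) + sigma^2."""
        mu = math.log(mode_pa) + sigma_log ** 2
        return random.lognormvariate(mu, sigma_log)

    # One virtual touch event: the modulus is drawn once per contact, then the
    # force tracks the current indentation depth (values are illustrative).
    E_cell = sample_modulus(mode_pa=500.0, sigma_log=0.6)
    force = hertz_force(delta_m=1.0e-6, youngs_pa=E_cell, radius_m=5.0e-6)
    ```

    Redrawing the modulus on each touch is what lets the operator feel the cell-to-cell dispersion rather than a single fixed stiffness.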

  13. Simple Display System of Mechanical Properties of Cells and Their Dispersion

    PubMed Central

    Shimizu, Yuji; Kihara, Takanori; Haghparast, Seyed Mohammad Ali; Yuba, Shunsuke; Miyake, Jun

    2012-01-01

    The mechanical properties of cells are unique indicators of their states and functions. However, it is difficult to recognize the degrees of mechanical properties, owing to the small size of cells and the broad distribution of their mechanical properties. Here, we developed a simple virtual reality system for presenting the mechanical properties of cells and their dispersion using a haptic device and a PC. This system simulates atomic force microscopy (AFM) nanoindentation experiments for floating cells in virtual environments. An operator can virtually position the AFM spherical probe over a round cell with the haptic handle on the PC monitor and feel the force interaction. The Young's modulus of mesenchymal stem cells and HEK293 cells in the floating state was measured by AFM. The distribution of the Young's modulus of these cells was broad, and the distribution complied with a log-normal pattern. To represent the mechanical properties together with the cell variance, we used a log-normal distribution-dependent random number determined by the mode and variance values of the Young's modulus of these cells. The represented Young's modulus was determined for each touching event of the probe surface and the cell object, and the haptic device-generated force was calculated using a Hertz model corresponding to the indentation depth and the fixed Young's modulus value. Using this system, we can feel the mechanical properties and their dispersion in each cell type in real time. This system will help us not only recognize the degrees of mechanical properties of diverse cells but also share them with others. PMID:22479595
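
    The force-rendering scheme in records 12-13 combines a log-normal draw of the Young's modulus (fixed per touch event) with a Hertz contact model evaluated at the current indentation depth. A minimal sketch of that computation, assuming a spherical probe and an incompressible cell (Poisson's ratio 0.5); all function names and parameter values here are illustrative, not taken from the paper:

```python
import math
import random

def sample_youngs_modulus(mode_pa, sigma, rng=random):
    """Draw a Young's modulus (Pa) from a log-normal distribution.

    For a log-normal, mode = exp(mu - sigma^2), so specifying the
    distribution by its mode gives mu = ln(mode) + sigma^2.
    """
    mu = math.log(mode_pa) + sigma ** 2
    return rng.lognormvariate(mu, sigma)

def hertz_force(young_pa, depth_m, probe_radius_m, poisson=0.5):
    """Hertz contact force of a spherical probe at indentation depth d:
    F = (4/3) * E / (1 - nu^2) * sqrt(R) * d^(3/2).
    """
    if depth_m <= 0:
        return 0.0  # probe not in contact with the cell surface
    return (4.0 / 3.0) * young_pa / (1.0 - poisson ** 2) \
        * math.sqrt(probe_radius_m) * depth_m ** 1.5
```

    In the scheme described, one modulus would be drawn per touching event of the probe and the cell object, then held fixed while the force is recomputed each frame from the indentation depth.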

  14. The origins of post-starburst galaxies at z < 0.05

    NASA Astrophysics Data System (ADS)

    Pawlik, M. M.; Taj Aldeen, L.; Wild, V.; Mendez-Abreu, J.; Lahén, N.; Johansson, P. H.; Jimenez, N.; Lucas, W.; Zheng, Y.; Walcher, C. J.; Rowlands, K.

    2018-06-01

    Post-starburst galaxies can be identified via the presence of prominent Hydrogen Balmer absorption lines in their spectra. We present a comprehensive study of the origin of strong Balmer lines in a volume-limited sample of 189 galaxies with 0.01 < z < 0.05, log(M⋆/M⊙) > 9.5 and projected axial ratio b/a > 0.32. We explore their structural properties, environments, emission lines, and star formation histories, and compare them to control samples of star-forming and quiescent galaxies, and simulated galaxy mergers. Excluding contaminants, in which the strong Balmer lines are most likely caused by dust-star geometry, we find evidence for three different pathways through the post-starburst phase, with most events occurring in intermediate-density environments: (1) a significant disruptive event, such as a gas-rich major merger, causing a starburst and growth of a spheroidal component, followed by quenching of the star formation (70 per cent of post-starburst galaxies at 9.5 < log(M⋆/M⊙) < 10.5 and 60 per cent at log(M⋆/M⊙) > 10.5); (2) at 9.5 < log(M⋆/M⊙) < 10.5, stochastic star formation in blue-sequence galaxies, causing a weak burst and subsequent return to the blue sequence (30 per cent); (3) at log(M⋆/M⊙) > 10.5, cyclic evolution of quiescent galaxies which gradually move towards the high-mass end of the red sequence through weak starbursts, possibly as a result of a merger with a smaller gas-rich companion (40 per cent). Our analysis suggests that active galactic nuclei (AGNs) are 'on' for 50 per cent of the duration of the post-starburst phase, meaning that traditional samples of post-starburst galaxies with strict emission-line cuts will be at least 50 per cent incomplete due to the exclusion of narrow-line AGNs.

  15. Assessment of the Technical Training Received by Source Selection Participants in Air Force Systems Command.

    DTIC Science & Technology

    1986-09-01

    Fragmentary table of source-selection training courses (extraction residue). Recoverable entries examined in detail: Systems 200/400; Contract Administration (PPM 152); Logistics Management (LOG 224); Government Contract Law (PPM 302); QMT 345 Quantitative Technical, Cost, and Price Analysis.

  16. Quantitative Generalizations for Catchment Sediment Yield Following Plantation Logging

    NASA Astrophysics Data System (ADS)

    Bathurst, James; Iroume, Andres

    2014-05-01

    While there is a reasonably clear qualitative understanding of the impact of forest plantations on sediment yield, there is a lack of quantitative generalizations. Such generalizations would be helpful for estimating the impacts of proposed forestry operations and would aid the spread of knowledge amongst both relevant professionals and new students. This study therefore analyzed data from the literature to determine the extent to which quantitative statements can be established. The research was restricted to the impact of plantation logging on catchment sediment yield as a function of ground disturbance in the years immediately following logging, in temperate countries, and does not consider landslides consequent upon tree root decay. Twelve paired catchment studies incorporating pre- and post-logging measurements of sediment yield were identified, resulting in 43 test catchments (including 14 control catchments). Analysis yielded the following principal conclusions: 1) Logging generally provokes maximum annual sediment yields of less than a few hundred t km⁻² yr⁻¹; best management practice can reduce this below 100 t km⁻² yr⁻¹. 2) At both the annual and event scales, the sediment yield excess of a logged catchment over a control catchment is within one order of magnitude, except with severe ground disturbance. 3) There is no apparent relationship between sediment yield impact and the proportion of catchment logged. The effect depends on which part of the catchment is altered and on its connectivity to the stream network. 4) The majority of catchments delivered their maximum sediment yield in the first two years after logging. The logging impacts were classified in terms of the absolute values of specific sediment yield, the values relative to those in the control catchments for the same period, and the values relative both to the control catchment and the pre-logging period. Most studies have been for small catchments (<10 km²) and temperate regions; the impact at large catchment scales and in tropical regions requires further research.

  17. The development of a full-digital and networkable multi-media based highway information system : phase 1

    DOT National Transportation Integrated Search

    1999-07-26

    This report covers the development of a Multimedia Based Highway Information System (MMHIS). MMHIS extends the capabilities of current photo logging facilities. Photographic logging systems used by highway agencies provide engineers with information ...

  18. 12 CFR 27.4 - Inquiry/Application Log.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 1 2011-01-01 2011-01-01 false Inquiry/Application Log. 27.4 Section 27.4... SYSTEM § 27.4 Inquiry/Application Log. (a) The Comptroller, among other things, may require a bank to maintain a Fair Housing Inquiry/Application Log (“Log”), based upon, but not limited to, one or more of the...

  19. 12 CFR 27.4 - Inquiry/Application Log.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Inquiry/Application Log. 27.4 Section 27.4... SYSTEM § 27.4 Inquiry/Application Log. (a) The Comptroller, among other things, may require a bank to maintain a Fair Housing Inquiry/Application Log (“Log”), based upon, but not limited to, one or more of the...

  20. Utilization and cost of log production from animal logging operations

    Treesearch

    Suraj P. Shrestha; Bobby L. Lanford; Robert B. Rummer; Mark Dubois

    2006-01-01

    Forest harvesting with animals is a labor-intensive operation. It is expensive to use machines on smaller woodlots, which require frequent moves if mechanically logged. So, small logging systems using animals may be more cost effective. In this study, work sampling was used for five animal logging operations in Alabama to measure productive and non-productive time...

  1. Selective logging: do rates of forest turnover in stems, species composition and functional traits decrease with time since disturbance? – A 45 year perspective

    PubMed Central

    Osazuwa-Peters, Oyomoare L.; Jiménez, Iván; Oberle, Brad; Chapman, Colin A.; Zanne, Amy E.

    2015-01-01

    Selective logging, the targeted harvesting of timber trees in a single cutting cycle, is globally rising in extent and intensity. Short-term impacts of selective logging on tropical forests have been widely investigated, but long-term effects on temporal dynamics of forest structure and composition are largely unknown. Understanding these long-term dynamics will help determine whether tropical forests are resilient to selective logging and inform choices between competing demands of anthropogenic use versus conservation of tropical forests. Forest dynamics can be studied within the framework of succession theory, which predicts that temporal turnover rates should decline with time since disturbance. Here, we investigated the temporal dynamics of a tropical forest in Kibale National Park, Uganda over 45 years following selective logging. We estimated turnover rates in stems, species composition, and functional traits (wood density and diameter at breast height), using observations from four censuses in 1989, 1999, 2006, and 2013, of stems ≥ 10 cm diameter within 17 unlogged and 9 logged 200 × 10 m vegetation plots. We used null models to account for interdependencies among turnover rates in stems, species composition, and functional traits. We tested predictions that turnover rates should be higher and decrease with increasing time since the selective logging event in logged forest, but should be less temporally variable in unlogged forest. Overall, we found higher turnover rates in logged forest for all three attributes, but turnover rates did not decline through time in logged forest and were not less temporally variable in unlogged forest. These results indicate that successional models that assume recovery to pre-disturbance conditions are inadequate for predicting the effects of selective logging on the dynamics of the tropical forest in Kibale. 
Selective logging resulted in persistently higher turnover rates, which may compromise the carbon storage capacity of Kibale’s forest. Selective logging effects may also interact with effects from other global trends, potentially causing major long-term shifts in the dynamics of tropical forests. Similar studies in tropical forests elsewhere will help determine the generality of these conclusions. Ultimately, the view that selective logging is a benign approach to the management of tropical forests should be reconsidered in the light of studies of the effects of this practice on long-term forest dynamics. PMID:26339115

  2. Selective logging: do rates of forest turnover in stems, species composition and functional traits decrease with time since disturbance? - A 45 year perspective.

    PubMed

    Osazuwa-Peters, Oyomoare L; Jiménez, Iván; Oberle, Brad; Chapman, Colin A; Zanne, Amy E

    2015-12-01

    Selective logging, the targeted harvesting of timber trees in a single cutting cycle, is globally rising in extent and intensity. Short-term impacts of selective logging on tropical forests have been widely investigated, but long-term effects on temporal dynamics of forest structure and composition are largely unknown. Understanding these long-term dynamics will help determine whether tropical forests are resilient to selective logging and inform choices between competing demands of anthropogenic use versus conservation of tropical forests. Forest dynamics can be studied within the framework of succession theory, which predicts that temporal turnover rates should decline with time since disturbance. Here, we investigated the temporal dynamics of a tropical forest in Kibale National Park, Uganda over 45 years following selective logging. We estimated turnover rates in stems, species composition, and functional traits (wood density and diameter at breast height), using observations from four censuses in 1989, 1999, 2006, and 2013, of stems ≥ 10 cm diameter within 17 unlogged and 9 logged 200 × 10 m vegetation plots. We used null models to account for interdependencies among turnover rates in stems, species composition, and functional traits. We tested predictions that turnover rates should be higher and decrease with increasing time since the selective logging event in logged forest, but should be less temporally variable in unlogged forest. Overall, we found higher turnover rates in logged forest for all three attributes, but turnover rates did not decline through time in logged forest and were not less temporally variable in unlogged forest. These results indicate that successional models that assume recovery to pre-disturbance conditions are inadequate for predicting the effects of selective logging on the dynamics of the tropical forest in Kibale. 
Selective logging resulted in persistently higher turnover rates, which may compromise the carbon storage capacity of Kibale's forest. Selective logging effects may also interact with effects from other global trends, potentially causing major long-term shifts in the dynamics of tropical forests. Similar studies in tropical forests elsewhere will help determine the generality of these conclusions. Ultimately, the view that selective logging is a benign approach to the management of tropical forests should be reconsidered in the light of studies of the effects of this practice on long-term forest dynamics.
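
    The turnover rates in records 1-2 are estimated from repeat censuses of marked stems. One common convention for annualized per-capita rates, shown here as an illustrative sketch and not necessarily the authors' exact estimator, uses logarithmic mortality and recruitment rates averaged into a single stem-turnover rate:

```python
import math

def annual_mortality(n0, survivors, years):
    """Annualized per-capita mortality rate: (ln N0 - ln S) / t."""
    return (math.log(n0) - math.log(survivors)) / years

def annual_recruitment(nt, survivors, years):
    """Annualized per-capita recruitment rate: (ln Nt - ln S) / t."""
    return (math.log(nt) - math.log(survivors)) / years

# Hypothetical plot: 120 stems in 1989, of which 100 survive to 1999,
# joined by 10 new recruits (110 stems total at the second census).
m = annual_mortality(n0=120, survivors=100, years=10)    # ~0.018 per yr
r = annual_recruitment(nt=110, survivors=100, years=10)  # ~0.010 per yr
stem_turnover = 0.5 * (m + r)
```

    Applied per census interval (1989-1999, 1999-2006, 2006-2013), such rates give the time series whose trend the study tests against the successional prediction of decline.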

  3. Rolling Deck to Repository I: Designing a Database Infrastructure

    NASA Astrophysics Data System (ADS)

    Arko, R. A.; Miller, S. P.; Chandler, C. L.; Ferrini, V. L.; O'Hara, S. H.

    2008-12-01

    The NSF-supported academic research fleet collectively produces a large and diverse volume of scientific data, which are increasingly being shared across disciplines and contributed to regional and global syntheses. As both Internet connectivity and storage technology improve, it becomes practical for ships to routinely deliver data and documentation for a standard suite of underway instruments to a central shoreside repository. Routine delivery will facilitate data discovery and integration, quality assessment, cruise planning, compliance with funding agency and clearance requirements, and long-term data preservation. We are working collaboratively with ship operators and data managers to develop a prototype "data discovery system" for NSF-supported research vessels. Our goal is to establish infrastructure for a central shoreside repository, and to develop and test procedures for the routine delivery of standard data products and documentation to the repository. Related efforts are underway to identify tools and criteria for quality control of standard data products, and to develop standard interfaces and procedures for maintaining an underway event log. Development of a shoreside repository infrastructure will include: 1. Deployment and testing of a central catalog that holds cruise summaries and vessel profiles. A cruise summary will capture the essential details of a research expedition (operating institution, ports/dates, personnel, data inventory, etc.), as well as related documentation such as event logs and technical reports. A vessel profile will capture the essential details of a ship's installed instruments (manufacturer, model, serial number, reference location, etc.), with version control as the profile changes through time. The catalog's relational database schema will be based on the UNOLS Data Best Practices Committee's recommendations, and published as a formal XML specification. 2. 
Deployment and testing of a central repository that holds navigation and routine underway data. Based on discussion with ship operators and data managers at a workgroup meeting in September 2008, we anticipate that a subset of underway data could be delivered from ships to the central repository in near-realtime - enabling the integrated display of ship tracks at a public Web portal, for example - and a full data package could be delivered post-cruise by network transfer or disk shipment. Once ashore, data sets could be distributed to assembly centers such as the Shipboard Automated Meteorological and Oceanographic System (SAMOS) for routine processing, quality assessment, and synthesis efforts - as well as transmitted to national data centers such as NODC and NGDC for permanent archival. 3. Deployment and testing of a basic suite of Web services to make cruise summaries, vessel profiles, event logs, and navigation data easily available. A standard set of catalog records, maps, and navigation features will be published via the Open Archives Initiative (OAI) and Open Geospatial Consortium (OGC) protocols, which can then be harvested by partner data centers and/or embedded in client applications.
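
    Record 3 describes cruise summaries and vessel profiles as the catalog's core record types. A sketch of what such records might contain, with every field name hypothetical; the actual schema follows the UNOLS Data Best Practices recommendations and is published as an XML specification, as the abstract notes:

```python
# Hypothetical illustrations of the two record types described above.
cruise_summary = {
    "cruise_id": "EX0001",
    "operator": "Example Oceanographic Institution",
    "port_start": {"name": "Example Port A", "date": "2008-09-01"},
    "port_end": {"name": "Example Port B", "date": "2008-09-14"},
    "personnel": ["A. Example (Chief Scientist)"],
    "data_inventory": ["navigation", "meteorology", "multibeam"],
    "documents": ["event_log.txt", "technical_report.pdf"],
}

vessel_profile = {
    "vessel": "R/V Example",
    "version": 3,  # profiles are version-controlled as installations change
    "instruments": [
        {"type": "anemometer", "manufacturer": "ExampleCo", "model": "X-1",
         "serial": "1234", "reference_location": "main mast"},
    ],
}
```

    Keeping the cruise summary and the vessel profile separate, with the profile versioned, lets one cruise reference the instrument suite exactly as installed at sailing time.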

  4. Assessing the performance of multi-purpose channel management measures at increasing scales

    NASA Astrophysics Data System (ADS)

    Wilkinson, Mark; Addy, Steve

    2016-04-01

    In addition to hydroclimatic drivers, sediment deposition from high energy river systems can reduce channel conveyance capacity and lead to significant increases in flood risk. There is an increasing recognition that we need to work with the interplay of natural hydrological and morphological processes in order to attenuate flood flows and manage sediment (both coarse and fine). This typically includes both catchment (e.g. woodland planting, wetlands) and river (e.g. wood placement, floodplain reconnection) restoration approaches. The aim of this work was to assess at which scales channel management measures (notably wood placement and flood embankment removal) are most appropriate for flood and sediment management in high energy upland river systems. We present research findings from two densely instrumented research sites in Scotland which regularly experience flood events and have associated coarse sediment problems. We assessed the performance of a range of novel trial measures for three different scales: wooded flow restrictors and gully tree planting at the small scale (<1 km2), floodplain tree planting and engineered log jams at the intermediate scale (5-60 km2), and flood embankment lowering at the large scale (350 km2). Our results suggest that at the smallest scale, care is needed in the installation of flow restrictors. It was found for some restrictors that vertical erosion can occur if the tributary channel bed is disturbed. Preliminary model evidence suggested they have a very limited impact on channel discharge and flood peak delay owing to the small storage areas behind the structures. At intermediate scales, the ability to trap sediment by engineered log jams was limited. Of the 45 engineered log jams installed, around half created a small geomorphic response and only 5 captured a significant amount of coarse material (during one large flood event). As scale increases, the chance of damage or loss of wood placement is greatest. 
Monitoring highlights the importance of structure design (porosity and degree of channel blockage) and placement in zones of high sediment transport to optimise performance. At the large scale, well designed flood embankment lowering can improve connectivity to the floodplain during low to medium return period events. However, ancillary works to stabilise the bank failed thus emphasising the importance of letting natural processes readjust channel morphology and hydrological connections to the floodplain. Although these trial measures demonstrated limited effects, this may be in part owing to restrictions in the range of hydroclimatological conditions during the study period and further work is needed to assess the performance under more extreme conditions. This work will contribute to refining guidance for managing channel coarse sediment problems in the future which in turn could help mitigate flooding using natural approaches.

  5. Incidence and risk factors of thromboembolism in systemic lupus erythematosus: a comparison of three ethnic groups.

    PubMed

    Mok, Chi Chiu; Tang, Sandy Shuk Kuen; To, Chi Hung; Petri, Michelle

    2005-09-01

    To compare the incidence and risk factors for thromboembolic events in systemic lupus erythematosus (SLE) patients of different ethnic backgrounds. SLE patients who were newly diagnosed or were referred within 6 months of diagnosis between 1996 and 2002 were prospectively followed up for the occurrence of thromboembolic events. Cumulative hazard and risk factors for thromboembolism were evaluated and compared among patients of different ethnic origins. We studied 625 patients who fulfilled the American College of Rheumatology criteria for SLE (89% women): 258 Chinese, 140 African Americans, and 227 Caucasians. The mean ± SD age at SLE diagnosis was 35.7 ± 14 years. After a followup of 3,094 patient-years, 48 arterial events and 40 venous events occurred in 83 patients. The overall incidence of arterial and venous thromboembolism was 16/1,000 patient-years and 13/1,000 patient-years, respectively. The cumulative hazard of arterial events at 60 months after the diagnosis of SLE was 8.5%, 8.1%, and 5.1% for the Chinese, African Americans, and Caucasians, respectively. The corresponding cumulative risk of venous events was 3.7%, 6.6%, and 10.3%, respectively (P = 0.008 for Chinese versus Caucasians, by log rank test). Smoking, obesity, antiphospholipid antibodies, and use of antimalarial agents and exogenous estrogens were less frequent in the Chinese patients. In Cox regression models, low levels of high-density lipoprotein (HDL) cholesterol, Chinese ethnicity, oral ulcers, and serositis predicted arterial events, whereas male sex, low levels of HDL cholesterol, antiphospholipid antibodies, non-Chinese ethnicity, obesity, renal disease, and hemolytic anemia predicted venous events. There are ethnic differences in the incidence of arterial and venous thromboembolism in patients with SLE that cannot be fully explained by the clinical factors studied. Further evaluation of other genetic and immunologic factors is warranted.
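
    The headline rates in record 5 follow directly from event counts divided by accumulated person-time; a quick check of the reported arithmetic:

```python
def incidence_per_1000_py(events, patient_years):
    """Incidence rate expressed per 1,000 patient-years of follow-up."""
    return 1000.0 * events / patient_years

# 48 arterial and 40 venous events over 3,094 patient-years of follow-up.
arterial = incidence_per_1000_py(48, 3094)  # ~15.5, reported as 16/1,000
venous = incidence_per_1000_py(40, 3094)    # ~12.9, reported as 13/1,000
```

    Both values round to the figures quoted in the abstract.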

  6. Hydrocarbon potential of pre-Pennsylvanian rocks in Roosevelt County, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitt, W.D.

    The hydrocarbon potential of pre-Pennsylvanian rocks in Roosevelt County was appraised from data available in published reports, scout tickets, lithology logs, and other well data at the log libraries in Roswell and Socorro, New Mexico, and Midland, Texas. Elevations from lithology logs were used when differing from scout tickets or other sources. Thickness and data other than lithology logs were assumed to be sufficiently accurate if they fitted the control obtained by contouring. The lithology and reservoir potential of the systems of rock that subcrop beneath the Pennsylvanian System in Roosevelt County are summarized.

  7. The impact of a personal digital assistant (PDA) case log in a medical student clerkship.

    PubMed

    Ho, Kendall; Lauscher, Helen Novak; Broudo, Marc; Jarvis-Selinger, Sandra; Fraser, Joan; Hewes, Deborah; Scott, Ian

    2009-10-01

    Medical education literature emphasizes that reflection and self-audit are pivotal steps in learning and that personal digital assistants (PDAs) have potential as decision support tools. The purpose was to examine the efficacy of PDA-based resources and patient-encounter logging systems among 3rd-year medical clerks during pediatrics rotations. Students in rotations were assigned to control (using paper-based logs and references) or intervention groups (using PDA-based logs and resources). Students completed pre- and postrotation Paediatrics Competency Surveys, participated in focus groups, and were compared on year-end examination grades. Use of PDA logs far outweighed that of paper logs (1,020 PDA logs and 87 paper logs). PDA logs were ranked significantly higher in enhancing learning and reflection than paper logs (t = 2.52, p < .01). PDA logs also facilitated specific learning experiences. PDA-based patient-encounter logs appear to be effective case documentation and reflection tools. The difference in number of logs between control and intervention groups demonstrates the utility of the PDA for "point-of-care" patient logging.

  8. Optimizing Unmanned Aircraft System Scheduling

    DTIC Science & Technology

    2008-06-01

    Fragmentary Visual Basic source (extraction residue) that writes mission-scheduling data to a log file, e.g. Print #logFN, "s:" & CStr(s) followed by loops of the form For iRow = 1 To top: Print #logFN, stack(iRow) & ",": Next iRow, with mission fields read via rMissions(iRow, COL_MISSION_NAME) and rMissions(iRow, COL_MISSION_REQUIRED).

  9. Use of biopartitioning micellar chromatography and RP-HPLC for the determination of blood-brain barrier penetration of α-adrenergic/imidazoline receptor ligands, and QSPR analysis.

    PubMed

    Vucicevic, J; Popovic, M; Nikolic, K; Filipic, S; Obradovic, D; Agbaba, D

    2017-03-01

    For this study, 31 compounds, including 16 imidazoline/α-adrenergic receptor (IRs/α-ARs) ligands and 15 central nervous system (CNS) drugs, were characterized in terms of the retention factors (k) obtained using biopartitioning micellar and classical reversed-phase chromatography (log k_BMC and log k_wRP, respectively). Based on the retention factor (log k_wRP) and the slope of the linear curve (S), the isocratic parameter (φ_0) was calculated. The obtained retention factors were correlated with experimental log BB values for the group of examined compounds. High correlations were obtained between the logarithm of the biopartitioning micellar chromatography (BMC) retention factor and effective permeability (r(log k_BMC/log BB): 0.77), while for the RP-HPLC system the correlations were lower (r(log k_wRP/log BB): 0.58; r(S/log BB): -0.50; r(φ_0/P_e): 0.61). Based on the log k_BMC retention data and calculated molecular parameters of the examined compounds, quantitative structure-permeability relationship (QSPR) models were developed using partial least squares, stepwise multiple linear regression, support vector machine, and artificial neural network methodologies. A high degree of structural diversity of the analysed IRs/α-ARs ligands and CNS drugs provides a wide applicability domain of the QSPR models for estimation of blood-brain barrier penetration of related compounds.
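
    The r values in record 9 are ordinary Pearson correlation coefficients between a retention descriptor and log BB. A self-contained Pearson correlation, applied here to synthetic illustrative values rather than the study's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic illustrative values, NOT the paper's data: a roughly linear
# trend between a chromatographic retention descriptor and log BB.
log_k_bmc = [0.2, 0.5, 0.8, 1.1, 1.4, 1.7]
log_bb = [-0.9, -0.4, -0.1, 0.2, 0.3, 0.8]
r = pearson_r(log_k_bmc, log_bb)  # strong positive correlation
```

    A high r between the cheap chromatographic measurement and log BB is what justifies using log k_BMC as a surrogate descriptor in the QSPR models.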

  10. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
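
    Neither RuleR nor LogScope syntax is reproduced here, but the core idea of record 10, rules armed and discharged by events in a trace, can be sketched as a toy response-pattern checker; the rule shape and function names below are invented for illustration, not the tools' actual API:

```python
def check_trace(trace, rules):
    """Toy response checker: for each (trigger, obligation) rule, once the
    trigger event is seen, the obligation event must appear later in the
    trace; undischarged obligations are reported at end of trace."""
    pending = {}  # obligation event -> trigger that armed it
    for event in trace:
        for trigger, obligation in rules:
            if event == trigger:
                pending.setdefault(obligation, trigger)
        pending.pop(event, None)  # observing an obligation discharges it
    return [f"{trig!r} was never followed by {obl!r}"
            for obl, trig in pending.items()]

rules = [("open", "close")]
ok = check_trace(["open", "write", "open", "close"], rules)  # no violations
bad = check_trace(["open", "write"], rules)                  # one violation
```

    Real rule systems add parameterized data, temporal operators, and rule combinators on top of this basic arm-and-discharge loop, which is what makes them expressive enough to compile temporal logics into.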

  11. Big Data for Infectious Disease Surveillance and Modeling

    PubMed Central

    Bansal, Shweta; Chowell, Gerardo; Simonsen, Lone; Vespignani, Alessandro; Viboud, Cécile

    2016-01-01

    We devote a special issue of the Journal of Infectious Diseases to review the recent advances of big data in strengthening disease surveillance, monitoring medical adverse events, informing transmission models, and tracking patient sentiments and mobility. We consider a broad definition of big data for public health, one encompassing patient information gathered from high-volume electronic health records and participatory surveillance systems, as well as mining of digital traces such as social media, Internet searches, and cell-phone logs. We introduce nine independent contributions to this special issue and highlight several cross-cutting areas that require further research, including representativeness, biases, volatility, and validation, and the need for robust statistical and hypotheses-driven analyses. Overall, we are optimistic that the big-data revolution will vastly improve the granularity and timeliness of available epidemiological information, with hybrid systems augmenting rather than supplanting traditional surveillance systems, and better prospects for accurate infectious diseases models and forecasts. PMID:28830113

  12. Continuous water sampling and water analysis in estuaries

    USGS Publications Warehouse

    Schemel, L.E.; Dedini, L.A.

    1982-01-01

    Salinity, temperature, light transmission, oxygen saturation, pH, pCO2, chlorophyll a fluorescence, and the concentrations of nitrate, nitrite, dissolved silica, orthophosphate, and ammonia are continuously measured with a system designed primarily for estuarine studies. Near-surface water (2-m depth) is sampled continuously while the vessel is underway; on station, water to depths of 100 m is sampled with a submersible pump. The system is comprised of commercially available instruments, equipment, and components, and of specialized items designed and fabricated by the authors. Data are read from digital displays, analog strip-chart recorders, and a teletype printout, and can be logged in disc storage for subsequent plotting. Data records made in San Francisco Bay illustrate physical, biological, and chemical estuarine processes, such as mixing and phytoplankton net production. The system resolves large- and small-scale events, which contributes to its reliability and usefulness.

  13. Pharmacogenetic association study of warfarin safety endpoints in Puerto Ricans.

    PubMed

    Valentín, Isa I; Rivera, Giselle; Nieves-Plaza, Mariely; Cruz, Iadelisse; Renta, Jessica Y; Cadilla, Carmen L; Feliu, Juan F; Seip, Richard L; Ruaño, Gualberto; Duconge, Jorge

    2014-09-01

    This study was intended to determine the incidence rate of warfarin-related adverse events (e.g., bleeding) in Puerto Ricans and whether a genetic association between warfarin pharmacogenes and any of these adverse events was observed over the initiation period (i.e., the first 90 days of therapy). We conducted an observational, retrospective cohort study of pharmacogenetic association in 122 warfarin-treated, male, Puerto Rican patients (69.9 ± 9.6 years) from the Veterans Affairs Caribbean Healthcare System (VACHS) who consented to participate. Genotyping was performed using the CYP2C9 and VKORC1 assays by Luminex. Event-free survival curves were estimated using the Kaplan-Meier method and analyzed by log-rank test. Cox regression models were constructed and hazard ratios (HR) calculated. Carriers of functional CYP2C9 and VKORC1 polymorphisms demonstrated a higher incidence rate of multiple adverse events (i.e., 5.2 vs. 1.0 cases per 100 patient-months; RR = 4.8, p = 0.12) than did wild types. A significant association was observed between multiple adverse events and carrier status (HR = 2.5; 95% CI: 1.0-6.3, p = 0.04). However, no significant associations between genotypes and individual outcomes over the first 90 days of therapy were found. The association of CYP2C9 and VKORC1 genotypes and risks for adverse events due to exposure to warfarin was examined for the first time in Puerto Ricans. Despite a lack of association with individual events in this study population, our findings revealed a potential utility of genotyping for the prevention of multiple adverse events during warfarin therapy.
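
    Records 13-14 both rest on Kaplan-Meier survival estimates compared by log-rank tests. A minimal product-limit estimator, included as an illustrative sketch; real analyses would use a vetted statistics package:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times[i] is follow-up time; events[i] is 1 for an observed event,
    0 for censoring. Returns [(t, S(t))] at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        leaving = [e for tt, e in data if tt == t]  # subjects leaving at t
        deaths = sum(leaving)
        if deaths > 0:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= len(leaving)
        i += len(leaving)
    return curve

# Four hypothetical subjects: events at t=1, 2, 4; censoring at t=3.
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

    Censored subjects leave the risk set without stepping the curve down, which is exactly what distinguishes Kaplan-Meier from a naive event fraction.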

  14. Association between vascular calcification assessed by simple radiography and non-fatal cardiovascular events in hemodialysis patients.

    PubMed

    Petrauskiene, Vaida; Vaiciuniene, Ruta; Bumblyte, Inga Arune; Kuzminskis, Vytautas; Ziginskiene, Edita; Grazulis, Saulius; Jonaitiene, Egle

    2016-12-01

    Vascular calcification (VC) is one of the factors associated with cardiovascular mortality in hemodialysis (HD) patients. Recommendations concerning screening for VC differ. Whether VC can be prevented or reversed is central to the debate on whether screening for VC could improve the outcomes of renal patients. The objective of the study was to evaluate the significance of a simple vascular calcification score (SVCS) based on plain radiographic films and to test its association with non-fatal cardiovascular events in patients on chronic HD. The study population consisted of 95 prevalent HD patients in the HD unit of the Hospital of Lithuanian University of Health Sciences Kaunas Clinics. Clinical data and laboratory test information were collected from medical records. SVCS was evaluated as described by Adragao et al. After measurement of VC, HD patients were observed for novel non-fatal cardiovascular events. Patients were divided into two groups: SVCS≥3 (57 patients [60%]) and <3 (38 patients [40%]). The Kaplan-Meier survival curves show a significant difference in non-fatal cardiovascular events in the SVCS≥3 group vs. the <3 group (26.3% vs. 7.8%; log-rank 5.49; P=0.018). Multivariate Cox regression analysis confirmed a negative impact of VC, hyperphosphatemia, and lower ejection fraction on cardiovascular events. No statistically significant differences were observed comparing parameters of Ca-P metabolism disorders between groups with different SVCS. On separate analysis, the presence of VC in hands was also associated with a higher rate of novel cardiovascular events (score 0 group: 5 events [10.6%] vs. score ≥1 group: 13 events [27%], log-rank P=0.035). VC assessed by this simple and inexpensive radiological method was an independent predictor of novel non-fatal cardiovascular events in HD patients. Copyright © 2016 Association Société de néphrologie. Published by Elsevier SAS. All rights reserved.

  15. American Meteor Society Fireball reporting system and mobile application

    NASA Astrophysics Data System (ADS)

    Hankey, M.

    2014-07-01

    The American Meteor Society (AMS), founded in 1911, pioneered the visual study of meteors and has collected data relating to meteor observations and bright fireballs for over 100 years. In December 2010, the online fireball reporting system was upgraded to an interactive application that utilizes Google Maps and other programmatic methods to pinpoint the observer's location, azimuth and elevation values with a high degree of precision. The AMS has collected tens of thousands of witness reports relating to hundreds of events each year since the new application was released. Three-dimensional triangulation methods that average the data collected from witnesses have been developed that can determine the start and end points of the meteor with an accuracy of <50 km (when compared to published solutions provided by operators of all-sky cameras). RA and DEC radiant estimates can also be computed for all significant events reported to the AMS. With the release of the mobile application, the AMS is able to collect more precise elevation angles than through the web application. Users can file a new report directly on the phone or update the values submitted through a web report. After web users complete their fireball report online, they are prompted to download the app and update their observation with the more precise data provided by the sensors in the mobile device. The mobile app also provides an accurate means for the witness to report the elapsed time of the fireball. To log this value, the user drags the device across the sky where they saw the fireball. This process is designed to require no button click or user interaction to start and stop the time recording. A countdown initiates the process, and once the user's phone crosses the plane of azimuth for the end point of the fireball, the velocity timer automatically stops. Users are asked to log the recording three times in an effort to minimize error. The three values are then averaged into a final estimate.
    Once enough witnesses have filed reports, elapsed-time data collected from the mobile phone can be used to determine the velocity of the fireball. With the velocity, trajectory solution and RA/DEC, the AMS can plot orbital estimates for significant fireball events reported to the society. Our hope is that over time this catalog of events will reveal patterns relating to the origins of bright fireballs at certain times of year. The AMS also hopes to be able to associate fireball events reported to the society with known meteor showers when RA/DEC radiant estimates fall close enough to those of known showers. In addition to the enhanced fireball reporting application, the AMS Mobile App provides a meteor shower calendar with information, radiant maps and moon conditions for all upcoming showers. There is also a meteor observing function inside the app that enables meteor observers to log meteor observations directly on the phone and have that data uploaded to the AMS online database and associated with that user's observing profile. To record observations the user simply points the device at the part of the sky where they saw the meteor. They then drag their finger across the screen in the direction the meteor traveled. The user is then prompted to enter the magnitude of the event and associate the meteor with a known shower that is active for that date. When the user completes their session, all of the data for each meteor, along with the information relating to the session, is uploaded to the AMS website. Users can then review the data online in the AMS members' area. Data across all users can be aggregated for statistical analysis and ZHR estimates. Currently the AMS has over 10,000 registered users and Facebook followers. In 2013 over 680,000 people visited the AMS website and the society received over 18,000 witness reports relating to 713 confirmed unique fireball events.
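The averaging-and-velocity step described above can be sketched as follows; the path length and timing trials are hypothetical values, not AMS data:

```python
def average_duration(trials):
    """Average repeated elapsed-time recordings (seconds) into one estimate,
    as the app does with the user's three timing attempts."""
    return sum(trials) / len(trials)

def fireball_velocity(path_length_km, trials):
    """Velocity estimate (km/s): triangulated path length over mean duration."""
    return path_length_km / average_duration(trials)

# Hypothetical values: a 60-km triangulated path and three timing trials
print(round(fireball_velocity(60.0, [3.2, 2.9, 3.1]), 2))
```

With the velocity and the triangulated trajectory, an orbital estimate becomes possible, as the abstract notes.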

  16. Active Wireline Heave Compensation for Ocean Drilling

    NASA Astrophysics Data System (ADS)

    Goldberg, D.; Liu, T.; Swain, K.; Furman, C.; Iturrino, G. J.

    2014-12-01

    The up-and-down heave motion of a ship causes a similar motion on any instruments tethered on wireline cable below it. If the amplitude of this motion is greater than a few tens of cm, a significant discrepancy in the depth below the ship is introduced, causing uncertainty in the acquired data. Large and irregular cable motions also increase the risk of damaging tethered instruments, particularly those with relatively delicate sensors. In 2005, Schlumberger and Deep Down, Inc. built an active wireline heave compensator (AHC) system for use onboard the JOIDES Resolution to compensate for heave motion on wireline logging tools deployed in scientific drill holes. The goals for the new AHC system were to (1) design a reliable heave compensation system; and (2) devise a robust and quantitative methodology for routine assessment of compensation efficiency (CE) during wireline operations. Software programs were developed to monitor CE and the dynamics of logging tools in real time, including system performance under variable parameters such as water depth, sea state, cable length, logging speed and direction. We present the CE results from the AHC system on the JOIDES Resolution during a 5-year period of recent IODP operations and compare the results to those from previous compensation systems deployed during ODP and IODP. Based on new data under heave conditions of ±0.2-2.0 m and water depths of 300-4,800 m in open holes, the system reduces 65-80% of downhole tool displacement under stationary conditions and 50-60% during normal logging operations. Moreover, down/up tool motion at low speeds (300-600 m/h) reduces the system's CE values by 15-20%, and logging down at higher speeds (1,000-1,200 m/h) reduces CE values by 55-65%. Furthermore, the system yields slightly lower CE values of 40-50% without tension feedback of the downhole cable while logging.
These results indicate that the new system's compensation efficiency is comparable to or better than previous systems, with additional advantages that include upgradable compensation control software and the capability for continued assessment under varying environmental conditions. Future integration of downhole cable dynamics as an input feedback could further improve CE during logging operations.
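The CE figures quoted above can be read as percentage reductions in tool displacement. A minimal sketch, assuming CE is defined as the fractional reduction of RMS displacement (the exact metric used aboard the JOIDES Resolution may be defined differently):

```python
def compensation_efficiency(uncompensated_rms, compensated_rms):
    """Compensation efficiency as percent reduction in tool displacement.

    Assumed definition for illustration: CE = 100 * (1 - compensated/uncompensated).
    """
    return 100.0 * (1.0 - compensated_rms / uncompensated_rms)

# Hypothetical RMS displacements (m): 1.0 m of heave reduced to 0.25 m at the tool
print(compensation_efficiency(1.0, 0.25))  # 75.0
```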

  17. 78 FR 21396 - Notice of a Federal Advisory Committee Meeting: Manufactured Housing Consensus Committee...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... specifying the permissible scope and conduct of monitoring; and Be organized and carry out its business in a...-12 III. Review Log of Proposal: Log 1 24 CFR 3285--Alternative Foundation System Testing. Log 80 24...-fhydelarge.html ; Video explaining ASTM D6007: http://www.ntainc.com/video-fhyde.html . Log 81 24 CFR 3280...

  18. Tree damage from skyline logging in a western larch/Douglas-fir stand

    Treesearch

    Robert E. Benson; Michael J. Gonsior

    1981-01-01

    Damage to shelterwood leave trees and to understory trees in shelterwood and clearcut logging units logged with skyline yarders was measured, and related to stand conditions, harvesting specifications, and yarding system-terrain interactions. About 23 percent of the marked leave trees in the shelterwood units were killed in logging, and about 10 percent had moderate to...

  19. Gulf of Mexico Gas Hydrate Joint Industry Project Leg II logging-while-drilling data acquisition and analysis

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Wyung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.

    2012-01-01

    One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from the early use of electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells, to the present, where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log-analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, including downhole logs obtained from all seven wells drilled during this expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells, as calculated from available downhole well logs.
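As one illustration of how resistivity logs yield hydrate saturations in isotropic sand reservoirs, an Archie-type relation is commonly used; the relation and parameter defaults below are a generic sketch, not the calibrated model used for the JIP Leg II wells:

```python
def archie_hydrate_saturation(rt, rw, porosity, a=1.0, m=2.0, n=2.0):
    """Gas hydrate saturation from an Archie-type resistivity relation:
    S_w = (a * Rw / (phi**m * Rt))**(1/n), and S_h = 1 - S_w.
    The constants a, m, n here are generic illustrative defaults."""
    s_w = (a * rw / (porosity ** m * rt)) ** (1.0 / n)
    return 1.0 - s_w

# Hypothetical log readings: 20 ohm-m deep resistivity, 0.2 ohm-m
# connate-water resistivity, 40% porosity
print(round(archie_hydrate_saturation(20.0, 0.2, 0.40), 3))
```

In fractured (anisotropic) systems, as the abstract notes, such an isotropic relation is no longer sufficient.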

  20. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and that the metadata server has already handled, so that the MDS does not need to log metadata changes to nonvolatile storage in order to provide a highly available metadata service; this also improves metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backed-up requests cached by all involved clients when the metadata server crashes or otherwise becomes non-operational.
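The backup-and-replay idea above can be sketched as follows; all class and method names are hypothetical illustrations, not the paper's implementation:

```python
class MetadataServer:
    """In-memory MDS that applies metadata requests without logging them."""
    def __init__(self):
        self.namespace = {}

    def apply(self, request):
        op, path, value = request
        if op == "create":
            self.namespace[path] = value
        elif op == "remove":
            self.namespace.pop(path, None)

class ClientFS:
    """Client file system that backs up each sent request in memory."""
    def __init__(self, mds):
        self.mds = mds
        self.backup = []  # requests already handled by the MDS

    def send(self, request):
        self.mds.apply(request)
        self.backup.append(request)

    def replay(self, fresh_mds):
        # After an MDS crash, recovery replays the cached requests
        # instead of reading a log from nonvolatile storage.
        for request in self.backup:
            fresh_mds.apply(request)

mds = MetadataServer()
client = ClientFS(mds)
client.send(("create", "/a", 1))
client.send(("create", "/b", 2))
client.send(("remove", "/a", None))

recovered = MetadataServer()  # MDS restarted after a crash
client.replay(recovered)
print(recovered.namespace)    # {'/b': 2}
```

In the real system the backup set would be bounded (requests are dropped once durably reflected elsewhere) and recovery would merge replays from all clients, not just one.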

  1. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and that the metadata server has already handled, so that the MDS does not need to log metadata changes to nonvolatile storage in order to provide a highly available metadata service; this also improves metadata-processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backed-up requests cached by all involved clients when the metadata server crashes or otherwise becomes non-operational. PMID:24892093

  2. Microbial removals by a novel biofilter water treatment system.

    PubMed

    Wendt, Christopher; Ives, Rebecca; Hoyt, Anne L; Conrad, Ken E; Longstaff, Stephanie; Kuennen, Roy W; Rose, Joan B

    2015-04-01

    Two point-of-use drinking water treatment systems designed using a carbon filter and foam material as a possible alternative to traditional biosand systems were evaluated for removal of bacteria, protozoa, and viruses. Two configurations were tested: the foam material was positioned vertically around the carbon filter in the sleeve unit or horizontally in the disk unit. The filtration systems were challenged with Cryptosporidium parvum, Raoultella terrigena, and bacteriophages P22 and MS2 before and after biofilm development to determine the average log reduction (ALR) for each organism and the role of the biofilm. There was no significant difference in performance between the two designs, and both designs showed significant levels of removal (at least 4 log10 reduction in viruses, 6 log10 for protozoa, and 8 log10 for bacteria). Removal levels met or exceeded Environmental Protection Agency (EPA) standards for microbial purifiers. Exploratory test results suggested that mature biofilm formation contributed 1-2 log10 of additional reduction. Future work is recommended to determine field viability. © The American Society of Tropical Medicine and Hygiene.
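The ALR metric is the base-10 logarithm of the influent-to-effluent concentration ratio; a minimal sketch with hypothetical challenge counts:

```python
import math

def log_reduction(influent, effluent):
    """Log10 reduction value from influent and effluent organism counts."""
    return math.log10(influent / effluent)

# Hypothetical bacterial challenge: 1e8 CFU/ml in, 1 CFU/ml out -> 8 log10,
# matching the scale of the bacterial removals reported above
print(log_reduction(1e8, 1.0))
```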

  3. Integration of the stratigraphic aspects of very large sea-floor databases using information processing

    USGS Publications Warehouse

    Jenkins, Clinton N.; Flocks, J.; Kulp, M.

    2006-01-01

    Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric or, where possible, from word-based relative locational information such as 'core catcher' in reference to sampler device, and recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs including an in-built facility that better suits the requirements of sea-floor data. Finally, a suite of stratigraphic statistics are computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.

  4. Integration of QR codes into an anesthesia information management system for resident case log management.

    PubMed

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability of such a system. Resident case-logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced into an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
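The payload such a system might embed in each QR code can be sketched as below; all field names are hypothetical illustrations, since the abstract does not describe the actual syllabus schema:

```python
import json

def build_case_payload(case_id, procedure, asa_class, anesthesia_type):
    """Serialize a case summary into the JSON string a QR library would
    encode at the end of a case, and a resident's phone would decode
    into a case-log entry. Field names are hypothetical."""
    payload = {
        "case_id": case_id,
        "procedure": procedure,
        "asa_class": asa_class,
        "anesthesia": anesthesia_type,
    }
    return json.dumps(payload, sort_keys=True)

print(build_case_payload("C-1024", "laparoscopic cholecystectomy", 2, "general"))
```

Keeping the payload a plain serialized string decouples the information management system from whichever QR encoder and mobile decoder are used.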

  5. Performance of a completely automated system for monitoring CMV DNA in plasma.

    PubMed

    Mengelle, C; Sandres-Sauné, K; Mansuy, J-M; Haslé, C; Boineau, J; Izopet, J

    2016-06-01

    Completely automated systems for monitoring CMV DNA in plasma samples are now available. The objective was to evaluate the analytical and clinical performance of the VERIS™/MDx System CMV Assay®. Analytical performance was assessed using quantified quality controls. Clinical performance was assessed by comparison with the COBAS® Ampliprep™/COBAS® Taqman CMV test using 169 plasma samples that had tested positive with the in-house technique in whole blood. The specificity of the VERIS™/MDx System CMV Assay® was 99% [95% CI: 97.7-100]. Intra-assay reproducibilities were 0.03, 0.04, 0.05 and 0.04 log10 IU/ml (means 2.78, 3.70, 4.64 and 5.60 log10 IU/ml) for expected values of 2.70, 3.70, 4.70 and 5.70 log10 IU/ml. Inter-assay reproducibilities were 0.12 and 0.08 (means 6.30 and 2.85 log10 IU/ml) for expected values of 6.28 and 2.80 log10 IU/ml. The lower limit of detection was 14.6 IU/ml, and the assay was linear from 2.34 to 5.58 log10 IU/ml. Results for the positive samples were concordant (r=0.71, p<0.0001; slope of Deming regression 0.79 [95% CI: 0.56-1.57] and y-intercept 0.79 [95% CI: 0.63-0.95]). The VERIS™/MDx System CMV Assay® detected 18 more positive samples than did the COBAS® Ampliprep™/COBAS® Taqman CMV test, and the mean virus load was higher (by 0.41 log10 IU/ml). Patient monitoring of 68 samples collected from 17 immunosuppressed patients showed similar trends between the two assays. As a secondary question, virus loads detected by the VERIS™/MDx System CMV Assay® were compared with those of the in-house procedure on whole blood. Results were similar between the two assays (-0.09 log10 IU/ml), as were the patient monitoring trends. The performance of the VERIS™/MDx System CMV Assay® supports its routine use in monitoring CMV DNA loads in plasma samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Store-and-feedforward adaptive gaming system for hand-finger motion tracking in telerehabilitation.

    PubMed

    Lockery, Daniel; Peters, James F; Ramanna, Sheela; Shay, Barbara L; Szturm, Tony

    2011-05-01

    This paper presents a telerehabilitation system that encompasses a webcam and a store-and-feedforward adaptive gaming system for tracking finger-hand movement of patients during local and remote therapy sessions. Gaming-event signals and webcam images are recorded as part of a gaming session and then forwarded to an online healthcare content management system (CMS) that separates incoming information into individual patient records. The CMS makes it possible for clinicians to log in remotely and review gathered data using online reports that are provided to help with signal and image analysis using various numerical measures and plotting functions. Signals from a six-degree-of-freedom magnetic motion tracker (MMT) provide a basis for video-game sprite control. The MMT provides a path for motion signals between common objects manipulated by a patient and a computer game. During a therapy session, a webcam that captures images of the hand, together with a number of performance metrics, provides insight into the quality, efficiency, and skill of the patient's movements.

  7. Event-Driven Simulation and Analysis of an Underwater Acoustic Local Area Network

    DTIC Science & Technology

    2010-06-01

    Successful number of data packets % b. PSUP = Successful number of Utility packets % c. PSB = Successful number of byte Tx. % d. PSPRT = Number of sub...g. PFU = Number of failed utilities Tx failures with time log of failure % h. PTO = Number of Time-outs 55 function [PSDP,PSUP, PSB ,PSPRT,PFP,PFSP...transmitted PSB = 0 ; % Number of Bytes transmitted PSPRT = 0; % Number of sub-packets retransmitted PFP = 0; % Number of failed packets event PFSP

  8. Association between circulating fibroblast growth factor 21 and mortality in end-stage renal disease.

    PubMed

    Kohara, Marina; Masuda, Takahiro; Shiizaki, Kazuhiro; Akimoto, Tetsu; Watanabe, Yuko; Honma, Sumiko; Sekiguchi, Chuji; Miyazawa, Yasuharu; Kusano, Eiji; Kanda, Yoshinobu; Asano, Yasushi; Kuro-O, Makoto; Nagata, Daisuke

    2017-01-01

    Fibroblast growth factor 21 (FGF21) is an endocrine factor that regulates glucose and lipid metabolism. Circulating FGF21 predicts cardiovascular events and mortality in type 2 diabetes mellitus, including early-stage chronic kidney disease, but its impact on clinical outcomes in end-stage renal disease (ESRD) patients remains unclear. This study enrolled 90 ESRD patients receiving chronic hemodialysis who were categorized into low- and high-FGF21 groups by the median value. We investigated the association between circulating FGF21 levels and cardiovascular events and mortality during a median follow-up period of 64 months. A Kaplan-Meier analysis showed that the mortality rate was significantly higher in the high-FGF21 group than in the low-FGF21 group (28.3% vs. 9.1%, log-rank, P = 0.034), while the rate of cardiovascular events did not significantly differ between the two groups (30.4% vs. 22.7%, log-rank, P = 0.312). In adjusted multivariable Cox models, a high FGF21 level was an independent predictor of all-cause mortality (hazard ratio: 3.98; 95% confidence interval: 1.39-14.27, P = 0.009). Higher circulating FGF21 levels were associated with a higher mortality rate, but not with cardiovascular events, in patients with ESRD, suggesting that circulating FGF21 levels serve as a predictive marker for mortality in these subjects.

  9. Fermilab Recycler Ring BPM Upgrade Based on Digital Receiver Technology

    NASA Astrophysics Data System (ADS)

    Webber, R.; Crisp, J.; Prieto, P.; Voy, D.; Briegel, C.; McClure, C.; West, R.; Pordes, S.; Mengel, M.

    2004-11-01

    Electronics for the 237 BPMs in the Fermilab Recycler Ring have been upgraded from a log-amplifier based system to a commercially produced digitizer/digital down converter based system. The hardware consists of a pre-amplifier connected to a split-plate BPM, an analog differential receiver-filter module and an 8-channel 80-MHz digital down converter VME board. The system produces position and intensity with a dynamic range of 30 dB and a resolution of ±10 microns. The position measurements are made on 2.5-MHz bunched beam and on barrier buckets of the unbunched beam. The digital receiver system operates in one of six different signal processing modes that include 2.5-MHz average, 2.5-MHz bunch-by-bunch, 2.5-MHz narrow band, unbunched average, unbunched head/tail and 89-kHz narrow band. Receiver data are acquired on any of up to sixteen clock events related to Recycler beam transfers and other machine activities. Data from the digital receiver board are transferred to the front-end CPU for position and intensity computation on an on-demand basis through the VME bus. Data buffers are maintained for each of the acquisition events and support flash, closed-orbit and turn-by-turn measurements. A calibration system provides evaluation of the BPM signal path and application programs.
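The core of a digital down converter is mixing the digitized signal with a complex local oscillator and low-pass filtering the product. The sketch below uses the Recycler's 80-MHz sampling and 2.5-MHz bunch frequency from the abstract; the boxcar filter and single-channel layout are simplifications for illustration, not the VME board's actual filter chain:

```python
import cmath
import math

def digital_down_convert(samples, f_lo, f_sample, navg):
    """Mix real samples with a complex local oscillator at f_lo,
    then boxcar-average as a crude low-pass filter."""
    mixed = [s * cmath.exp(-2j * math.pi * f_lo * n / f_sample)
             for n, s in enumerate(samples)]
    return sum(mixed[:navg]) / navg

f_s = 80e6   # 80-MHz digitizer, as in the Recycler BPM upgrade
f_b = 2.5e6  # 2.5-MHz bunched-beam frequency
n = 320      # ten full beam cycles (32 samples per cycle)
samples = [math.cos(2.0 * math.pi * f_b * k / f_s) for k in range(n)]
baseband = digital_down_convert(samples, f_b, f_s, n)
print(abs(baseband))  # close to 0.5, half the unit cosine's amplitude
```

From two such baseband amplitudes, one per BPM plate, a position would be formed from the difference-over-sum of the plate signals.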

  10. Atmospheric neutrino observations in the MINOS far detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, John Derek

    2007-09-01

    This thesis presents the results of atmospheric neutrino observations from a 12.23 ktyr exposure of the 5.42 kt MINOS Far Detector between 1 August 2003 and 1 March 2006. The separation of atmospheric neutrino events from the large background of cosmic muon events is discussed. A total of 277 candidate contained-vertex ν_μ/ν̄_μ CC data events are observed, with an expectation of 354.4 ± 47.4 events in the absence of neutrino oscillations. A total of 182 events have clearly identified directions: 77 data events are identified as upward going and 105 as downward going. The ratio between the measured and expected up/down ratios is R(data, u/d)/R(MC, u/d) = 0.72 +0.13/−0.11 (stat.) ± 0.04 (sys.). This is 2.1σ away from the expectation for no oscillations. A total of 167 data events have clearly identified charge: 112 are identified as ν_μ events and 55 as ν̄_μ events. This is the largest sample of charge-separated contained-vertex atmospheric neutrino interactions so far observed. The ratio between the measured and expected ν̄_μ/ν_μ ratios is R(data, ν̄/ν)/R(MC, ν̄/ν) = 0.93 +0.19/−0.15 (stat.) ± 0.12 (sys.). This is consistent with ν_μ and ν̄_μ having the same oscillation parameters. Bayesian methods were used to generate a log(L/E) value for each event. A maximum likelihood analysis is used to determine the allowed regions for the oscillation parameters Δm²₃₂ and sin²2θ₂₃. The likelihood function uses the uncertainty in log(L/E) to bin events in order to extract as much information from the data as possible. This fit rejects the null-oscillation hypothesis at the 98% confidence level. A fit to independent ν_μ and ν̄_μ oscillations assuming maximal mixing for both is also performed. The projected sensitivity after an exposure of 25 ktyr is also discussed.

  11. Control of Cryptosporidium with wastewater treatment to prevent its proliferation in the water cycle.

    PubMed

    Suwa, M; Suzuki, Y

    2003-01-01

    The outbreak of cryptosporidiosis in 1996 in Japan is thought to have been enlarged by the proliferation of Cryptosporidium in the water cycle from wastewater to drinking water through the river system. This experience showed that wastewater systems must be able to remove Cryptosporidium oocysts effectively. The oocyst-removal efficiencies of wastewater treatment processes were investigated using pilot plants receiving municipal wastewater. An activated sludge process and a following sand filter showed removal efficiencies of 2 log and 0.5 log, respectively. Poly-aluminium chloride dosing improved the efficiencies by 3 log for the activated sludge process and by 2 log for the sand filter. Chemical precipitation of raw wastewater with poly-aluminium chloride could achieve 1 to 3 log removal depending on the coagulant concentration.
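Since each stage divides the remaining oocyst concentration by 10 raised to its log-removal value, stage credits add across a treatment train; using the baseline pilot-plant figures above:

```python
def total_log_removal(stage_logs):
    """Overall log removal of a treatment train: stage credits add,
    because each stage divides the remaining concentration by 10**LRV."""
    return sum(stage_logs)

# Baseline figures from the pilot-plant study: activated sludge (2 log)
# followed by sand filtration (0.5 log)
print(total_log_removal([2.0, 0.5]))  # 2.5
```

A 2.5-log train thus leaves roughly 1 oocyst in 316 (10**2.5) of the influent count.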

  12. Singular and combined effects of blowdown, salvage logging, and wildfire on forest floor and soil mercury pools.

    PubMed

    Mitchell, Carl P J; Kolka, Randall K; Fraver, Shawn

    2012-08-07

    A number of factors influence the amount of mercury (Hg) in forest floors and soils, including deposition, volatile emission, leaching, and disturbances such as fire. Currently the impact on soil Hg pools from other widespread forest disturbances such as blowdown and management practices like salvage logging are unknown. Moreover, ecological and biogeochemical responses to disturbances are generally investigated within a single-disturbance context, with little currently known about the impact of multiple disturbances occurring in rapid succession. In this study we capitalize on a combination of blowdown, salvage logging and fire events in the sub-boreal region of northern Minnesota to assess both the singular and combined effects of these disturbances on forest floor and soil total Hg concentrations and pools. Although none of the disturbance combinations affected Hg in mineral soil, we did observe significant effects on both Hg concentrations and pools in the forest floor. Blowdown increased the mean Hg pool in the forest floor by 0.76 mg Hg m⁻² (223%). Salvage logging following blowdown created conditions leading to a significantly more severe forest floor burn during wildfire, which significantly enhanced Hg emission. This sequence of combined events resulted in a mean loss of approximately 0.42 mg Hg m⁻² (68% of pool) from the forest floor, after conservatively accounting for potential losses via enhanced soil leaching and volatile emissions between the disturbance and sampling dates. Fire alone or blowdown followed by fire did not significantly affect the total Hg concentrations or pools in the forest floor. Overall, unexpected consequences for soil Hg accumulation and, by extension, atmospheric Hg emission and risk to aquatic biota, may result when combined impacts are considered in addition to singular forest floor and soil disturbances.

  13. Epileptic Seizure Detection with Log-Euclidean Gaussian Kernel-Based Sparse Representation.

    PubMed

    Yuan, Shasha; Zhou, Weidong; Wu, Qi; Zhang, Yanli

    2016-05-01

    Epileptic seizure detection plays an important role in the diagnosis of epilepsy and in reducing the massive workload of reviewing electroencephalography (EEG) recordings. In this work, a novel algorithm is developed to detect seizures in long-term EEG recordings employing log-Euclidean Gaussian kernel-based sparse representation (SR). Unlike the traditional SR for vector data in Euclidean space, the log-Euclidean Gaussian kernel-based SR framework is proposed for seizure detection in the space of symmetric positive definite (SPD) matrices, which form a Riemannian manifold. Since the Riemannian manifold is nonlinear, the log-Euclidean Gaussian kernel function is applied to embed it into a reproducing kernel Hilbert space (RKHS) for performing SR. The EEG signals of all channels are divided into epochs, and the SPD matrices representing EEG epochs are generated by covariance descriptors. Then, the testing samples are sparsely coded over a dictionary composed of training samples utilizing log-Euclidean Gaussian kernel-based SR. The classification of testing samples is achieved by computing the minimal reconstruction residuals. The proposed method is evaluated on the Freiburg EEG dataset of 21 patients and shows notable performance on both epoch-based and event-based assessments. Moreover, this method handles multiple channels of EEG recordings synchronously, which is faster and more efficient than traditional seizure detection methods.
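
    The kernel at the heart of this approach can be sketched in a few lines. A minimal illustration, assuming the log-Euclidean Gaussian kernel k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2*sigma^2)) over SPD covariance descriptors; the channel count, epoch length, and sigma below are hypothetical, not the paper's settings:

```python
import numpy as np

def spd_log(M):
    """Matrix logarithm of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.log(w)) @ V.T

def log_euclidean_kernel(X, Y, sigma=1.0):
    """Log-Euclidean Gaussian kernel between two SPD matrices:
    exp(-||log(X) - log(Y)||_F^2 / (2 sigma^2))."""
    d = spd_log(X) - spd_log(Y)
    return np.exp(-np.linalg.norm(d, "fro") ** 2 / (2 * sigma ** 2))

# Covariance descriptors of two hypothetical multichannel EEG epochs
rng = np.random.default_rng(0)
epoch_a = rng.standard_normal((6, 256))      # 6 channels x 256 samples
epoch_b = rng.standard_normal((6, 256))
Ca = np.cov(epoch_a) + 1e-6 * np.eye(6)      # jitter keeps the matrix strictly SPD
Cb = np.cov(epoch_b) + 1e-6 * np.eye(6)

print(log_euclidean_kernel(Ca, Ca))          # identical epochs give 1.0
print(log_euclidean_kernel(Ca, Cb))          # similarity in (0, 1]
```

    In the paper's pipeline, a kernel of this form populates the Gram matrix over which testing epochs are sparsely coded against the training dictionary.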

  14. Online guideline assist in intensive care medicine--is the login-authentication a sufficient trigger for reminders?

    PubMed

    Röhrig, Rainer; Meister, Markus; Michel-Backofen, Achim; Sedlmayr, Martin; Uphus, Dirk; Katzer, Christian; Rose, Thomas

    2006-01-01

    Rising cost pressure due to the implementation of the DRG system and quality assurance lead to an increased use of therapy standards and standard operating procedures (SOPs) in intensive care medicine. The intention of the German Scientific Society-supported project "OLGA" (Online Guideline Assist) is to develop a prototype of a knowledge-based system supporting physicians of an intensive care unit in recognizing the indication for and selecting a specific guideline or SOP. While the response of the prototype to user entries can be displayed as a signal on the workstation itself, the location and time for a reminder of scheduled or missed procedures or reactions to imported information is a difficult issue. One possible approach to this task is the display of unacknowledged reminders or recommendations while logging on to a system. The objective of this study is to analyse the user behaviour of the physicians working on the surgical intensive care unit to decide whether the login authentication is a sufficient trigger for clinical reminding. The surgical intensive care unit examined in this study comprises 14 beds. Medical care is provided by physicians working in shifts 24 hours a day, 7 days a week, with two anaesthetists at a time and an additional senior consultant during daytime. The entire documentation (examinations, medication, orders, care) is performed using the patient data management system ICUData. The authentication process of the physicians was logged and analysed. Throughout the observation period from December 13th 2005 to January 11th 2006, 3563 physician logins were counted in total. The mean span between logins was 11.3 minutes (SD 14.4), the median 7 minutes. The 75% centile was 14 minutes, the 95% centile 38 min. Intervals greater than 60 minutes occurred on 75% of the days, and intervals greater than 90 minutes on 25%. It seems reasonable that reminders sent during authentication are able to enforce workflow compliance. 
It is possible to send notifications caused by external events to the physician depending on the importance of the event. Serious events with high urgency should be reliably passed on using wireless pager or handheld technology. After the implementation of the guideline assist prototype, further investigation is needed to monitor changes in authentication behaviour and reactions to the guideline advisory. This is also required to investigate the influence of the unit's size, medical specialty, and actual ward workload.
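
    The interval statistics reported above are straightforward to reproduce from raw authentication timestamps. A minimal sketch with hypothetical login times (the study drew these from the PDMS authentication log, not shown here):

```python
import numpy as np

# Hypothetical login timestamps in minutes since shift start
logins = np.array([0, 4, 11, 13, 25, 32, 70, 75, 90, 128], dtype=float)

gaps = np.diff(logins)                       # spans between successive logins
mean, sd = gaps.mean(), gaps.std(ddof=1)
median = np.median(gaps)
p75, p95 = np.percentile(gaps, [75, 95])

print(f"mean {mean:.1f} min (SD {sd:.1f}), median {median:.1f} min")
print(f"75% centile {p75:.1f} min, 95% centile {p95:.1f} min")
```

    The long right tail (mean well above median, as in the study's 11.3 vs. 7 minutes) is what makes the centiles, rather than the mean, the relevant measure of how long a reminder might wait for the next login.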

  15. A graphical automated detection system to locate hardwood log surface defects using high-resolution three-dimensional laser scan data

    Treesearch

    Liya Thomas; R. Edward Thomas

    2011-01-01

    We have developed an automated defect detection system and a state-of-the-art Graphic User Interface (GUI) for hardwood logs. The algorithm identifies defects at least 0.5 inch high and at least 3 inches in diameter on barked hardwood log and stem surfaces. To summarize defect features and to build a knowledge base, hundreds of defects were measured, photographed, and...

  16. Transaction Logging.

    ERIC Educational Resources Information Center

    Jones, S.; And Others

    1997-01-01

    Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…

  17. DIY soundcard based temperature logging system. Part II: applications

    NASA Astrophysics Data System (ADS)

    Nunn, John

    2016-11-01

    This paper demonstrates some simple applications of how temperature logging systems may be used to monitor simple heat experiments, and how the data obtained can be analysed to get some additional insight into the physical processes.

  18. Software systems for operation, control, and monitoring of the EBEX instrument

    NASA Astrophysics Data System (ADS)

    Milligan, Michael; Ade, Peter; Aubin, François; Baccigalupi, Carlo; Bao, Chaoyun; Borrill, Julian; Cantalupo, Christopher; Chapman, Daniel; Didier, Joy; Dobbs, Matt; Grainger, Will; Hanany, Shaul; Hillbrand, Seth; Hubmayr, Johannes; Hyland, Peter; Jaffe, Andrew; Johnson, Bradley; Kisner, Theodore; Klein, Jeff; Korotkov, Andrei; Leach, Sam; Lee, Adrian; Levinson, Lorne; Limon, Michele; MacDermid, Kevin; Matsumura, Tomotake; Miller, Amber; Pascale, Enzo; Polsgrove, Daniel; Ponthieu, Nicolas; Raach, Kate; Reichborn-Kjennerud, Britt; Sagiv, Ilan; Tran, Huan; Tucker, Gregory S.; Vinokurov, Yury; Yadav, Amit; Zaldarriaga, Matias; Zilic, Kyle

    2010-07-01

    We present the hardware and software systems implementing autonomous operation, distributed real-time monitoring, and control for the EBEX instrument. EBEX is a NASA-funded balloon-borne microwave polarimeter designed for a 14 day Antarctic flight that circumnavigates the pole. To meet its science goals the EBEX instrument autonomously executes several tasks in parallel: it collects attitude data and maintains pointing control in order to adhere to an observing schedule; tunes and operates up to 1920 TES bolometers and 120 SQUID amplifiers controlled by as many as 30 embedded computers; coordinates and dispatches jobs across an onboard computer network to manage this detector readout system; logs over 3 GiB/hour of science and housekeeping data to an onboard disk storage array; responds to a variety of commands and exogenous events; and downlinks multiple heterogeneous data streams representing a selected subset of the total logged data. Most of the systems implementing these functions have been tested during a recent engineering flight of the payload, and have proven to meet the target requirements. The EBEX ground segment couples uplink and downlink hardware to a client-server software stack, enabling real-time monitoring and command responsibility to be distributed across the public internet or other standard computer networks. Using the emerging dirfile standard as a uniform intermediate data format, a variety of front end programs provide access to different components and views of the downlinked data products. This distributed architecture was demonstrated operating across multiple widely dispersed sites prior to and during the EBEX engineering flight.

  19. Defects in Hardwood Veneer Logs: Their Frequency and Importance

    Treesearch

    E.S. Harrar

    1954-01-01

    Most southern hardwood veneer and plywood plants have some method of classifying logs by grade to control the purchase price paid for logs bought on the open market. Such log-grading systems have been developed by experience and are dependent to a large extent upon the ability of the grader and his knowledge of veneer grades and yields required for the specific product...

  20. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.
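
    The VSM step described above reduces to row-normalising term vectors and taking dot products. A minimal sketch; the toy term-frequency vectors are hypothetical, not NTRS metadata:

```python
import numpy as np

def top_k_recommendations(doc_vectors, query_idx, k=10):
    """Rank documents by cosine similarity to one document's metadata vector."""
    V = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
    sims = V @ V[query_idx]                   # cosine similarity to the query doc
    order = np.argsort(-sims)                 # most similar first
    return [int(i) for i in order if i != query_idx][:k]

# Toy term-frequency vectors for five hypothetical report metadata records
docs = np.array([
    [2, 0, 1, 0],
    [2, 0, 1, 1],
    [0, 3, 0, 1],
    [0, 2, 1, 1],
    [1, 0, 0, 2],
], dtype=float)

print(top_k_recommendations(docs, query_idx=0, k=2))  # → [1, 4]
```

    The log-analysis alternative replaces the similarity matrix with co-retrieval counts from server logs; the ranking step is otherwise the same.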

  1. User Evaluation of the NASA Technical Report Server Recommendation Service

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Bollen, Johan; Calhoun, JoAnne R.; Mackey, Calvin E.

    2004-01-01

    We present the user evaluation of two recommendation server methodologies implemented for the NASA Technical Report Server (NTRS). One methodology for generating recommendations uses log analysis to identify co-retrieval events on full-text documents. For comparison, we used the Vector Space Model (VSM) as the second methodology. We calculated cosine similarities and used the top 10 most similar documents (based on metadata) as 'recommendations'. We then ran an experiment with NASA Langley Research Center (LaRC) staff members to gather their feedback on which method produced the most 'quality' recommendations. We found that in most cases VSM outperformed log analysis of co-retrievals. However, analyzing the data revealed the evaluations may have been structurally biased in favor of the VSM generated recommendations. We explore some possible methods for combining log analysis and VSM generated recommendations and suggest areas of future work.

  2. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or, just as importantly, not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  3. Smoothing spline ANOVA frailty model for recurrent event data.

    PubMed

    Du, Pang; Jiang, Yihua; Wang, Yuedong

    2011-12-01

    Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of the parameter update and/or increasing the MCMC sample size along iterations. A model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate their use through the analysis of bladder tumor data. © 2011, The International Biometric Society.

  4. Analysis of alcohol-based hand sanitizer delivery systems: efficacy of foam, gel, and wipes against influenza A (H1N1) virus on hands.

    PubMed

    Larson, Elaine L; Cohen, Bevin; Baxter, Kathleen A

    2012-11-01

    Minimal research has been published evaluating the effectiveness of hand hygiene delivery systems (ie, rubs, foams, or wipes) at removing viruses from hands. The purposes of this study were to determine the effect of several alcohol-based hand sanitizers in removing influenza A (H1N1) virus, and to compare the effectiveness of foam, gel, and hand wipe products. Hands of 30 volunteers were inoculated with H1N1 and randomized to treatment with foam, gel, or hand wipe applied to half of each volunteer's finger pads. The log(10) counts of each subject's treated and untreated finger pads were averaged. Log(10) reductions were calculated from these differences and averaged within treatment group. Between-treatment analysis compared changes from the untreated finger pads using analysis of covariance with treatment as a factor and the average log(10) untreated finger pads as the covariate. Log(10) counts on control finger pads were 2.7-5.3 log(10) of the 50% infectious dose for tissue culture (TCID(50)/0.1 mL) (mean, 3.8 ± 0.5 log(10) TCID(50)/0.1 mL), and treated finger pad counts for all test products were 0.5-1.9 log(10) TCID(50)/0.1 mL (mean, 0.53 ± 0.17 log(10) TCID(50)/0.1 mL). Treatments with all products resulted in a significant reduction in viral titers (>3 logs) at their respective exposure times, and the reductions were statistically comparable. All 3 delivery systems (foam, gel, and wipe) produced significantly reduced viral counts on hands. Copyright © 2012 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
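
    The log10-reduction arithmetic behind these figures is simple to illustrate. A sketch with hypothetical paired titers (log10 TCID50/0.1 mL), not the study's data:

```python
import numpy as np

# Hypothetical per-subject mean titers: untreated vs. treated finger pads
untreated = np.array([3.9, 4.1, 3.5, 3.8, 4.0])
treated = np.array([0.5, 0.7, 0.4, 0.6, 0.5])

# Per-subject log10 reduction (a difference of logs is a ratio of titers),
# then averaged within the treatment group
reductions = untreated - treated
print(f"mean log10 reduction: {reductions.mean():.2f}")  # > 3 logs here
```

    A mean reduction above 3 logs, as reported for all three delivery systems, corresponds to removing more than 99.9% of recoverable virus.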

  5. Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica

    NASA Astrophysics Data System (ADS)

    Singh, Upendra K.

    2011-12-01

    The analysis of well logging data plays a key role in the exploration and development of hydrocarbon reservoirs. Various well log parameters, such as porosity, gamma ray, density, transit time, and resistivity, help in classifying strata and estimating the physical, electrical, and acoustical properties of the subsurface lithology. Strong, conspicuous changes in some log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties, which aids classification. However, some substrata show only moderate values in the respective log parameters, making the kind of strata difficult to identify from the standard variability ranges of the log parameters and visual inspection alone. The complexity increases further with the number of sensors involved. An attempt is made to identify the kinds of stratigraphy from well logs over the Prydz Bay basin, East Antarctica, using a fuzzy inference system. A model is built from a few data sets of known stratigraphy, and the model is then used to infer the lithology of a borehole from geophysical logs not used in simulation. The fuzzy-based algorithm is trained, validated, and tested on well log data, and finally identifies the formation lithology of a hydrocarbon reservoir system in the study area. The effectiveness of this technique is demonstrated by the analysis of the results against actual lithologs and coring data of ODP Leg 188. The fuzzy results show that the training performance is 82.95%, while the prediction ability is 87.69%. The fuzzy results are very encouraging, and the model is able to decipher even thin seams and other strata from geophysical logs. The results identify a significant sand formation in the depth range 316.0-341.0 m, where core recovery is incomplete.
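
    The core of a fuzzy inference system of this kind can be sketched compactly: fuzzify each log reading with membership functions, combine memberships with a min-rule base, and take the strongest rule. The membership breakpoints and the two-rule base below are illustrative only, not the paper's calibrated system:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(gamma_api, dt_us_ft):
    """Toy fuzzy classifier over two logs: gamma ray (API) and sonic transit
    time (us/ft).  Rule strength is the min of the antecedent memberships."""
    gr = {"low": tri(gamma_api, -1, 20, 60), "high": tri(gamma_api, 40, 100, 161)}
    dt = {"fast": tri(dt_us_ft, 39, 55, 80), "slow": tri(dt_us_ft, 60, 100, 141)}
    rules = {
        "sand": min(gr["low"], dt["fast"]),    # clean, fast formation
        "shale": min(gr["high"], dt["slow"]),  # radioactive, slow formation
    }
    return max(rules, key=rules.get), rules

label, strengths = classify(gamma_api=25, dt_us_ft=58)
print(label)  # → sand
```

    A trained system would tune these membership functions against the known-stratigraphy wells before applying the rules to the unseen borehole.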

  6. Pegaptanib: choroidal neovascularization in patients with age-related macular degeneration and previous arterial thromboembolic events.

    PubMed

    Battaglia Parodi, Maurizio; Di Bartolo, Emanuele; Brue, Claudia; Cappello, Ezio; Furino, Claudio; Giuffrida, Sebastiano; Imparato, Manuela; Reibaldi, Michele

    2018-01-01

    To evaluate the efficacy and the rate of side effects of the pegylated aptamer pegaptanib in the treatment of patients with choroidal neovascularization (CNV) secondary to age-related macular degeneration (AMD) and a history of previous arterial thromboembolic events (ATEs). Twenty-three eyes of 23 patients with subfoveal CNV due to AMD and cerebrovascular accidents (n = 12) and myocardial infarction (n = 11) in the previous 6 months received intravitreal pegaptanib 0.3 mg according to a pro re nata regimen and were followed for 12 months. The paired Student t test was used to evaluate mean changes in best-corrected visual acuity (BCVA; primary outcome measure) and central foveal thickness (CFT). The mean patient age was 71.5 ± 4.6 years; there were 14 women and 9 men. The CNV was type 1, 2, and 3 in 18, 3, and 2 eyes, respectively. The mean BCVA improved from 0.67 ± 0.23 logMAR at baseline to 0.52 ± 0.31 logMAR at the end of 12-month follow-up (p = 0.044). Thirty-five percent of patients achieved ≥3 Early Treatment Diabetic Retinopathy Study lines improvement at 12 months. Mean CFT at baseline (381 ± 111 µm) decreased to 304 ± 82 µm at 12 months (p = 0.008). Patients received a mean of 4.3 ± 1.3 (range 3-7) injections. No systemic or ocular side effects occurred; no patient experienced further ATEs. Intravitreal pegaptanib can be considered a viable treatment option for patients with AMD-related CNV who are at high risk of ATEs.

  7. Meta-Analysis of the Reduction of Norovirus and Male-Specific Coliphage Concentrations in Wastewater Treatment Plants.

    PubMed

    Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R

    2015-07-01

    Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  8. Extreme temperatures, foundation species, and abrupt ecosystem change: an example from an iconic seagrass ecosystem.

    PubMed

    Thomson, Jordan A; Burkholder, Derek A; Heithaus, Michael R; Fourqurean, James W; Fraser, Matthew W; Statton, John; Kendrick, Gary A

    2015-04-01

    Extreme climatic events can trigger abrupt and often lasting change in ecosystems via the reduction or elimination of foundation (i.e., habitat-forming) species. However, while the frequency/intensity of extreme events is predicted to increase under climate change, the impact of these events on many foundation species and the ecosystems they support remains poorly understood. Here, we use the iconic seagrass meadows of Shark Bay, Western Australia--a relatively pristine subtropical embayment whose dominant, canopy-forming seagrass, Amphibolis antarctica, is a temperate species growing near its low-latitude range limit--as a model system to investigate the impacts of extreme temperatures on ecosystems supported by thermally sensitive foundation species in a changing climate. Following an unprecedented marine heat wave in late summer 2010/11, A. antarctica experienced catastrophic (>90%) dieback in several regions of Shark Bay. Animal-borne video footage taken from the perspective of resident, seagrass-associated megafauna (sea turtles) revealed severe habitat degradation after the event compared with a decade earlier. This reduction in habitat quality corresponded with a decline in the health status of largely herbivorous green turtles (Chelonia mydas) in the 2 years following the heat wave, providing evidence of long-term, community-level impacts of the event. Based on these findings, and similar examples from diverse ecosystems, we argue that a generalized framework for assessing the vulnerability of ecosystems to abrupt change associated with the loss of foundation species is needed to accurately predict ecosystem trajectories in a changing climate. This includes seagrass meadows, which have received relatively little attention in this context. Novel research and monitoring methods, such as the analysis of habitat and environmental data from animal-borne video and data-logging systems, can make an important contribution to this framework. 
© 2014 John Wiley & Sons Ltd.

  9. Urine Injury Biomarkers and Risk of Adverse Outcomes in Recipients of Prevalent Kidney Transplants: The Folic Acid for Vascular Outcome Reduction in Transplantation Trial

    PubMed Central

    Carpenter, Myra A.; Weiner, Daniel E.; Levey, Andrew S.; Pfeffer, Marc; Kusek, John W.; Cai, Jianwen; Hunsicker, Lawrence G.; Park, Meyeon; Bennett, Michael; Liu, Kathleen D.; Hsu, Chi-yuan

    2016-01-01

    Recipients of kidney transplants (KTR) are at increased risk for cardiovascular events, graft failure, and death. It is unknown whether urine kidney injury biomarkers are associated with poor outcomes among KTRs. We conducted a post hoc analysis of the Folic Acid for Vascular Outcome Reduction in Transplantation (FAVORIT) Trial using a case-cohort study design, selecting participants with adjudicated cardiovascular events, graft failure, or death. Urine neutrophil gelatinase–associated lipocalin (NGAL), kidney injury molecule-1 (KIM-1), IL-18, and liver–type fatty acid binding protein (L-FABP) were measured in spot urine samples and standardized to urine creatinine concentration. We adjusted for demographics, cardiovascular risk factors, eGFR, and urine albumin-to-creatinine ratio. Patients had 291 cardiovascular events, 257 graft failure events, and 359 deaths. Each log increase in urine NGAL/creatinine independently associated with a 24% greater risk of cardiovascular events (adjusted hazard ratio [aHR], 1.24; 95% confidence interval [95% CI], 1.06 to 1.45), a 40% greater risk of graft failure (aHR, 1.40; 95% CI, 1.16 to 1.68), and a 44% greater risk of death (aHR, 1.44; 95% CI, 1.26 to 1.65). Urine KIM-1/creatinine and IL-18/creatinine independently associated with greater risk of death (aHR, 1.29; 95% CI, 1.03 to 1.61 and aHR, 1.25; 95% CI, 1.04 to 1.49 per log increase, respectively) but not with risk of cardiovascular events or graft failure. Urine L-FABP did not associate with any study outcomes. In conclusion, among prevalent KTRs, higher urine NGAL, KIM-1, and IL-18 levels independently and differentially associated with greater risk of adverse outcomes. PMID:26538631

  10. Validation of an Electronic System for Recording Medical Student Patient Encounters

    PubMed Central

    Nkoy, Flory L.; Petersen, Sarah; Matheny Antommaria, Armand H.; Maloney, Christopher G.

    2008-01-01

    The Liaison Committee on Medical Education requires monitoring of the students’ clinical experiences. Student logs, typically used for this purpose, have a number of limitations. We used an electronic system called Patient Tracker to passively generate student encounter data. The data contained in Patient Tracker was compared to the information reported on student logs and data abstracted from the patients’ charts. Patient Tracker identified 30% more encounters than the student logs. Compared to the student logs, Patient Tracker contained a higher average number of diagnoses per encounter (2.28 vs. 1.03, p<0.01). The diagnostic data contained in Patient Tracker was also more accurate under 4 different definitions of accuracy. Only 1.3% (9/677) of diagnoses in Patient Tracker vs. 16.9% (102/601) diagnoses in the logs could not be validated in patients’ charts (p<0.01). Patient Tracker is a more effective and accurate tool for documenting student clinical encounters than the conventional student logs. PMID:18999155

  11. Early presence of anti-angiogenesis-related adverse events as a potential biomarker of antitumor efficacy in metastatic gastric cancer patients treated with apatinib: a cohort study.

    PubMed

    Liu, Xinyang; Qin, Shukui; Wang, Zhichao; Xu, Jianming; Xiong, Jianping; Bai, Yuxian; Wang, Zhehai; Yang, Yan; Sun, Guoping; Wang, Liwei; Zheng, Leizhen; Xu, Nong; Cheng, Ying; Guo, Weijian; Yu, Hao; Liu, Tianshu; Lagiou, Pagona; Li, Jin

    2017-09-05

    Reliable biomarkers of apatinib response in gastric cancer (GC) are lacking. We investigated the association between early presence of common adverse events (AEs) and clinical outcomes in metastatic GC patients. We conducted a retrospective cohort study using data on 269 apatinib-treated GC patients in two clinical trials. AEs were assessed at baseline until 28 days after the last dose of apatinib. Clinical outcomes were compared between patients with and without hypertension (HTN), proteinuria, or hand and foot syndrome (HFS) in the first 4 weeks. Time-to-event variables were assessed using Kaplan-Meier methods and Cox proportional hazard regression models. Binary endpoints were assessed using logistic regression models. Landmark analyses were performed as sensitivity analyses. Predictive model was analyzed, and risk scores were calculated to predict overall survival. Presence of AEs in the first 4 weeks was associated with prolonged median overall survival (169 vs. 103 days, log-rank p = 0.0039; adjusted hazard ratio (HR) 0.64, 95% confidence interval [CI] 0.64-0.84, p = 0.001), prolonged median progression-free survival (86.5 vs. 62 days, log-rank p = 0.0309; adjusted HR 0.69, 95% CI 0.53-0.91, p = 0.007), and increased disease control rate (54.67 vs. 32.77%; adjusted odds ratio 2.67, p < 0.001). Results remained significant in landmark analyses. The onset of any single AE or any combinations of the AEs were all statistically significantly associated with prolonged OS, except for the presence of proteinuria. An AE-based prediction model and subsequently derived scoring system showed high calibration and discrimination in predicting overall survival. Presence of HTN, proteinuria, or HFS during the first cycle of apatinib treatment was a viable biomarker of antitumor efficacy in metastatic GC patients.
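
    Time-to-event comparisons like the one above rest on the Kaplan-Meier product-limit estimator. A textbook sketch (valid when event times are untied); the follow-up data are hypothetical, not the trial's:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  times: follow-up in days;
    events: 1 = event observed, 0 = censored.  Returns (time, S(t)) pairs."""
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    surv, s = [], 1.0
    n = len(t)
    for i in range(n):
        at_risk = n - i                      # subjects still under observation
        if e[i] == 1:
            s *= (at_risk - 1) / at_risk     # drop in survival at each event
        surv.append((float(t[i]), s))
    return surv

# Hypothetical follow-up (days, event indicator) for two small groups
with_ae = kaplan_meier([90, 150, 169, 200, 240], [1, 0, 1, 1, 0])
without_ae = kaplan_meier([60, 80, 103, 120, 160], [1, 1, 1, 0, 1])
print(with_ae[-1], without_ae[-1])
```

    In the study, curves of this form for the AE-present and AE-absent groups were compared with the log-rank test, with Cox models adjusting for covariates.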

  12. Historical Evaluation of Groundwater Responses to Underground Injection Controls in an Urban Watershed

    NASA Astrophysics Data System (ADS)

    Harrison, M.; Haggerty, R.; Santelmann, M. V.

    2017-12-01

    Underground injection controls (UICs) are drywells designed to recharge stormwater to alleviate flooding events. The development of UICs affects the dynamics of the urban hydrologic setting, in which more than half of precipitation can be recharged directly into UIC systems. This study seeks to better understand how the development of UICs affects groundwater levels and streamflows. The Portland, OR metropolitan area contains well over 10,000 UICs to mitigate flooding during storm events. This study evaluates historical precipitation, streamflow, and groundwater levels from over 20 monitoring wells within a watershed in the city of Portland, OR, along with well log data for UICs. UICs within the study area are approximately 30 feet in depth and have been noted to contribute about 12% of recharge. This study evaluates the dynamics of groundwater levels in relation to the development of UICs. The results obtained from this analysis are applied to model seasonal groundwater, precipitation, and streamflow relationships within a neighborhood subcatchment.

  13. A Mixed Methods Small Pilot Study to Describe the Effects of Upper Limb Training Using a Virtual Reality Gaming System in People with Chronic Stroke

    PubMed Central

    Stockley, Rachel C.; O'Connor, Deborah A.; Smith, Phil; Moss, Sylvia; Allsop, Lizzie; Edge, Wendy

    2017-01-01

    Introduction. This small pilot study aimed to examine the feasibility of an upper limb rehabilitation system (the YouGrabber) in a community rehabilitation centre, qualitatively explore participant experiences, and describe changes after using it. Methods and Material. Chronic stroke participants attending a community rehabilitation centre in the UK were randomised to either a YouGrabber or a gym group and completed 18 training sessions over 12 weeks. The motor activity log, box and block, and fatigue severity score were administered by a blinded assessor before and after the intervention. Semistructured interviews were used to ascertain participants' views about using the YouGrabber. Results. Twelve participants (6 females) with chronic stroke were recruited. All adhered to the intervention. There were no adverse events, dropouts, or withdrawals. There were no significant differences between the YouGrabber and gym groups, although there were significant within-group improvements on the motor activity log (median change: 0.59, range: 0.2–1.25; p < 0.05) within the YouGrabber group. Participants reported that the YouGrabber was motivational but they expressed frustration with technical challenges. Conclusions. The YouGrabber appeared practical and may improve upper limb activities in people several months after stroke. Future work could examine cognition, cost effectiveness, and different training intensities. PMID:28197341

  15. Geological and geophysical analysis of Coso Geothermal Exploration Hole No. 1 (CGEH-1), Coso Hot Springs KGRA, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galbraith, R.M.

    1978-05-01

    The Coso Geothermal Exploration Hole number one (CGEH-1) was drilled in the Coso Hot Springs KGRA, California, from September 2 to December 2, 1977. Chip samples were collected at ten-foot intervals and extensive geophysical logging surveys were conducted to document the geologic character of the geothermal system as penetrated by CGEH-1. The major rock units encountered include a mafic metamorphic sequence and a leucogranite which intruded the metamorphic rocks. Only weak hydrothermal alteration was noted in these rocks. Drillhole surveys and drilling rate data indicate that the geothermal system is structurally controlled and that the drillhole itself was strongly influenced by structural zones. Water chemistry indicates that this geothermal resource is a hot-water rather than a vapor-dominated system. Several geophysical logs were employed to characterize the drillhole geology. The natural gamma and neutron porosity logs indicate gross rock type, and the acoustic logs indicate fractured rock and potentially permeable zones. A series of temperature logs run as a function of time during and after the completion of drilling were most useful in delineating the zones of maximum heat flux. Convective heat flow and temperatures greater than 350°F appear to occur only along an open fracture system encountered between depths of 1850 and 2775 feet. Temperature logs indicate a negative thermal gradient below 3000 feet.

  17. Well log characterization of natural gas-hydrates

    USGS Publications Warehouse

    Collett, Timothy S.; Lee, Myung W.

    2012-01-01

    In the last 25 years there have been significant advancements in the use of well-logging tools to acquire detailed information on the occurrence of gas hydrates in nature: whereas wireline electrical resistivity and acoustic logs were formerly used to identify gas-hydrate occurrences in wells drilled in Arctic permafrost environments, more advanced wireline and logging-while-drilling (LWD) tools are now routinely used to examine the petrophysical nature of gas-hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Resistivity- and acoustic-logging tools are the most widely used for estimating the gas-hydrate content (i.e., reservoir saturations) in various sediment types and geologic settings. Recent integrated sediment coring and well-log studies have confirmed that electrical-resistivity and acoustic-velocity data can yield accurate gas-hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log-analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. New well-logging tools designed to make directionally oriented acoustic and propagation-resistivity log measurements provide the data needed to analyze the acoustic and electrical anisotropic properties of both highly interbedded and fracture-dominated gas-hydrate reservoirs. Advancements in nuclear magnetic resonance (NMR) logging and wireline formation testing (WFT) also allow for the characterization of gas hydrate at the pore scale. Integrated NMR and formation testing studies from northern Canada and Alaska have yielded valuable insight into how gas hydrates are physically distributed in sediments and the occurrence and nature of pore fluids (i.e., free water along with clay- and capillary-bound water) in gas-hydrate-bearing reservoirs. Information on the distribution of gas hydrate at the pore scale has provided invaluable insight on the mechanisms controlling the formation and occurrence of gas hydrate in nature, along with data on gas-hydrate reservoir properties (i.e., porosities and permeabilities) needed to accurately predict gas production rates for various gas-hydrate production schemes.

  18. Target recognition of log-polar ladar range images using moment invariants

    NASA Astrophysics Data System (ADS)

    Xia, Wenze; Han, Shaokun; Cao, Jie; Yu, Haoyong

    2017-01-01

    The ladar range image has received considerable attention in the automatic target recognition field. However, previous research does not cover target recognition using log-polar ladar range images. Therefore, we construct a target recognition system based on log-polar ladar range images in this paper. In this system, combined moment invariants and a backpropagation neural network are selected as the shape descriptor and shape classifier, respectively. In order to fully analyze the effect of the log-polar sampling pattern on recognition results, several comparative experiments based on simulated and real range images are carried out. Several important conclusions are drawn: (i) if combined moments are computed directly from log-polar range images, the translation, rotation, and scaling invariance of the combined moments is lost; (ii) when the object is located in the center of the field of view, the recognition rate of log-polar range images is less sensitive to changes in the field of view; (iii) as the object position changes from the center to the edge of the field of view, the recognition performance of log-polar range images declines dramatically; (iv) log-polar range images have better noise robustness than Cartesian range images. Finally, we suggest that in real applications it is better to divide the field of view into a recognition area and a searching area.
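
For concreteness, a log-polar sampling pattern of the kind discussed can be sketched in stdlib Python. This nearest-neighbour resampling of a plain list-of-lists grid is illustrative only; the paper's actual sampling geometry and parameters are not given in the abstract:

```python
import math

def log_polar_sample(img, n_rho, n_theta):
    """Nearest-neighbour log-polar resampling of a square grid `img`
    (list of lists), centred on the middle pixel.
    Radii are spaced logarithmically from 1 pixel to the half-width."""
    h, w = len(img), len(img[0])
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    r_max = min(cx, cy)
    out = []
    for i in range(n_rho):
        # logarithmic radius: exp(log(r_max) * i/(n_rho-1)), i.e. 1 .. r_max
        r = math.exp(math.log(r_max) * i / (n_rho - 1)) if n_rho > 1 else 1.0
        row = []
        for j in range(n_theta):
            th = 2 * math.pi * j / n_theta
            x = min(max(int(round(cx + r * math.cos(th))), 0), w - 1)
            y = min(max(int(round(cy + r * math.sin(th))), 0), h - 1)
            row.append(img[y][x])
        out.append(row)
    return out
```

The point of the transform is visible in the grid itself: rings near the centre are sampled densely and the periphery coarsely, which is why recognition degrades as the object drifts from the centre of the field of view.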

  19. Synopsis of a computer program designed to interface a personal computer with the fast data acquisition system of a time-of-flight mass spectrometer

    NASA Technical Reports Server (NTRS)

    Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.

    1988-01-01

    Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.

  20. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results are explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called the Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services, or it can be logged to a file.
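
The abstract does not show PAPI's actual interface, but the disclosed approach it describes (globally unique object identifiers plus user-controlled output) can be caricatured in stdlib Python. Every name below is hypothetical, not PAPI's real API:

```python
import json
import uuid

class ProvenanceRecorder:
    """Hypothetical sketch of a disclosed-provenance recorder: each
    disclosed object gets a globally unique ID so records emitted by
    different distributed components can be joined downstream."""

    def __init__(self, component):
        self.component = component   # e.g. a workflow task name
        self.records = []            # subject-predicate-object triples

    def new_id(self):
        # component-scoped UUID: unique across hosts and processes
        return f"{self.component}:{uuid.uuid4()}"

    def disclose(self, subject, relation, obj):
        self.records.append({"s": subject, "p": relation, "o": obj})

    def to_json(self):
        # the caller decides where this goes: REST to a triple store, or a file
        return json.dumps(self.records)
```

The UUID-based identifiers stand in for the "unique object identification methods" of the abstract: because no two components can mint the same ID, triples disclosed independently can still be connected into one provenance graph.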

  1. Sedimentology and Sedimentary Dynamics of the Desmoinesian Cherokee Group, Deep Anadarko Basin, Texas Panhandle

    NASA Astrophysics Data System (ADS)

    Hu, N.; Loucks, R.; Frebourg, G.

    2015-12-01

    Understanding the spatial variability of deep-water facies is critical to deep-water research because it reveals information about the relationship between density flow processes and their resultant sedimentary sequences. The Cherokee Group in the Anadarko Basin, northeastern Texas Panhandle, provides an opportunity to investigate an icehouse-greenhouse Pennsylvanian hybrid system that well demonstrates the intricacies of vertical and lateral facies relationships in an unconfined, fan-delta-fed deep-water slope to basinal setting. The stratigraphic section ranges in thickness from 150 to 460 m. Cyclic sedimentation and foreland basin tectonics resulted in a complex stratal architecture that was sourced by multiple areas of sediment input. This investigation consists of wireline-log and core data. Five thousand wireline logs were correlated in an area of over 9500 sq km to map out six depositional sequences that are separated by major flooding events. These events are correlative over the whole study area. Six cores, which sample nearly the complete section, were described for lithofacies. Lithofacies are recognized based on depositional features and mineralogy: (1) Subarkose, (2) Lithic arkose, (3) Sandy siliciclastic conglomerate, (4) Muddy calcareous conglomerate, (5) Crinoidal packstone, (6) Ooid grainstone, (7) Peloidal grainstone, (8) Ripple-laminated mudrock, (9) Faintly laminated mudrock. The integration of isopachs of depositional sequences with the lithofacies has allowed the delineation of the spatial and temporal evolution of the slope to basin-floor system. Thin- to thick-bedded turbidites, hyperconcentrated density flow deposits (slurry beds), and debris and mud flow deposits were observed and can be used to better predict lithofacies distributions in areas that have less data control. These mixed siliciclastic and carbonate deposits can be carrier beds for hydrocarbons generated from the enclosing organic-rich (TOC ranges from 0.55 to 6.77 wt%), dysaerobic to anaerobic mudstones.

  2. Postfire logging in riparian areas.

    PubMed

    Reeves, Gordon H; Bisson, Peter A; Rieman, Bruce E; Benda, Lee E

    2006-08-01

    We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend to rebound relatively quickly, usually within a decade after a wildfire. Additionally, fire and subsequent erosion events contribute wood and coarse sediment that can create and maintain productive aquatic habitats over time. The potential effects of postfire logging in riparian areas depend on the landscape context and disturbance history of a site; however, available evidence suggests two key management implications: (1) fire in riparian areas creates conditions that may not require intervention to sustain the long-term productivity of the aquatic network and (2) protection of burned riparian areas gives priority to what is left rather than what is removed. Research is needed to determine how postfire logging in riparian areas has affected the spread of invasive species and the vulnerability of upland forests to insect and disease outbreaks and how postfire logging will affect the frequency and behavior of future fires. The effectiveness of using postfire logging to restore desired riparian structure and function is therefore unproven, but such projects are gaining interest with the departure of forest conditions from those that existed prior to timber harvest, fire suppression, and climate change. In the absence of reliable information about the potential consequences of postfire timber harvest, we conclude that providing postfire riparian zones with the same environmental protections they received before they burned is justified ecologically. Without a commitment to monitor management experiments, the effects of postfire riparian logging will remain unknown and highly contentious.

  3. Bilateral implantation of +2.5 D multifocal intraocular lens and contralateral implantation of +2.5 D and +3.0 D multifocal intraocular lenses: Clinical outcomes.

    PubMed

    Nuijts, Rudy M M A; Jonker, Soraya M R; Kaufer, Robert A; Lapid-Gortzak, Ruth; Mendicute, Javier; Martinez, Cristina Peris; Schmickler, Stefanie; Kohnen, Thomas

    2016-02-01

    To assess the clinical visual outcomes of bilateral implantation of Restor +2.5 diopter (D) multifocal intraocular lenses (IOLs) and contralateral implantation of a Restor +2.5 D multifocal IOL in the dominant eye and a Restor +3.0 D multifocal IOL in the fellow eye. Multicenter study at 8 investigative sites. Prospective, randomized, parallel-group, patient-masked, 2-arm study. This study comprised adults requiring bilateral cataract extraction followed by multifocal IOL implantation. The primary endpoint was corrected intermediate visual acuity (CIVA) at 60 cm, and the secondary endpoint was corrected near visual acuity (CNVA) at 40 cm. Both endpoints were measured 3 months after implantation with a noninferiority margin of Δ = 0.1 logMAR. In total, 103 patients completed the study (53 bilateral, 50 contralateral). At 3 months, the mean CIVA at 60 cm was 0.13 logMAR and 0.10 logMAR in the bilateral group and contralateral group, respectively (difference 0.04 logMAR), achieving noninferiority. Noninferiority was not attained for CNVA at 40 cm; mean values at 3 months for bilateral and contralateral implantation were 0.26 logMAR and 0.11 logMAR, respectively (difference 0.15 logMAR). Binocular defocus curves suggested similar performance in distance vision between the 2 groups. Treatment-emergent ocular adverse event rates were similar between the groups. Bilateral implantation of the +2.5 D multifocal IOL resulted in outcomes similar to contralateral implantation of the +2.5 D and +3.0 D multifocal IOLs for intermediate vision (60 cm), while noninferiority was not achieved for near distances (40 cm). Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  4. Tree Diameter Effects on Cost and Productivity of Cut-to-Length Systems

    Treesearch

    Matthew A. Holtzscher; Bobby L. Lanford

    1997-01-01

    Currently, there is a lack of economic information concerning cut-to-length harvesting systems. This study examined and measured the different costs of operating cut-to-length logging equipment over a range of average stand diameters at breast height. Three different cut-to-length logging systems were examined in this study. Systems included: 1) feller-buncher/manual/...

  5. Magnetic susceptibility well-logging unit with single power supply thermoregulation system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seeley, R. L.

    1985-11-05

    The magnetic susceptibility well-logging unit with single power supply thermoregulation system provides power from a single surface power supply over a well-logging cable to an integrated circuit voltage regulator system downhole. This voltage regulator system supplies regulated voltages to a temperature control system and also to a Maxwell bridge sensing unit which includes the solenoid of a magnetic susceptibility probe. The temperature control system is provided with power from the voltage regulator system and operates to permit one of several predetermined temperatures to be chosen, and then operates to maintain the solenoid of the magnetic susceptibility probe at this chosen temperature. The temperature control system responds to a temperature sensor mounted upon the probe solenoid to cause resistance heaters concentrically spaced from the probe solenoid to maintain the chosen temperature. A second temperature sensor on the probe solenoid provides a temperature signal to a temperature transmitting unit, which initially converts the sensed temperature to a representative voltage. This voltage is then converted to a representative current signal, which is transmitted by current telemetry over the well-logging cable to a surface electronic unit that reconverts the current signal to a voltage signal.

  6. Interpretation of well logs in a carbonate aquifer

    USGS Publications Warehouse

    MacCary, L.M.

    1978-01-01

    This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs.
With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh water zones.
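
The porosity calculation mentioned above, in its simplest single-mineral form, is the standard density-log relation φ = (ρma − ρb)/(ρma − ρf). A sketch, with textbook default densities rather than values from this report:

```python
def density_porosity(rho_bulk, rho_matrix=2.71, rho_fluid=1.0):
    """phi = (rho_ma - rho_b) / (rho_ma - rho_f), densities in g/cc.
    Defaults assume a limestone matrix (2.71) and fresh water (1.0);
    a dolomite matrix would use ~2.87. Crossplotting this against a
    neutron log is what resolves the limestone/dolomite mix in the text."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)
```

A measured bulk density of 2.71 g/cc in pure limestone implies zero porosity; lighter readings imply proportionally more fluid-filled pore space.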

  7. Geology of the surficial aquifer system, Dade County, Florida; lithologic logs

    USGS Publications Warehouse

    Causaras, C.R.

    1986-01-01

    The geologic framework of the surficial aquifer system in Dade County, Florida, was investigated as part of a long-term study by the USGS, in cooperation with the South Florida Water Management District, to describe the geology, hydrologic characteristics, and groundwater quality of the surficial aquifer system. Thirty-three test wells were drilled completely through the surficial aquifer system and into the underlying, relatively impermeable units of the Tamiami and Hawthorn Formations. Detailed lithologic logs were made from microscopic examination of rock cuttings and cores obtained from these wells. The logs were used to prepare geologic sections that show the lithologic variations, thickness of the lithologic units, and different geologic formations that comprise the aquifer system. (Author's abstract)

  8. Path spectra derived from inversion of source and site spectra for earthquakes in Southern California

    NASA Astrophysics Data System (ADS)

    Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.

    2017-12-01

    A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) is derived from the path term, currently represented as a simple geometric spreading and intrinsic attenuation term. Including additional physical relationships between the path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and Southern California Seismic Network to create a catalog of events of magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog of events, we investigate several collections of event region-to-station pairs, each of which share similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects, and focus on the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural-log record spectrum as the sum of its natural-log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data and the inversion model prediction (event*site spectra). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation.
We examine the residuals for our different sets independently to see how path terms differ between event-to-station collections. The path-specific information gained from this can inform development of terms for regional GMPEs, through understanding of these seismological phenomena.
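
The spectral decomposition described above (each log record spectrum as the sum of an event term and a site term, tied to a reference site) can be illustrated with a toy alternating least-squares solver in stdlib Python; the real inversion operates frequency-by-frequency over thousands of records, not on a small dense grid:

```python
def decompose(log_amp, n_iter=50):
    """Toy Andrews-style separation: log_amp[i][j] = event[i] + site[j],
    with site 0 pinned to zero as the reference site. Solved by
    alternating least squares on a fully observed event x station grid."""
    n_ev, n_st = len(log_amp), len(log_amp[0])
    event, site = [0.0] * n_ev, [0.0] * n_st
    for _ in range(n_iter):
        for i in range(n_ev):
            event[i] = sum(log_amp[i][j] - site[j] for j in range(n_st)) / n_st
        for j in range(n_st):
            site[j] = sum(log_amp[i][j] - event[i] for i in range(n_ev)) / n_ev
        ref = site[0]                      # enforce the reference-site constraint
        site = [s - ref for s in site]
        event = [e + ref for e in event]
    return event, site
```

Whatever the model event + site terms fail to explain in each record is the residual that the study interprets as a path effect.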

  9. Pharmacogenetic Association Study of Warfarin Safety Endpoints in Puerto Ricans

    PubMed Central

    Valentín, Isa I.; Rivera, Giselle; Nieves-Plaza, Mariely; Cruz, Iadelisse; Renta, Jessica Y.; Cadilla, Carmen L.; Feliu, Juan F.; Seip, Richard L.; Ruaño, Gualberto; Duconge, Jorge

    2014-01-01

    Objective This study was intended to determine the incidence rate of warfarin-related adverse events (e.g., bleeding) in Puerto Ricans and whether a genetic association between warfarin pharmacogenes and any of these adverse events was observed over the initiation period (i.e., the first 90 days of therapy). Methods We conducted an observational, retrospective cohort study of pharmacogenetic association in 122 warfarin-treated, male, Puerto Rican patients (69.9 ± 9.6 years) from the Veterans Affairs Caribbean Healthcare System (VACHS) who consented to participate. Genotyping was performed using the CYP2C9 and VKORC1 assays by Luminex. Event-free survival curves were estimated using the Kaplan–Meier method and analyzed by log-rank test. Cox regression models were constructed and hazard ratios (HR) calculated. Results Carriers of functional CYP2C9 and VKORC1 polymorphisms demonstrated a higher incidence rate of multiple adverse events (i.e., 5.2 vs. 1.0 cases per 100 patient-months; RR = 4.8, p = 0.12) than did wild types. A significant association was observed between multiple adverse events and carrier status (HR = 2.5; 95% CI: 1.0–6.3, p = 0.04). However, no significant associations between genotypes and individual outcomes over the first 90 days of therapy were found. Conclusion The association of CYP2C9 and VKORC1 genotypes and risks for adverse events due to exposure to warfarin was examined for the first time in Puerto Ricans. Despite a lack of association with individual events in this study population, our findings revealed a potential utility of genotyping for the prevention of multiple adverse events during warfarin therapy. PMID:25244877

  10. Narrow log-periodic modulations in non-Markovian random walks

    NASA Astrophysics Data System (ADS)

    Diniz, R. M. B.; Cressoni, J. C.; da Silva, M. A. A.; Mariz, A. M.; de Araújo, J. M.

    2017-12-01

    What are the necessary ingredients for log-periodicity to appear in the dynamics of a random walk model? Can they be subtle enough to be overlooked? Previous studies suggest that long-range damaged memory and negative feedback together are necessary conditions for the emergence of log-periodic oscillations. The role of negative feedback would then be crucial, forcing the system to change direction. In this paper we show that small-amplitude log-periodic oscillations can emerge when the system is driven by positive feedback. Due to their very small amplitude, these oscillations can easily be mistaken for numerical finite-size effects. The models we use consist of discrete-time random walks with strong memory correlations, where the decision process is taken from memory profiles based either on a binomial distribution or on a delta distribution. Anomalous superdiffusive behavior and log-periodic modulations are shown to arise in the large-time limit for convenient choices of the model parameters.
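
A minimal relative of such memory-correlated walks, the uniform-memory "elephant" walk with positive feedback, can be simulated in a few lines. This is illustrative of the model class only; the paper's binomial and delta memory profiles instead weight *which* past step is recalled:

```python
import random

def elephant_walk(n_steps, p=0.75, seed=1):
    """Uniform-memory 'elephant' walk: recall one past step uniformly at
    random and repeat it with probability p (positive feedback, p > 0.5),
    otherwise reverse it. Returns the walker's final position."""
    rng = random.Random(seed)
    steps = [1]                      # first step fixed to +1
    for _ in range(n_steps - 1):
        past = rng.choice(steps)     # draw from the full memory
        steps.append(past if rng.random() < p else -past)
    return sum(steps)
```

With p above 1/2 the feedback reinforces the walk's own history, the regime in which superdiffusion appears; detecting the small log-periodic modulations the paper reports requires averaging many such trajectories, not a single run.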

  11. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maeno, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), which is responsible for managing a workload on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat collects data from log files, Logstash processes the data and exports it to ElasticSearch, ES provides centralized data storage, and the accumulated data can be viewed and queried through Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks, the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
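    As a rough illustration of the Filebeat → Logstash → ES flow, the sketch below parses a hypothetical PanDA-style log line into the kind of JSON document a Logstash filter might hand to ElasticSearch for indexing (the line format, field names, and pattern are assumptions for illustration, not the production configuration):

```python
import json
import re
from datetime import datetime, timezone

# Hypothetical PanDA-style log line; in production the equivalent
# parsing lives in Logstash filter (e.g. grok) configuration.
LINE = "2018-05-01 12:00:03 INFO  panda.server: job 4021557893 finished status=done"

PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+"
    r"(?P<level>\w+)\s+(?P<logger>[\w.]+):\s+(?P<message>.*)"
)

def to_es_document(line):
    """Turn one raw log line into a JSON document ready for indexing.
    Field names ('@timestamp', 'level', ...) are illustrative."""
    m = PATTERN.match(line)
    if m is None:
        return None                      # unparseable lines would be tagged, not dropped
    doc = m.groupdict()
    doc["@timestamp"] = datetime.strptime(
        doc.pop("ts"), "%Y-%m-%d %H:%M:%S"
    ).replace(tzinfo=timezone.utc).isoformat()
    return json.dumps(doc)

print(to_es_document(LINE))
```

    Centralizing this structured form is what makes Kibana dashboards and ad-hoc queries over millions of daily job records practical.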

  12. Strangelove ocean at era boundaries, terrestrial or extraterrestrial cause

    NASA Astrophysics Data System (ADS)

    Hsue, Kenneth J.

    Negative perturbations in carbon-isotope value of calcite in pelagic sediments were found at times of biotic crisis, marking horizons which are, or were proposed as era boundaries: Cretaceous/Tertiary (K/T), Permian/Triassic (P/T), and Precambrian/Cambrian (PreC/C). The anomaly was also found at several other mass-extinction horizons, such as the terminal Ordovician, Frasnian-Famennian, etc. Studies of the K/T boundary indicate that only the planktic fraction of the sediments has the negative isotope anomaly, whereas the benthic fraction has the same value across the boundary. This geochemical signal is thus considered a record of a strangelove ocean, or an ocean where isotope fractionation of dissolved carbonate ions in surface waters (by biotic function of planktic organisms) has been significantly reduced because of the drastic reduction of the biomass in the oceans. The reduction of marine biomass at each of the era boundaries was related to chemical pollution of the oceans as a consequence of a catastrophic event; a pH decrease of 0.5 could inhibit the fertility of plankton. Studies of earthquakes, volcanic eruptions, and meteorite-impact occurrences have indicated a linearly inverse log/log relationship between the magnitude and frequency of events. The frequency of era boundaries in geologic history supports the postulate that the rare events causing those biotic crises were large bolide impacts.

  13. A comparison of moment-based methods of estimation for the log Pearson type 3 distribution

    NASA Astrophysics Data System (ADS)

    Koutrouvelis, I. A.; Canavos, G. C.

    2000-06-01

    The log Pearson type 3 distribution is a very important model in statistical hydrology, especially for modeling annual flood series. In this paper we compare the various methods based on moments for estimating quantiles of this distribution. Besides the methods of direct and mixed moments which were found most successful in previous studies and the well-known indirect method of moments, we develop generalized direct moments and generalized mixed moments methods and a new method of adaptive mixed moments. The last method chooses the orders of two moments for the original observations by utilizing information contained in the sample itself. The results of Monte Carlo experiments demonstrated the superiority of this method in estimating flood events of high return periods when a large sample is available and in estimating flood events of low return periods regardless of the sample size. In addition, a comparison of simulation and asymptotic results shows that the adaptive method may be used for the construction of meaningful confidence intervals for design events based on the asymptotic theory even with small samples. The simulation results also point to the specific members of the class of generalized moments estimates which maintain small values for bias and/or mean square error.
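    For orientation, the classical indirect method of moments mentioned above can be sketched as follows: fit the sample mean, standard deviation and skew of log10(Q), then apply the Wilson-Hilferty frequency factor to get a quantile. This is the textbook baseline, not the adaptive mixed-moments method the paper develops, and the annual-peak series below is hypothetical:

```python
from math import log10
from statistics import NormalDist, mean, stdev

def lp3_quantile(flows, return_period):
    """Log Pearson type 3 quantile via the indirect method of moments:
    moments of log10(Q) plus the Wilson-Hilferty frequency factor."""
    y = [log10(q) for q in flows]
    m, s, n = mean(y), stdev(y), len(y)
    # sample skew with the usual small-sample correction
    g = (n * sum((v - m) ** 3 for v in y)) / ((n - 1) * (n - 2) * s ** 3)
    z = NormalDist().inv_cdf(1 - 1 / return_period)   # standard normal quantile
    k = (2 / g) * (((1 + g * z / 6 - g * g / 36) ** 3) - 1) if g else z
    return 10 ** (m + k * s)

# Hypothetical annual-maximum flood series (m^3/s)
annual_peaks = [120, 95, 210, 180, 160, 300, 140, 110, 250, 175]
q100 = lp3_quantile(annual_peaks, 100)   # 100-year design flood estimate
```

    The paper's point is that with small samples this moment-based skew estimate is noisy, which motivates the generalized and adaptive moment choices it compares.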

  14. Strangelove ocean at era boundaries, terrestrial or extraterrestrial cause

    NASA Technical Reports Server (NTRS)

    Hsue, Kenneth J.

    1988-01-01

    Negative perturbations in carbon-isotope value of calcite in pelagic sediments were found at times of biotic crisis, marking horizons which are, or were proposed as era boundaries: Cretaceous/Tertiary (K/T), Permian/Triassic (P/T), and Precambrian/Cambrian (PreC/C). The anomaly was also found at several other mass-extinction horizons, such as the terminal Ordovician, Frasnian-Famennian, etc. Studies of the K/T boundary indicate that only the planktic fraction of the sediments has the negative isotope anomaly, whereas the benthic fraction has the same value across the boundary. This geochemical signal is thus considered a record of a strangelove ocean, or an ocean where isotope fractionation of dissolved carbonate ions in surface waters (by biotic function of planktic organisms) has been significantly reduced because of the drastic reduction of the biomass in the oceans. The reduction of marine biomass at each of the era boundaries was related to chemical pollution of the oceans as a consequence of a catastrophic event; a pH decrease of 0.5 could inhibit the fertility of plankton. Studies of earthquakes, volcanic eruptions, and meteorite-impact occurrences have indicated a linearly inverse log/log relationship between the magnitude and frequency of events. The frequency of era boundaries in geologic history supports the postulate that the rare events causing those biotic crises were large bolide impacts.

  15. Identification of coal seam strata from geophysical logs of borehole using Adaptive Neuro-Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Yegireddi, Satyanarayana; Uday Bhaskar, G.

    2009-01-01

    Different parameters obtained through well-logging geophysical sensors such as SP, resistivity, gamma-gamma, neutron, natural gamma and acoustic help in the identification of strata and the estimation of the physical, electrical and acoustical properties of the subsurface lithology. Strong and conspicuous changes in some of the log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties, and help in classification. However, some substrata show moderate values in the respective log parameters, making it difficult to identify or assess the type of strata if we go by the standard variability ranges of the log parameters and visual inspection. The complexity increases further with a greater number of sensors involved. An attempt is made to identify the type of stratigraphy from borehole geophysical log data using a combined approach of neural networks and fuzzy logic, known as the Adaptive Neuro-Fuzzy Inference System. A model is built based on a few data sets (geophysical logs) of known stratigraphy in the coal areas of Kothagudem, Godavari basin, and the network model is then used as a test model to infer the lithology of a borehole from its geophysical logs, not used in simulation. The results are very encouraging and the model is able to decipher even thin coal seams and other strata from borehole geophysical logs. The model can be further modified to assess the physical properties of the strata, if the corresponding ground truth is made available for simulation.
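    A minimal sketch of the fuzzy-inference idea behind such a classifier is shown below: Gaussian memberships over two log channels combined with a product T-norm. In a real ANFIS the memberships and rules are learned from the training logs; the hand-set prototypes here (gamma in API units, density in g/cc) are purely illustrative assumptions:

```python
from math import exp

# Hypothetical class prototypes (mean, spread) for two log channels.
PROTOTYPES = {
    "coal":      {"gamma": (30.0, 15.0),  "density": (1.4, 0.2)},
    "shale":     {"gamma": (120.0, 30.0), "density": (2.4, 0.2)},
    "sandstone": {"gamma": (55.0, 20.0),  "density": (2.3, 0.2)},
}

def membership(x, mu, sigma):
    """Gaussian membership function, the usual ANFIS premise shape."""
    return exp(-((x - mu) / sigma) ** 2)

def classify(gamma, density):
    """Fire each rule as the product of its memberships (product T-norm)
    and return the strongest rule's lithology label; a hand-built
    stand-in for a trained neuro-fuzzy model."""
    scores = {
        rock: membership(gamma, *p["gamma"]) * membership(density, *p["density"])
        for rock, p in PROTOTYPES.items()
    }
    return max(scores, key=scores.get)

print(classify(28.0, 1.35))   # low gamma + low density reading
```

    ANFIS goes further by tuning these membership parameters with gradient descent and least squares on labeled logs, which is what lets it separate the "moderate value" cases that defeat fixed variability ranges.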

  16. Quality of the log-geometric distribution extrapolation for smaller undiscovered oil and gas pool size

    USGS Publications Warehouse

    Chenglin, L.; Charpentier, R.R.

    2010-01-01

    The U.S. Geological Survey procedure for the estimation of the general form of the parent distribution requires that the parameters of the log-geometric distribution be calculated and analyzed for the sensitivity of these parameters to different conditions. In this study, we derive the shape factor of a log-geometric distribution from the ratio of frequencies between adjacent bins. The shape factor has a log straight-line relationship with the ratio of frequencies. Additionally, the calculation equations of a ratio of the mean size to the lower size-class boundary are deduced. For a specific log-geometric distribution, we find that the ratio of the mean size to the lower size-class boundary is the same. We apply our analysis to simulations based on oil and gas pool distributions from four petroleum systems of Alberta, Canada and four generated distributions. Each petroleum system in Alberta has a different shape factor. Generally, the shape factors in the four petroleum systems stabilize with the increase of discovered pool numbers. For a log-geometric distribution, the shape factor becomes stable when discovered pool numbers exceed 50 and the shape factor is influenced by the exploration efficiency when the exploration efficiency is less than 1. The simulation results show that calculated shape factors increase with those of the parent distributions, and undiscovered oil and gas resources estimated through the log-geometric distribution extrapolation are smaller than the actual values. © 2010 International Association for Mathematical Geology.
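    One way to picture the adjacent-bin idea: for a log-geometric parent distribution the frequency of successive size classes falls by a constant ratio, so that ratio (and hence a shape factor) can be estimated from neighboring bins. The sketch below is our reading of that relationship with hypothetical pool counts; it is not the USGS parameterization itself:

```python
from math import log

def shape_factor(frequencies):
    """Assume a log-geometric form f_k = f_0 * r**k over size classes
    ordered from smallest to largest.  Estimate the constant ratio r
    from adjacent-bin frequency ratios and report -ln(r) as a shape
    factor (an illustrative definition, not the paper's)."""
    ratios = [b / a for a, b in zip(frequencies, frequencies[1:])]
    r = sum(ratios) / len(ratios)
    return -log(r)

# Hypothetical discovered-pool counts per doubling size class
counts = [64, 32, 16, 8, 4]
sf = shape_factor(counts)    # every ratio is 0.5, so sf = -ln(0.5)
```

    With real discovery data the estimated ratios are noisy, which is consistent with the paper's finding that the shape factor only stabilizes once discovered pool numbers exceed about 50.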

  17. Decontamination of materials contaminated with Francisella philomiragia or MS2 bacteriophage using PES-Solid, a solid source of peracetic acid.

    PubMed

    Buhr, T L; Young, A A; Johnson, C A; Minter, Z A; Wells, C M

    2014-08-01

    The aim of the study was to develop test methods and evaluate survival of Francisella philomiragia cells and MS2 bacteriophage after exposure to PES-Solid (a solid source of peracetic acid) formulations with or without surfactants. Francisella philomiragia cells (≥7.6 log10 CFU) or MS2 bacteriophage (≥6.8 log10 PFU) were deposited on seven different test materials and treated with three different PES-Solid formulations, three different preneutralized samples and filter controls at room temperature for 15 min. There were 0-1.3 log10 CFU (<20 cells) of cell survival, or 0-1.7 log10 (<51 PFU) of bacteriophage survival in all 21 test combinations (organism, formulation and substrate) containing reactive PES-Solid. In addition, the microemulsion (Dahlgren Surfactant System) showed ≤2 log10 (100 cells) of viable F. philomiragia cells, indicating the microemulsion achieved <2 log10 CFU on its own. Three PES-Solid formulations and one microemulsion system (DSS) inactivated F. philomiragia cells and/or MS2 bacteriophage that were deposited on seven different materials. A test method was developed to show that reactive PES-Solid formulations and a microemulsion system (DSS) inactivated >6 log10 CFU/PFU F. philomiragia cells and/or MS2 bacteriophage on different materials. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
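    The log10 reductions quoted in such studies follow from simple arithmetic on colony or plaque counts, e.g.:

```python
from math import log10

def log_reduction(initial_cfu, surviving_cfu):
    """Log10 reduction = log10(initial) - log10(surviving).
    The counts below mirror the scales in the abstract and are
    illustrative, not measured data."""
    return log10(initial_cfu) - log10(surviving_cfu)

# e.g. a challenge of 10**7.6 CFU reduced to 10**1.3 CFU surviving
lr = log_reduction(10 ** 7.6, 10 ** 1.3)   # a 6.3-log reduction
```

    So a ≥7.6 log10 challenge with ≤1.3 log10 survival corresponds to the >6-log inactivation claimed for the reactive formulations.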

  18. Endogenous System Microbes as Treatment Process ...

    EPA Pesticide Factsheets

    Monitoring the efficacy of treatment strategies to remove pathogens in decentralized systems remains a challenge. Evaluating log reduction targets by measuring pathogen levels is hampered by their sporadic and low occurrence rates. Fecal indicator bacteria are used in centralized systems to indicate the presence of fecal pathogens, but are ineffective decentralized treatment process indicators as they generally occur at levels too low to assess log reduction targets. System challenge testing by spiking with high loads of fecal indicator organisms, like MS2 coliphage, has limitations, especially for large systems. Microbes that are endogenous to the decentralized system, occur in high abundances and mimic removal rates of bacterial, viral and/or parasitic protozoan pathogens during treatment could serve as alternative treatment process indicators to verify log reduction targets. To identify abundant microbes in wastewater, the bacterial and viral communities were examined using deep sequencing. Building infrastructure-associated bacteria, like Zoogloea, were observed as dominant members of the bacterial community in graywater. In blackwater, bacteriophage of the order Caudovirales constituted the majority of contiguous sequences from the viral community. This study identifies candidate treatment process indicators in decentralized systems that could be used to verify log removal during treatment. The association of the presence of treatment process indic

  19. Intrathecal Pump Exposure to Electromagnetic Interference: A Report of Device Interrogation following Multiple ECT Sessions.

    PubMed

    Bicket, Mark C; Hanna, George M

    2016-02-01

    Intrathecal drug delivery systems represent an increasingly common treatment modality for patients with a variety of conditions, including chronic pain and spasticity. Pumps rely on electronic programming to properly control and administer highly concentrated medications. Electromagnetic interference (EMI) is a known exposure that may cause a potential patient safety issue stemming from direct patient injury, pump damage, or changes to pump operation or flow rate. The objective of our case report was to describe an approach to evaluating a patient with a pump prior to and following exposure to EMI from electroconvulsive therapy (ECT), as well as to document findings from device interrogations associated with this event. Study design: case report. Setting: academic university-based pain management center. We present the case of a patient with an intrathecal pump who underwent multiple exposures to EMI in the form of 42 ECT sessions. Interrogation of the intrathecal drug delivery system revealed no safety issues following ECT sessions. At no time were error messages, unintentional changes in event logs, unintentional changes in pump settings, or evidence of pump stall or over-infusion noted. Communication with multiple entities (patient, family, consulting physicians, and device manufacturer) and maintaining vigilance through device interrogation both before and after EMI exposure are appropriate safeguards to mitigate the risk and detect potential adverse events of EMI with intrathecal drug delivery systems. Given the infrequent reports of device exposure to ECT, best practices may be derived from experience with EMI exposure from magnetic resonance imaging (MRI). Although routine EMI exposure to intrathecal drug delivery systems should be avoided, we describe one patient with repeated exposure to ECT without apparent complication.

  20. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  1. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  2. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  3. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  4. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  5. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  6. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  7. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  8. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  9. A simplified hardwood log-sawing program for three-dimensional profile data

    Treesearch

    R. Edward Thomas

    2011-01-01

    Current laser scanning systems in sawmills collect low-resolution three-dimensional (3D) profiles of logs. However, these scanners are capable of much more. As a demonstration, the U.S. Forest Service, Forestry Sciences Laboratory in Princeton, WV, constructed a 3D laser log scanner using off-the-shelf industrial scanning components.

  10. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  11. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  12. 40 CFR 91.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the...

  13. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  14. 40 CFR 89.409 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records the...

  15. 40 CFR 90.412 - Data logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection...

  16. The Tunisian Jurassic aquifer in the North African Sahara aquifer system: information derived from two-dimensional seismic reflection and well logs

    NASA Astrophysics Data System (ADS)

    Ben Lasmar, Rafika; Guellala, Rihab; Garrach, Mohamed; Mahroug, Ali; Sarsar Naouali, Benen; Inoubli, Mohamed Hédi

    2017-12-01

    Southern Tunisia is an arid area where socio-economic activities depend on groundwater resources. The present study aims to better characterize the Jurassic aquifer based on geological and geophysical data, with a view to developing a rational exploitation program. Well logs are used to precisely determine the position and composition of the known Jurassic aquifer layers and to identify others able to produce good-quality water. The logs show that limestones, sandstones and dolomites of the Krachoua, Techout and Foum Tataouine formations are the main Jurassic aquifers. Sixty-eight seismic-reflection sections are integrated within this study. Interpolation between the interpreted sections leads to the construction of isochron and isopach maps and geoseismic sections, and their analysis shows that compressive and extensive tectonic deformations have influenced the Jurassic aquifer geometry. The Hercynian orogenic phase is evident in the several stratigraphic gaps in the Jurassic sequence. The E-W, NW-SE, and NNW-SSE accidents, reactivated as normal faults from the Permian to the Lower Cretaceous, generated the structures found in the Jurassic series, such as subsided and raised blocks. Their syn-sedimentary activity controlled the thickness and facies of these series. The Cretaceous, Tortonian and Post-Villafranchian compressions are responsible for the folding of the Jurassic deposits in some localities. The highlighted tectonic and sedimentary events have an important impact on how the Jurassic aquifer functions, by favoring interconnections within the Jurassic aquifer and its connections with the permeable Triassic and Cretaceous series.

  17. Stochastic theory of log-periodic patterns

    NASA Astrophysics Data System (ADS)

    Canessa, Enrique

    2000-12-01

    We introduce an analytical model based on birth-death clustering processes to help in understanding the empirical log-periodic corrections to power law scaling and the finite-time singularity as reported in several domains including rupture, earthquakes, world population and financial systems. In our stochastic theory log-periodicities are a consequence of transient clusters induced by an entropy-like term that may reflect the amount of co-operative information carried by the state of a large system of different species. The clustering completion rates for the system are assumed to be given by a simple linear death process. The singularity at t0 is derived in terms of birth-death clustering coefficients.

  18. Quaternion Based Thermal Condition Monitoring System

    NASA Astrophysics Data System (ADS)

    Wong, Wai Kit; Loo, Chu Kiong; Lim, Way Soong; Tan, Poi Ngee

    In this paper, we propose a new and effective machine condition monitoring system using a log-polar mapper, a quaternion-based thermal image correlator and a max-product fuzzy neural network classifier. Two classification characteristics, namely the peak-to-sidelobe ratio (PSR) and the real-to-complex ratio of the discrete quaternion correlation output (p-value), are applied in the proposed machine condition monitoring system. A large PSR and p-value are observed when the input thermal image correlates well with a particular reference image, while a small PSR and p-value indicate a poor match. In simulation, we also find that log-polar mapping helps solve rotation- and scale-invariance problems in quaternion-based thermal image correlation. Besides that, log-polar mapping offers roughly two-fold data compression, and it smooths the output correlation plane, giving a more reliable measurement of PSR and p-values. Simulation results show that the proposed system is an efficient machine condition monitoring system with an accuracy of more than 98%.
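    A common way to compute a PSR from a correlation plane is sketched below: locate the peak, mask a small window around it, and normalize the peak against the mean and standard deviation of the remaining sidelobe (the window size is a conventional choice, not taken from the paper):

```python
from statistics import mean, pstdev

def peak_to_sidelobe_ratio(plane, exclude=1):
    """PSR = (peak - mean(sidelobe)) / std(sidelobe), where the
    sidelobe is every sample outside a (2*exclude+1)-wide window
    centered on the correlation peak."""
    h, w = len(plane), len(plane[0])
    peak = max(max(row) for row in plane)
    pi, pj = next((i, j) for i, row in enumerate(plane)
                  for j, v in enumerate(row) if v == peak)
    side = [plane[i][j] for i in range(h) for j in range(w)
            if abs(i - pi) > exclude or abs(j - pj) > exclude]
    return (peak - mean(side)) / pstdev(side)

# Toy 5x5 correlation plane: alternating low values with a sharp peak.
plane = [[((i * 5 + j) % 2) * 0.2 for j in range(5)] for i in range(5)]
plane[2][2] = 1.0
psr = peak_to_sidelobe_ratio(plane)
```

    A sharp, well-matched correlation peak yields a large PSR; a flat or noisy plane (a poor match) drives it toward zero, which is why it works as a classification feature.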

  19. Geophysical log database for the Floridan aquifer system and southeastern Coastal Plain aquifer system in Florida and parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.

    2013-04-04

    A database of borehole geophysical logs and other data files was compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.

  20. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system.

    PubMed

    Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H

    2011-02-01

    Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source for verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the Accreditation Council for Graduate Medical Education (ACGME). We conducted a systematic review of ACGME requirements and our AIMS record, and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free-text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories, generated AIMS-based case logs, and compared these with assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. 
The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
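    The free-text "parsing" approach can be pictured as simple keyword matching against category tables, as in this sketch (the keyword table and category names are hypothetical stand-ins for the paper's parser and the real ACGME category list):

```python
# Hypothetical keyword table; the paper's parser and the actual ACGME
# case categories are far more extensive.
CATEGORY_KEYWORDS = {
    "cardiac": ("cabg", "valve replacement", "bypass"),
    "obstetric": ("cesarean", "c-section", "labor epidural"),
    "intrathoracic": ("lobectomy", "thoracotomy"),
}

def categorize(procedure_description):
    """Assign an ACGME-style case category by substring matching on the
    free-text procedure description (the 'parsing' method; the paper's
    alternative maps CPT4 procedure codes to categories instead)."""
    text = procedure_description.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return category
    return "uncategorized"

print(categorize("Emergency CABG x3"))
```

    Code mapping scored slightly higher (97% vs. 95%) in the study, presumably because CPT4 codes avoid the spelling and phrasing variation that keyword matching must absorb.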

  1. Visual Indicators on Vaccine Boxes as Early Warning Tools to Identify Potential Freeze Damage.

    PubMed

    Angoff, Ronald; Wood, Jillian; Chernock, Maria C; Tipping, Diane

    2015-07-01

    The aim of this study was to determine whether the use of visual freeze indicators on vaccines would assist health care providers in identifying vaccines that may have been exposed to potentially damaging temperatures. Twenty-seven sites in Connecticut involved in the Vaccine for Children Program participated. In addition to standard procedures, visual freeze indicators (FREEZEmarker ® L; Temptime Corporation, Morris Plains, NJ) were affixed to each box of vaccine that required refrigeration but must not be frozen. Temperatures were monitored twice daily. During the 24 weeks, all 27 sites experienced triggered visual freeze indicator events in 40 of the 45 refrigerators. A total of 66 triggered freeze indicator events occurred in all 4 types of refrigerators used. Only 1 of the freeze events was identified by a temperature-monitoring device. Temperatures recorded on vaccine data logs before freeze indicator events were within the 35°F to 46°F (2°C to 8°C) range in all but 1 instance. A total of 46,954 doses of freeze-sensitive vaccine were stored at the time of a visual freeze indicator event. Triggered visual freeze indicators were found on boxes containing 6566 doses (14.0% of total doses). Of all doses stored, 14,323 doses (30.5%) were of highly freeze-sensitive vaccine; 1789 of these doses (12.5%) had triggered indicators on the boxes. Visual freeze indicators are useful in the early identification of freeze events involving vaccines. Consideration should be given to including these devices as a component of the temperature-monitoring system for vaccines.

  2. Valve Health Monitoring System Utilizing Smart Instrumentation

    NASA Technical Reports Server (NTRS)

    Jensen, Scott L.; Drouant, George J.

    2006-01-01

    The valve monitoring system is a stand-alone unit with network capabilities for integration into a higher-level health management system. The system is designed to aid in failure prediction for high-geared ball valves and linearly actuated valves. It performs data tracking and archiving for identifying degraded performance. The data collected include cryogenic cycles, total cycles, inlet temperature, body temperature, torsional strain, linear bonnet strain, preload position, total travel and total directional changes. Events are recorded and time stamped in accordance with IRIG-B time. The monitoring system is designed for use in a Class 1 Division II explosive environment. The basic configuration consists of several instrumentation sensor units and a base station. The sensor units are self-contained, microprocessor controlled, and remotely mountable within a three by three by two inch envelope. Each unit is potted in a fire-retardant substance without any cavities and limited to low operating power to maintain safe operation in a hydrogen environment. The units are temperature monitored to safeguard against operation outside temperature limitations. Each contains a 902-928 MHz band digital transmitter that meets Federal Communications Commission requirements and is limited to a 35-foot transmission radius to preserve data security. The base-station controller correlates data from the sensor units and generates data event logs on a compact flash memory module for database uploading. The entries are also broadcast over an Ethernet network. Nitrogen-purged National Electrical Manufacturers Association (NEMA) Class 4 enclosures are used to house the base station.

  3. CRISPR-based screening of genomic island excision events in bacteria.

    PubMed

    Selle, Kurt; Klaenhammer, Todd R; Barrangou, Rodolphe

    2015-06-30

    Genomic analysis of Streptococcus thermophilus revealed that mobile genetic elements (MGEs) likely contributed to gene acquisition and loss during evolutionary adaptation to milk. Clustered regularly interspaced short palindromic repeats-CRISPR-associated genes (CRISPR-Cas), the adaptive immune system in bacteria, limits genetic diversity by targeting MGEs including bacteriophages, transposons, and plasmids. CRISPR-Cas systems are widespread in streptococci, suggesting that the interplay between CRISPR-Cas systems and MGEs is one of the driving forces governing genome homeostasis in this genus. To investigate the genetic outcomes resulting from CRISPR-Cas targeting of integrated MGEs, in silico prediction revealed four genomic islands without essential genes, ranging in length from 8 to 102 kbp and totaling 7% of the genome. In this study, the endogenous CRISPR3 type II system was programmed to target the four islands independently through plasmid-based expression of engineered CRISPR arrays. Targeting lacZ within the largest 102-kbp genomic island was lethal to wild-type cells and resulted in a reduction of up to 2.5-log in the surviving population. Genotyping of Lac(-) survivors revealed variable deletion events between the flanking insertion-sequence elements, all resulting in elimination of the Lac-encoding island. Chimeric insertion sequence footprints were observed at the deletion junctions after targeting all of the four genomic islands, suggesting a common mechanism of deletion via recombination between flanking insertion sequences. These results established that self-targeting CRISPR-Cas systems may direct significant evolution of bacterial genomes on a population level, influencing genome homeostasis and remodeling.

  4. Valve health monitoring system utilizing smart instrumentation

    NASA Astrophysics Data System (ADS)

    Jensen, Scott L.; Drouant, George J.

    2006-05-01

    The valve monitoring system is a stand-alone unit with network capabilities for integration into a higher-level health management system. The system is designed to aid in failure predictions of high-geared ball valves and linearly actuated valves. It performs data tracking and archiving for identifying degraded performance. The data collection types are: cryogenic cycles, total cycles, inlet temperature, outlet temperature, body temperature, torsional strain, linear bonnet strain, preload position, total travel, and total directional changes. Events are recorded and time stamped in accordance with IRIG B True Time. The monitoring system is designed for use in a Class 1 Division II explosive environment. The basic configuration consists of several instrumentation sensor units and a base station. The sensor units are self-contained, microprocessor controlled, and remotely mountable within a three- by three- by two-inch envelope. Each unit is potted in a fire-retardant substance without any cavities and limited to low operating power for maintaining safe operation in a hydrogen environment. The units are temperature monitored to safeguard against operation outside temperature limitations. Each contains a 902-928 MHz band digital transmitter that meets Federal Communications Commission requirements and is limited to a 35-foot transmission radius for preserving data security. The base-station controller correlates related data from the sensor units and generates data event logs on a compact flash memory module for database uploading. The entries are also broadcast over an Ethernet network. Nitrogen-purged National Electrical Manufacturers Association (NEMA) Class 4 enclosures are used to house the base station.

  5. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements by suggesting query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One of the goals of this report is to predict the number of results a query will have, since such a model allows search engines to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88% and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and data on reformulations done by users in the past can aid the development of better search systems, particularly to improve results for novice users. This paper therefore offers important insights into how people search and how that knowledge can be used to improve the performance of specialized medical search engines.
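    The article does not specify its prediction model; as a minimal illustration of classifying a query as likely-empty from query-term characteristics alone, a logistic-regression sketch over two hypothetical features (term count and the fraction of terms found in a toy vocabulary) might look like this. The vocabulary, features, and training log are all invented for the example.

```python
import math

# Hypothetical vocabulary standing in for the terms actually indexed by the
# search engine; the real feature set used in the article is not given here.
VOCAB = {"chest", "xray", "fracture", "mri", "brain", "lung", "ct"}

def features(query):
    """Query-term characteristics: term count, in-vocabulary fraction, bias."""
    terms = query.lower().split()
    in_vocab = sum(t in VOCAB for t in terms)
    return [float(len(terms)), in_vocab / max(len(terms), 1), 1.0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, lr=0.5, epochs=400):
    """Logistic regression by plain gradient descent; label 1 = empty results."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for query, label in examples:
            x = features(query)
            err = sigmoid(sum(wi * xi for wi, xi in zip(w, x))) - label
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

def predicts_empty(query, w):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, features(query)))) > 0.5

# Toy training log: queries dominated by out-of-vocabulary terms returned
# no results; in-vocabulary queries returned hits.
log = [("chest xray", 0), ("brain mri", 0), ("lung ct fracture", 0),
       ("zzqx plmk", 1), ("qwerty asdf gh", 1), ("xx yy zz", 1)]
w = train(log)
```

    A search engine could run such a classifier before executing the query and, when an empty result set is predicted, propose a relaxed or spelling-corrected reformulation instead.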

  6. Proliferation of Escherichia coli O157:H7 in Soil-Substitute and Hydroponic Microgreen Production Systems.

    PubMed

    Xiao, Zhenlei; Bauchan, Gary; Nichols-Russell, Lydia; Luo, Yaguang; Wang, Qin; Nou, Xiangwu

    2015-10-01

    Radish (Raphanus sativus var. longipinnatus) microgreens were produced from seeds inoculated with Escherichia coli O157:H7 by using peat moss-based soil-substitute and hydroponic production systems. E. coli populations on the edible and inedible parts of harvested microgreen plants (7 days postseeding) and in growth medium were examined. E. coli O157:H7 was shown to survive and proliferate significantly during microgreen growth in both production systems, with a higher level in the hydroponic production system. At the initial seed inoculation level of 3.7 log CFU/g, E. coli O157:H7 populations on the edible part of microgreen plants reached 2.3 and 2.1 log CFU/g (overhead irrigation and bottom irrigation, respectively) for microgreens from the soil-substitute production system and reached 5.7 log CFU/g for those hydroponically grown. At a higher initial inoculation of 5.6 log CFU/g seeds, the corresponding E. coli O157:H7 populations on the edible parts of microgreens grown in these production systems were 3.4, 3.6, and 5.3 log CFU/g, respectively. Examination of the spatial distribution of bacterial cells on different parts of microgreen plants showed that contaminated seeds led to systematic contamination of whole plants, including both edible and inedible parts, and seed coats remained the focal point of E. coli O157:H7 survival and growth throughout the period of microgreen production.

  7. Reference manual for data base on Nevada well logs

    USGS Publications Warehouse

    Bauer, E.M.; Cartier, K.D.

    1995-01-01

    The U.S. Geological Survey and Nevada Division of Water Resources are cooperatively using a data base for managing well-log information for the State of Nevada. The Well-Log Data Base is part of an integrated system of computer data bases using the Ingres Relational Data-Base Management System, which allows efficient storage of and access to water information from the State Engineer's office. The data base contains a main table, two ancillary tables, and nine lookup tables, as well as a menu-driven system for entering, updating, and reporting on the data. This reference guide outlines the general functions of the system and provides a brief description of the data tables and data-entry screens.

  8. Analysis of the observed and intrinsic durations of Swift/BAT gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Tarnopolski, Mariusz

    2016-07-01

    The duration distribution of 947 GRBs observed by Swift/BAT is examined, along with a subsample of 347 events with measured redshift that allows the durations to be studied in both the observer and rest frames. Using a maximum log-likelihood method, mixtures of two and three standard Gaussians are fitted to each sample, and the more adequate model is chosen based on the difference in log-likelihoods, the Akaike information criterion, and the Bayesian information criterion. It is found that a two-Gaussian mixture describes the data better than a three-Gaussian mixture, and that the presumed intermediate-duration class is unlikely to be present in the Swift duration data.
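    The model-selection procedure described (fitting Gaussian mixtures to durations and comparing information criteria) can be sketched as follows. The data below are a synthetic bimodal stand-in, not the Swift/BAT sample, and the EM routine is a minimal illustration rather than the author's fitting code.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_mixture(data, k, iters=100):
    """Minimal EM fit of a k-component 1-D Gaussian mixture; returns (AIC, BIC)."""
    data = sorted(data)
    n = len(data)
    mean = sum(data) / n
    sigma0 = math.sqrt(sum((x - mean) ** 2 for x in data) / n)
    # Deterministic init: means at sample quantiles, common spread, equal weights.
    mus = [data[int((i + 0.5) * n / k)] for i in range(k)]
    sigmas = [sigma0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w * normal_pdf(x, m, s) for w, m, s in zip(weights, mus, sigmas)]
            tot = sum(p)
            resp.append([pi / tot for pi in p])
        # M-step: re-estimate weights, means, and spreads.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / n
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sigmas[j] = max(math.sqrt(var), 1e-3)  # guard against collapse
    ll = sum(math.log(sum(w * normal_pdf(x, m, s)
                          for w, m, s in zip(weights, mus, sigmas)))
             for x in data)
    n_params = 3 * k - 1  # k means, k sigmas, k - 1 free weights
    return 2 * n_params - 2 * ll, n_params * math.log(n) - 2 * ll

random.seed(0)
# Synthetic stand-in for log10 durations: two well-separated groups.
sample = ([random.gauss(-0.5, 0.5) for _ in range(400)]
          + [random.gauss(1.7, 0.5) for _ in range(400)])
aic1, bic1 = fit_mixture(sample, 1)
aic2, bic2 = fit_mixture(sample, 2)
```

    On genuinely bimodal data the two-component fit wins both criteria by a wide margin; the study applies the same comparison between two- and three-component models.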

  9. Optical and Event-Duration Variables Affecting Self-Motion Perception.

    DTIC Science & Technology

    1985-11-01

    [Scanned-report residue: an analysis-of-variance effects table and Figure 19, percent correct responses plotted against log2 preview period (0-40 s) for accelerating and decelerating self-motion events, are not recoverable as text.]

  10. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.
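    As an illustration of the kinds of corrections described, a depth-displacement shift and a unit transformation might be sketched as below. The function names, the offset convention, and the linear calibration are hypothetical, not the Survey's actual algorithms.

```python
def depth_align(samples, sensor_offset_m, sample_interval_m):
    """Shift a digitized log so each reading is indexed by true depth rather
    than cable depth. sensor_offset_m is the sensor's position above the
    tool reference point (a hypothetical convention for this sketch)."""
    shift = round(sensor_offset_m / sample_interval_m)
    if shift >= 0:
        return samples[shift:]
    return [None] * (-shift) + samples  # pad where no reading exists

def to_porosity(counts, a, b):
    """Hypothetical linear calibration from neutron counts to porosity (%)."""
    return [a + b * c for c in counts]

aligned = depth_align([1, 2, 3, 4], sensor_offset_m=0.5, sample_interval_m=0.25)
porosity = to_porosity([100, 200], a=2.0, b=0.5)
```

    In a real system the calibration coefficients would come from the operator-adjustable program variables the abstract mentions.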

  11. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  12. The Weak Link HP-41C hand-held calculator program

    Treesearch

    Ross A. Phillips; Penn A. Peters; Gary D. Falk

    1982-01-01

    The Weak Link hand-held calculator program (HP-41C) quickly analyzes a system for logging production and costs. The production equations model conventional chain saw, skidder, loader, and tandemaxle truck operations in eastern mountain areas. Production of each function of the logging system may be determined so that the system may be balanced for minimum cost. The...

  13. Performance of a logging truck with a central tire inflation system.

    Treesearch

    John A. Sturos; Douglas B. Brumm; Andrew Lehto

    1995-01-01

    Describes the performance of an 11-axle logging truck with a central tire inflation system. Results included reduced damage to roads, improved ride of the truck, improved drawbar pull, and reduced rolling resistance. Road construction costs were reduced 62%, primarily due to using 33% less gravel.

  14. Cost of wetland protection using cable logging systems

    Treesearch

    Chris B. LeDoux; John E. Baumgras

    1990-01-01

    Forest managers, loggers, land-use planners, and other decision makers need an understanding of estimating the cost of protecting wetlands using cable logging systems to harvest timber products. Results suggest that protection costs can range from $244.75 to $489.50 per acre depending on the degree of protection desired.

  15. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.

  16. Treatment of Suicide Attempters With Bipolar Disorder: A Randomized Clinical Trial Comparing Lithium and Valproate in the Prevention of Suicidal Behavior

    PubMed Central

    Oquendo, Maria A.; Galfalvy, Hanga C.; Currier, Dianne; Grunebaum, Michael F.; Sher, Leo; Sullivan, Gregory M.; Burke, Ainsley K.; Harkavy-Friedman, Jill; Sublette, M. Elizabeth; Parsey, Ramin V.; Mann, J. John

    2013-01-01

    Objective Bipolar disorder is associated with high risk for suicidal acts. Observational studies suggest a protective effect of lithium against suicidal behavior. However, testing this effect in randomized clinical trials is logistically and ethically challenging. The authors tested the hypothesis that lithium offers bipolar patients with a history of suicide attempt greater protection against suicidal behavior compared to valproate. Method Patients with bipolar disorder and past suicide attempts (N=98) were randomly assigned to treatment with lithium or valproate, plus adjunctive medications as indicated, in a double-blind 2.5-year trial. An intent-to-treat analysis was performed using the log-rank test for survival data. Two models were fitted: time to suicide attempt and time to suicide event (attempt or hospitalization or change in medication in response to suicide plans). Results There were 45 suicide events in 35 participants, including 18 suicide attempts made by 14 participants, six from the lithium group and eight from the valproate group. There were no suicides. Intent-to-treat analysis using the log-rank test showed no differences between treatment groups in time to suicide attempt or to suicide event. Post hoc power calculations revealed that the modest sample size, reflective of challenges in recruitment, only permits detection of a relative risk of 5 or greater. Conclusions Despite the high frequency of suicide events during the study, this randomized controlled trial detected no difference between lithium and valproate in time to suicide attempt or suicide event in a sample of suicide attempters with bipolar disorder. However, smaller clinically significant differences between the two drugs were not ruled out. PMID:21768611

  17. Critical care procedure logging using handheld computers

    PubMed Central

    Carlos Martinez-Motta, J; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-01-01

    Introduction We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Methods Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. Results All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13–242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. Conclusion A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation. PMID:15469577

  18. Best opening face system for sweepy, eccentric logs : a user’s guide

    Treesearch

    David W. Lewis

    1985-01-01

    Log breakdown simulation models have gained rapid acceptance within the sawmill industry in the last 15 years. Although they have many advantages over traditional decision making tools, the existing models do not calculate yield correctly when used to simulate the breakdown of eccentric, sweepy logs in North American sawmills producing softwood dimension lumber. In an...

  19. Using the Logarithmic Concentration Diagram, Log "C", to Teach Acid-Base Equilibrium

    ERIC Educational Resources Information Center

    Kovac, Jeffrey

    2012-01-01

    Acid-base equilibrium is one of the most important and most challenging topics in a typical general chemistry course. This article introduces an alternative to the algebraic approach generally used in textbooks, the graphical log "C" method. Log "C" diagrams provide conceptual insight into the behavior of aqueous acid-base systems and allow…

  20. 47 CFR 76.1700 - Records to be maintained by cable system operators.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... programming); § 76.1704 (proof-of-performance test data); and § 76.1706 (signal leakage logs and repair...

  1. Assessing the feasibility and profitability of cable logging in southern upland hardwood forests

    Treesearch

    Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann

    1995-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...

  2. Rule-driven defect detection in CT images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt

    2000-01-01

    This paper deals with automated detection and identification of internal defects in hardwood logs using computed tomography (CT) images. We have developed a system that employs artificial neural networks to perform tentative classification of logs on a pixel-by-pixel basis. This approach achieves a high level of classification accuracy for several hardwood species (...

  3. Optimal message log reclamation for independent checkpointing

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Fuchs, W. Kent

    1993-01-01

    Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.
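    The notion of a recovery line can be illustrated with a toy model. This sketch encodes only the conservative baseline rule the paper starts from (a message log is discardable once the receiver's recovery-line checkpoint is past the interval in which the message arrived), not the recovery line transformation and decomposition that identify further discardable logs; the data model is invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LoggedMessage:
    receiver: int        # process that received (and logged) the message
    recv_interval: int   # checkpoint interval in which it was received

def discardable(logs, recovery_line):
    """Conservative rule: a log is garbage once the receiver's recovery-line
    checkpoint index is beyond the interval in which the message arrived."""
    return [m for m in logs if m.recv_interval < recovery_line[m.receiver]]

# Three processes; recovery_line[p] = checkpoint index of process p on the line.
recovery_line = [2, 1, 3]
logs = [LoggedMessage(0, 0), LoggedMessage(0, 2),
        LoggedMessage(1, 0), LoggedMessage(2, 2), LoggedMessage(2, 3)]
garbage = discardable(logs, recovery_line)
```

    The paper's contribution is showing that, by transforming the recovery line, strictly more logs than this baseline identifies can be reclaimed.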

  4. Ips Bark Beetles in the South

    Treesearch

    Michael D. Conner; Robert C. Wilkinson

    1983-01-01

    Ips beetles usually attack weakened, dying, or recently felled trees and fresh logging debris. Large numbers Ips may build up when natural events such as lightning storms, ice storms, tornadoes, wildfires, and droughts create large amounts of pine suitable for the breeding of these beetles. Ips populations may also build up following forestry activities, such as...

  5. 47 CFR 10.350 - CMAS testing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... maintenance windows. (3) A Participating CMS Provider may forego an RMT if the RMT is pre-empted by actual... Gateway Administrator using a defined test message. Real event codes or alert messages shall not be used... automated log of RMT messages received by the CMS Provider Gateway from the Federal Alert Gateway. (b...

  6. 47 CFR 10.350 - CMAS Testing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... maintenance windows. (3) A Participating CMS Provider may forego an RMT if the RMT is pre-empted by actual... Gateway Administrator using a defined test message. Real event codes or alert messages shall not be used... automated log of RMT messages received by the CMS Provider Gateway from the Federal Alert Gateway. (b...

  7. Linking vegetation patterns to potential smoke production and fire hazard

    Treesearch

    Roger D. Ottmar; Ernesto Alvarado

    2004-01-01

    During the past 80 years, various disturbances (such as wildfire and wind events) and management actions (including fire exclusion, logging, and domestic livestock grazing) have significantly modified the composition and structure of forests and ranges across the western United States. The resulting fuel loadings directly influence potential smoke production from...

  8. Signal Event Context: Trace Technologies of the habit@online

    ERIC Educational Resources Information Center

    Luke, Robert

    2003-01-01

    Web portals--those online environments that encourage users to trade personal information for the opportunity to personalise the information space--are experiencing a considerable resurgence in popularity. Web portals are web sites that allow users to log on with a username and password and create their very own data structure. This data structure…

  9. Results of well-bore flow logging for six water-production wells completed in the Santa Fe Group aquifer system, Albuquerque, New Mexico, 1996-98

    USGS Publications Warehouse

    Thorn, Conde R.

    2000-01-01

    Over the last several years, an improved conceptual understanding of the aquifer system in the Albuquerque area, New Mexico, has led to better knowledge about the location and extent of the aquifer system. This information will aid with the refinement of ground-water simulation and with the location of sites for future water-production wells. With an impeller-type flowmeter, well-bore flow was logged under pumping conditions along the screened interval of the well bore in six City of Albuquerque water-production wells: the Ponderosa 3, Love 6, Volcano Cliffs 1, Gonzales 2, Zamora 2, and Gonzales 3 wells. From each of these six wells, a well-bore flow log was collected that represents the cumulative upward well-bore flow. Evaluation of the well-bore flow log for each well allowed delineation of the more productive zones supplying water to the well along the logged interval. Yields from the more productive zones in the six wells ranged from about 70 to 880 gallons per minute. The lithology of these zones is predominantly gravel and sand with varying amounts of sandy clay.

  10. A new high-precision borehole-temperature logging system used at GISP2, Greenland, and Taylor Dome, Antarctica

    USGS Publications Warehouse

    Clow, G.D.; Saltus, R.W.; Waddington, E.D.

    1996-01-01

    We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.

  11. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT, putting new records into the data base; (2) REVISE, changing or modifying existing records in the data base; (3) SEARCH, finding special records existing in the data base; and (4) ARCHIVE, storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.

  12. Log-normal distribution from a process that is not multiplicative but is additive.

    PubMed

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
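    A quick numerical check of the claim, assuming log-normal summands as one member of the positive-variable class discussed, compares the skewness of the sum with that of its logarithm: if the sum were close to Gaussian its skewness would be near zero, whereas if it is close to log-normal the skewness of its logarithm should be the small one. The sample sizes and parameters are arbitrary choices for the demonstration.

```python
import math
import random

def skewness(xs):
    """Sample skewness (population-style normalization)."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

random.seed(1)
n_terms, n_samples = 50, 4000
# Each sample is a sum of 50 positive (log-normal) random variables.
sums = [sum(random.lognormvariate(0.0, 1.5) for _ in range(n_terms))
        for _ in range(n_samples)]
sk_sum = skewness(sums)                         # stays strongly right-skewed
sk_log = skewness([math.log(s) for s in sums])  # small if sum ~ log-normal
```

    The persistent right skew of the sum, versus the much smaller skew of its logarithm, is the additive-process behavior the paper describes.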

  13. Seasonal variation and partitioning of endocrine disrupting chemicals in waters and sediments of the Pearl River system, South China.

    PubMed

    Gong, Jian; Duan, Dandan; Yang, Yu; Ran, Yong; Chen, Diyun

    2016-12-01

    Endocrine disrupting chemicals (EDCs) were seasonally investigated in surface water, suspended particulate matter, and sediments of the Pearl River Delta (PRD), South China. EDC concentrations in the surface water were generally higher in summer than in winter. The surface water in the investigated rivers was heavily contaminated by phenolic xenoestrogens. Moreover, the in-situ log Ksoc and log Kpoc values and their regressions against log Kow in the field experiments suggest that binding mechanisms other than hydrophobic interaction are present for the sedimentary organic carbon and particulate organic carbon (SOC/POC). The log Ksoc versus log Kow and log Kpoc versus log Kow regression analyses imply that greater complexity of nonhydrophobic interactions with EDCs is present on the SOC samples compared with the POC samples, which is related to their different sources.
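    Regressions of this kind are simple linear fits of log Koc against log Kow. With hypothetical numbers (not the study's data), a least-squares sketch looks like this:

```python
def least_squares(xs, ys):
    """Ordinary least-squares slope and intercept for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical sorbate data: log Kow and measured log Koc for five EDCs.
log_kow = [2.6, 3.1, 3.9, 4.5, 5.2]
log_koc = [2.4, 2.7, 3.5, 3.9, 4.6]
slope, intercept = least_squares(log_kow, log_koc)
# Departures of the fitted slope and intercept from values expected under
# pure hydrophobic partitioning are the kind of evidence the study uses to
# infer additional (nonhydrophobic) binding mechanisms.
```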

  14. Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool

    NASA Astrophysics Data System (ADS)

    Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong

    2016-06-01

    The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool, able to investigate strata in a relatively large region of space around the borehole. The BAAR is designed on a modular basis with a very complex structure, so a dedicated test-bench system for debugging each module of the BAAR has become a pressing need. With the help of the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board, and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is developed in VC++. The embedded controlling board uses an ARM7 (Advanced RISC Machines) core as the microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed on the uClinux operating system. The bus interface board, data acquisition board, and telemetry communication board are designed around a field programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an unqualified channel of the electronic receiving cabin was discovered. The test-bench system can thus quickly determine the working condition of BAAR sub-modules, and it is of great significance in improving production efficiency and accelerating industrial production of the logging tool.

  15. New technological developments provide deep-sea sediment density flow insights: the Monterey Coordinated Canyon Experiment

    NASA Astrophysics Data System (ADS)

    O'Reilly, T. C.; Kieft, B.; Chaffey, M. R.; Wolfson-Schwehr, M.; Herlien, R.; Bird, L.; Klimov, D.; Paull, C. K.; Gwiazda, R.; Lundsten, E. M.; Anderson, K.; Caress, D. W.; Sumner, E. J.; Simmons, S.; Parsons, D. R.; Talling, P.; Rosenberger, K. J.; Xu, J.; Maier, K. L.; Gales, J. A.

    2017-12-01

The Monterey Coordinated Canyon Experiment (CCE) deployed an array of instruments along the Monterey Canyon floor to characterize the structure, velocity and frequency of sediment flows. CCE utilized novel technologies developed at MBARI to capture sediment flow data in unprecedented detail. 1. The Seafloor Instrument Node (SIN) at 1850 m depth housed three ADCPs at three different frequencies, a CTD, a current meter, an oxygen optode and a fluorometer/backscatter sensor, and logged data at 10 second intervals or faster. The SIN included an acoustic modem for communication with shore through a Wave Glider relay, and provided high-resolution measurements of three flow events during three successive deployments over 1.5 years. 2. Beachball-sized Benthic Event Detectors (BEDs) were deployed on or under the seafloor to measure the characteristics of sediment density flows. Each BED recorded data from a pressure sensor and a 3-axis accelerometer and gyro to characterize motions during transport events (e.g. tumbling vs. rotation). An acoustic modem capable of operating through more than a meter of sediment enabled communications with a ship or autonomous surface vehicle. Multiple BEDs were deployed at various depths in the canyon during CCE, detecting and measuring many transport events; one BED moved 9 km down canyon in 50 minutes during one event. 3. A Wave Glider Hot Spot (HS), equipped with acoustic and RF modems, acted as a data relay between the SIN, BEDs and shore, and acoustically located BEDs after sediment density flows. In some cases the HS relayed BED motion data to shore within a few hours of an event. The HS also provided an acoustic console to the SIN, allowing shore-based users to check SIN health and status, perform maintenance, etc. 4. Mapping operations were conducted four times at the SIN site to quantify depositional and erosional patterns, utilizing a prototype ultra-high-resolution mapping system on the ROV Doc Ricketts. The system consists of a 400-kHz Reson 7125 multibeam sonar, a 3D at Depth SL1 subsea LiDAR, two stereo color cameras, and a Kearfott SeaDevil INS. At a survey altitude of 3 m above the bed, the mapping system provides 5-cm resolution multibeam bathymetry, 1-cm resolution lidar bathymetry, and 2-mm resolution photomosaics. We will describe the design and full capabilities of these novel systems.

  16. The Auckland Cataract Study: co-morbidity, surgical techniques, and clinical outcomes in a public hospital service

    PubMed Central

    Riley, Andrew F; Malik, Tahira Y; Grupcheva, Christina N; Fisk, Michael J; Craig, Jennifer P; McGhee, Charles N

    2002-01-01

    Aim: To prospectively assess cataract surgery in a major New Zealand public hospital by defining presenting clinical parameters and surgical and clinical outcomes in a cohort of subjects just below threshold for treatment, based upon a points based prioritisation system. Methods: The prospective observational study comprised 488 eyes of 480 subjects undergoing consecutive cataract operations at Auckland Hospital. All subjects underwent extensive ophthalmic examination before and after surgery. Details of the surgical procedure, including any intraoperative difficulties or complications, were documented. Postoperative review was performed at 1 day and 4 weeks after surgery. Demographic data, clinical outcomes, and adverse events were correlated by an independent assessor. Results: The mean age at surgery was 74.9 (SD 9.6) years with a female predominance (62%). Significant systemic disease affected 80% of subjects, with 20% of the overall cohort exhibiting diabetes mellitus. 26% of eyes exhibited coexisting ocular disease and in 7.6% this affected best spectacle corrected visual acuity (BSCVA). A mean spherical equivalent of −0.49 (1.03) D and mean BSCVA of 0.9 (0.6) log MAR units (Snellen equivalent approximately 6/48) was noted preoperatively. Local anaesthesia was employed in 99.8% of subjects (94.9% sub-Tenon's). The majority of procedures (97.3%) were small incision phacoemulsification with foldable lens implant. Complications included: 4.9% posterior capsule tears, 3.8% cystoid macular oedema, and one case (0.2%) of endophthalmitis. Mean BSCVA after surgery was 0.1 (0.2) log MAR units (6/7.5 Snellen equivalent), with a mean spherical equivalent of −0.46 (0.89) D, and was 6/12 or better in 88% of all eyes. A drop in BSCVA, thought to be directly attributable to the surgical intervention, was recorded in a small percentage of eyes (1.5%) after surgery. 
Conclusion: This study provides a representative assessment of the management of cataract in the New Zealand public hospital system. A predominantly elderly, female population, frequently exhibiting significant systemic illness and coexisting ocular disease, relatively advanced cataracts, and poor BSCVA, presented for cataract surgery. The majority of subjects underwent small incision, phacoemulsification, day case surgery. While almost 90% achieved at least 6/12 BSCVA post-surgery, approximately 5% sustained an adverse intraoperative event and 1.5% of eyes exhibited a reduction in BSCVA postoperatively. PMID:11815345

  17. Replication in the Harp File System

    DTIC Science & Technology

    1981-07-01

Shrira, Michael Williams. 1991. © Massachusetts Institute of Technology. (To appear in the Proceedings of the Thirteenth ACM Symposium on Operating...S., Spector, A. Z., and Thompson, D. S. Distributed Logging for Transaction Processing. ACM Special Interest Group on Management of Data 1987 Annual...System. USENIX Conference Proceedings, June 1990, pp. 63-71. 15. Hagmann, R. Reimplementing the Cedar File System Using Logging and Group Commit

  18. Individual stem value recovery of modified and conventional tree-length systems in the southeastern United States

    Treesearch

    Amanda H. Lang; Shawn A. Baker; W. Dale Greene; Glen E. Murphy

    2010-01-01

We compared value recovery of a modified tree-length (MTL) logging system that measures product diameter and length using a Waratah 626 harvester head to that of a tree-length (TL) system that estimates dimensions. A field test compared the actual value cut to the maximum potential value suggested by the log bucking optimization program Assessment of Value by Individual...

  19. Characterizing student navigation in educational multiuser virtual environments: A case study using data from the River City project

    NASA Astrophysics Data System (ADS)

    Dukas, Georg

Though research in emerging technologies is vital to fulfilling their incredible potential for educational applications, it is often fraught with analytic challenges related to large datasets. This thesis explores these challenges in researching multiuser virtual environments (MUVEs). In a MUVE, users assume a persona and traverse a virtual space often depicted as a physical world, interacting with other users and digital artifacts. As students participate in MUVE-based curricula, detailed records of their paths through the virtual world are typically collected in event logs. Although many studies have demonstrated the instructional power of MUVEs (e.g., Barab, Hay, Barnett, & Squire, 2001; Ketelhut, Dede, Clarke, Nelson, & Bowman, 2008), none have successfully quantified these student paths for analysis in the aggregate. This thesis constructs several frameworks for conducting research involving student navigational choices in MUVEs based on a case study of data generated from the River City project. After providing a context for the research and an introduction to the River City dataset, the first part of this thesis explores the issues associated with data compression and presents a grounded theory approach (Glaser & Strauss, 1967) to the cleaning, compacting, and coding of MUVE datasets. In summary of this section, I discuss the implications of preparation choices for further analysis. Second, two conceptually different approaches to analyzing behavioral sequences are investigated. For each approach, a theoretical context, description of possible exploratory and confirmatory methods, and illustrative examples from River City are provided. The thesis then situates these specific analytic approaches within the constellation of possible research utilizing MUVE event log data.
Finally, based on the lessons of River City and the investigation of a spectrum of possible event logs, a set of design heuristics for data collection in MUVEs is constructed and a possible future for research in these environments is envisioned.

  20. Limits on the Ultra-bright Fast Radio Burst Population from the CHIME Pathfinder

    NASA Astrophysics Data System (ADS)

Amiri, M.; Bandura, K.; Berger, P.; Bond, J. R.; Cliche, J. F.; Connor, L.; Deng, M.; Denman, N.; Dobbs, M.; Domagalski, R. S.; Fandino, M.; Gilbert, A. J.; Good, D. C.; Halpern, M.; Hanna, D.; Hincks, A. D.; Hinshaw, G.; Höfer, C.; Hsyu, G.; Klages, P.; Landecker, T. L.; Masui, K.; Mena-Parra, J.; Newburgh, L. B.; Oppermann, N.; Pen, U. L.; Peterson, J. B.; Pinsonneault-Marotte, T.; Renard, A.; Shaw, J. R.; Siegel, S. R.; Sigurdson, K.; Smith, K.; Storer, E.; Tretyakov, I.; Vanderlinde, K.; Wiebe, D. V.; CHIME Scientific Collaboration

    2017-08-01

We present results from a new incoherent-beam fast radio burst (FRB) search on the Canadian Hydrogen Intensity Mapping Experiment (CHIME) Pathfinder. Its large instantaneous field of view (FoV) and relative thermal insensitivity allow us to probe the ultra-bright tail of the FRB distribution, and to test a recent claim that this distribution's slope, α ≡ −∂log N/∂log S, is quite small. A 256-input incoherent beamformer was deployed on the CHIME Pathfinder for this purpose. If the FRB distribution were described by a single power law with α = 0.7, we would expect an FRB detection every few days, making this the fastest survey on the sky at present. We collected 1268 hr of data, amounting to one of the largest exposures of any FRB survey, with over 2.4 × 10⁵ deg² hr. Having seen no bursts, we have constrained the rate of extremely bright events to <13 sky⁻¹ day⁻¹ above ∼220√(τ/ms) Jy ms for τ between 1.3 and 100 ms, at 400-800 MHz. The non-detection also allows us to rule out α ≲ 0.9 with 95% confidence, after marginalizing over uncertainties in the GBT rate at 700-900 MHz, though we show that for a cosmological population and a large dynamic range in flux density, α is brightness dependent. Since FRBs now extend to large enough distances that non-Euclidean effects are significant, there is still expected to be a dearth of faint events and relative excess of bright events. Nevertheless we have constrained the allowed number of ultra-intense FRBs. While this does not have significant implications for deeper, large-FoV surveys like full CHIME and APERTIF, it does have important consequences for other wide-field, small dish experiments.
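The slope α defined above fixes how an all-sky rate scales with flux-density threshold, R(>S) ∝ S^−α. A minimal sketch of that scaling; the reference rate and thresholds below are purely illustrative assumptions, not values from the survey:

```python
def scale_rate(rate_ref, s_ref, s_new, alpha):
    # R(>S) scales as S**-alpha, so the rate above a new threshold is
    # the reference rate times (s_new / s_ref)**-alpha.
    return rate_ref * (s_new / s_ref) ** (-alpha)

# Hypothetical reference: 10 events/sky/day above 1 Jy ms.
rate_steep = scale_rate(10.0, 1.0, 220.0, 1.5)    # Euclidean-like slope
rate_shallow = scale_rate(10.0, 1.0, 220.0, 0.7)  # shallow slope leaves far more bright events
```

A shallow slope predicts many more ultra-bright events than a Euclidean one, which is why a non-detection at a ~220 Jy ms threshold constrains α from below.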

  1. The relative importance of noise level and number of events on human reactions to noise: Community survey findings and study methods

    NASA Technical Reports Server (NTRS)

    Fields, J. M.

    1980-01-01

    The data from seven surveys of community response to environmental noise are reanalyzed to assess the relative influence of peak noise levels and the numbers of noise events on human response. The surveys do not agree on the value of the tradeoff between the effects of noise level and numbers of events. The value of the tradeoff cannot be confidently specified in any survey because the tradeoff estimate may have a large standard error of estimate and because the tradeoff estimate may be seriously biased by unknown noise measurement errors. Some evidence suggests a decrease in annoyance with very high numbers of noise events but this evidence is not strong enough to lead to the rejection of the conventionally accepted assumption that annoyance is related to a log transformation of the number of noise events.
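The conventionally accepted assumption mentioned above treats annoyance as a function of noise level plus a log-transformed event count. A minimal sketch of such a composite index; the trade-off constant k is an explicit assumption here (k = 10 corresponds to an equal-energy trade-off):

```python
import math

def noise_index(level_db, n_events, k=10.0):
    # Composite index: peak level plus k times the base-10 log of the
    # number of events. k is the level/number trade-off the surveys
    # tried to pin down; k = 10 is the equal-energy assumption.
    return level_db + k * math.log10(n_events)

# Doubling the number of events at fixed level adds k*log10(2),
# about 3 units for k = 10.
delta_for_doubling = noise_index(80.0, 20) - noise_index(80.0, 10)
```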

  2. 40 CFR Appendix B to Part 132 - Great Lakes Water Quality Initiative

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM Pt. 132, App. B Appendix B to Part 132—Great Lakes Water Quality Initiative Methodology for Deriving Bioaccumulation Factors Great Lakes States and Tribes... system. For log KOW, the log of the octanol-water partition coefficient is a base 10 logarithm. Uptake...

  3. 40 CFR Appendix B to Part 132 - Great Lakes Water Quality Initiative

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM Pt. 132, App. B Appendix B to Part 132—Great Lakes Water Quality Initiative Methodology for Deriving Bioaccumulation Factors Great Lakes States and Tribes... system. For log KOW, the log of the octanol-water partition coefficient is a base 10 logarithm. Uptake...

  4. The economics of a mechanized multiproduct harvesting system for stand conversion of northern hardwoods.

    Treesearch

    John A. Sturos; Edwin S. Miyata; Helmuth M. Steinhilb; Robert M. Barron

    1983-01-01

    Describes chip and saw log yields, production, costs, and potential profits of clearcutting, down to a 2-inch diameter, a northern hardwood poletimber stand by a conventional whole-tree harvesting system and three sawtimber stands by several combinations of whole-tree chipping and saw log recovery.

  5. Aldosterone, Renin, Cardiovascular Events, and All-Cause Mortality Among African Americans: The Jackson Heart Study.

    PubMed

    Joseph, Joshua J; Echouffo-Tcheugui, Justin B; Kalyani, Rita R; Yeh, Hsin-Chieh; Bertoni, Alain G; Effoe, Valery S; Casanova, Ramon; Sims, Mario; Wu, Wen-Chih; Wand, Gary S; Correa, Adolfo; Golden, Sherita H

    2017-09-01

This study examined the association of aldosterone and plasma renin activity (PRA) with incident cardiovascular disease (CVD), using a composite endpoint of coronary heart disease, stroke, and/or heart failure and mortality among African Americans in the Jackson Heart Study. There is a paucity of data for the association of aldosterone and PRA with incident CVD or all-cause mortality among community-dwelling African Americans. A total of 4,985 African American adults, 21 to 94 years of age, were followed for 12 years. Aldosterone, PRA, and cardiovascular risk factors were collected at baseline (from 2000 to 2004). Incident events included coronary heart disease and stroke (assessed from 2000 to 2011) and heart failure (assessed from 2005 to 2011). Cox models were used to estimate hazard ratios (HRs) for incident CVD and mortality, adjusting for age, sex, education, occupation, current smoking, physical activity, dietary intake, and body mass index. Among 4,160 participants without prevalent CVD over a median follow-up of 7 years, there were 322 incident CVD cases. In adjusted analyses, each 1-SD increase in log-aldosterone and in log-PRA was associated with HRs of 1.26 (95% confidence interval [CI]: 1.14 to 1.40) and 1.16 (95% CI: 1.02 to 1.33) for incident CVD, respectively. Over a median of 8 years, 513 deaths occurred among 4,985 participants. In adjusted analyses, each 1-SD increase in log-aldosterone and in log-PRA was associated with HRs of 1.13 (95% CI: 1.04 to 1.23) and 1.12 (95% CI: 1.01 to 1.24) for mortality, respectively. Elevated aldosterone and PRA may play a significant role in the development of CVD and all-cause mortality among African Americans. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
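Hazard ratios quoted per 1-SD increase in a log-transformed predictor come from exponentiating the Cox coefficient scaled by that SD. A small illustrative sketch; the coefficient and SD below are hypothetical numbers chosen only to land near an HR of 1.26, the kind of estimate quoted above:

```python
import math

def hazard_ratio_per_sd(beta, sd):
    # In a Cox model, the HR for a delta-unit increase in a predictor
    # is exp(beta * delta); here delta is one standard deviation.
    return math.exp(beta * sd)

# Hypothetical: beta = 0.46 per unit of the log-transformed predictor,
# SD = 0.5 log units.
hr = hazard_ratio_per_sd(0.46, 0.5)
```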

  6. Evaluation of forest management practices through application of a biogeochemical model, PnET-BGC

    NASA Astrophysics Data System (ADS)

    Valipour, M.; Driscoll, C. T.; Johnson, C. E.; Campbell, J. L.; Fahey, T.; Zeng, T.

    2017-12-01

Forest ecosystem response to logging disturbance varies significantly, depending on site conditions, species composition, land use history, and the method and frequency of harvesting. The long-term effects of forest cutting are less clear due to limited information on land use history and a shortage of long-term time series observations. The hydrochemical model PnET-BGC was modified and verified using field data from multiple experimentally harvested northern hardwood watersheds at the Hubbard Brook Experimental Forest (HBEF), New Hampshire, USA: a commercial whole-tree harvest (Watershed 5), a devegetation and herbicide experiment (Watershed 2), and a commercial strip-cut (Watershed 4). The model was used to simulate the hydrology, biomass accumulation, and soil solution and stream water chemistry responses to clear-cutting. The confirmed model was then used to investigate temporal changes in aboveground biomass accumulation and nutrient dynamics under three harvesting intensities (40%, 60%, 80%) over four rotation lengths (20, 40, 60, 80 years), with results compared with a no-harvest scenario. The total ecosystem carbon pool (biomass, soil and litter) was reduced by harvesting events. The greatest decline, 40%-70%, occurred in litter, while the pool of carbon stored in aboveground biomass decreased by 30%-60% for 80% cutting levels at 40- and 20-year rotation lengths, respectively. The large pool of soil organic carbon remained relatively stable, with only minor declines across logging regimes. Stream water simulations demonstrated increased loss of major elements after cutting events. Ca²⁺ and NO₃⁻ were the elements most sensitive to leaching under frequent intensive logging. Accumulated leaching of Ca²⁺ and NO₃⁻ varied between 90-520 t Ca/ha and 40-420 t N/ha from conservative (80-year period and 40% cutting) to aggressive (20-year period and 80% cutting) cutting regimes, respectively.
Moreover, a reduction in plant nutrient uptake under logging scenarios was estimated. Model simulations indicated that nutrient losses were more sensitive to harvesting rotation length than to intensity.

  7. Woody plant regeneration after blowdown, salvage logging, and prescribed fire in a northern Minnesota forest

    Treesearch

    Brian J. Palik; Doug Kastendick

    2009-01-01

    Salvage logging after natural disturbance has received increased scrutiny in recent years because of concerns over detrimental effects on tree regeneration and increased fine fuel levels. Most research on tree regeneration after salvage logging comes from fire-prone systems and is short-term in scope. Limited information is available on longer term responses to salvage...

  8. Using parallel computing methods to improve log surface defect detection methods

    Treesearch

    R. Edward Thomas; Liya Thomas

    2013-01-01

    Determining the size and location of surface defects is crucial to evaluating the potential yield and value of hardwood logs. Recently a surface defect detection algorithm was developed using the Java language. This algorithm was developed around an earlier laser scanning system that had poor resolution along the length of the log (15 scan lines per foot). A newer...

  9. CT Image Sequence Processing For Wood Defect Recognition

    Treesearch

    Dongping Zhu; R.W. Conners; Philip A. Araman

    1991-01-01

    The research reported in this paper explores a non-destructive testing application of x-ray computed tomography (CT) in the forest products industry. This application involves a computer vision system that uses CT to locate and identify internal defects in hardwood logs. The knowledge of log defects is critical in deciding whether to veneer or to saw up a log, and how...

  10. Comparative trends in log populations in northern Arizona mixed-conifer and ponderosa pine forests following severe drought

    Treesearch

    Joseph L. Ganey; Scott C. Vojta

    2017-01-01

    Logs provide an important form of coarse woody debris in forest systems, contributing to numerous ecological processes and affecting wildlife habitat and fuel complexes. Despite this, little information is available on the dynamics of log populations in southwestern ponderosa pine (Pinus ponderosa) and especially mixed-conifer forests. A recent episode of elevated tree...

  11. Product Recovery From Hemlock "Pulpwood" From Alaska.

    Treesearch

    Thomas D. Fahey

    1983-01-01

    A total of 363 western hemlock (Tsuga heterophylla (Raf.) Sarg.) logs from Alaska were sawn to compare recovery at a stud mill and at a dimension mill. Recovery at both mills varied by log diameters and by log scaling system. Lumber grade recovery was primarily in Stud grade at the stud mill and in Standard and Construction grade at the dimension...

  12. Evaluation of the use of partition coefficients and molecular surface properties as predictors of drug absorption: a provisional biopharmaceutical classification of the list of national essential medicines of Pakistan

    PubMed Central

    Shawahna, R.; Rahman, NU.

    2011-01-01

Background and the purpose of the study Partition coefficients (log D and log P) and polar surface area (PSA) are potential predictors of the intestinal permeability of drugs. The aim of this investigation was to evaluate and compare these intestinal permeability indicators. Methods Aqueous solubility data were obtained from the literature or calculated using ACD/Labs and ALOGPS. Permeability data were predicted based on log P, log D at pH 6.0 (log D6.0), and PSA. Results Metoprolol's log P, log D6.0, and a PSA of <65 Å² correctly predicted 55.9%, 50.8% and 54.2% of permeability classes, respectively. Labetalol's log P, log D6.0 and PSA correctly predicted 54.2%, 64.4% and 61% of permeability classes, respectively. Log D6.0 correlated well (81%) with Caco-2 permeability (Papp). Of the list of national essential medicines, 135 orally administered drugs were classified into the biopharmaceutical classification system (BCS). Of these, 57 (42.2%), 28 (20.7%), 44 (32.6%), and 6 (4.4%) were class I, II, III and IV, respectively. Conclusion Log D6.0 showed better prediction capability than log P. Metoprolol as permeability internal standard was more conservative than labetalol. PMID:22615645
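Provisional BCS assignment combines a solubility class with a permeability class. A minimal sketch using a log D6.0 surrogate for permeability; the cutoff value is a placeholder assumption, since the study benchmarks permeability against internal standards such as metoprolol rather than a fixed threshold:

```python
def bcs_class(high_solubility, log_d6, log_d_cutoff=1.0):
    # Provisional BCS class from a solubility flag and a log D6.0
    # permeability surrogate. The cutoff is a hypothetical placeholder.
    high_permeability = log_d6 >= log_d_cutoff
    if high_solubility and high_permeability:
        return "I"    # high solubility, high permeability
    if high_permeability:
        return "II"   # low solubility, high permeability
    if high_solubility:
        return "III"  # high solubility, low permeability
    return "IV"       # low solubility, low permeability

example = bcs_class(high_solubility=True, log_d6=1.8)
```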

  13. Polar exponential sensor arrays unify iconic and Hough space representation

    NASA Technical Reports Server (NTRS)

    Weiman, Carl F. R.

    1990-01-01

    The log-polar coordinate system, inherent in both polar exponential sensor arrays and log-polar remapped video imagery, is identical to the coordinate system of its corresponding Hough transform parameter space. The resulting unification of iconic and Hough domains simplifies computation for line recognition and eliminates the slope quantization problems inherent in the classical Cartesian Hough transform. The geometric organization of the algorithm is more amenable to massively parallel architectures than that of the Cartesian version. The neural architecture of the human visual cortex meets the geometric requirements to execute 'in-place' log-Hough algorithms of the kind described here.
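The key property of the log-polar mapping described above is that scaling about the origin becomes a translation in log r and rotation a translation in θ, which is what lets the sensor geometry line up with the Hough parameter space. A minimal sketch:

```python
import math

def to_log_polar(x, y):
    # Map a Cartesian point to (log r, theta). Scaling about the origin
    # shifts log r; rotation about the origin shifts theta.
    r = math.hypot(x, y)
    return math.log(r), math.atan2(y, x)

u1, t1 = to_log_polar(2.0, 0.0)
u2, t2 = to_log_polar(4.0, 0.0)  # same direction, scaled by 2
shift = u2 - u1                  # equals log 2, independent of direction
```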

  14. Application of Dst Interpretation Results by Log - Log Method in the Pore Space Type Estimation for the Upper Jurassic Carbonate Reservoir Rocks of the Carpathian Foredeep Basement / Interpretacja Testów Wykonywanych Rurowymi Próbnikami Złoża - Rpz w Skałach Węglanowych Górnej Jury Podłoża Zapadliska Przedkarpackiego

    NASA Astrophysics Data System (ADS)

    Dubiel, Stanisław; Zubrzycki, Adam; Rybicki, Czesław; Maruta, Michał

    2012-11-01

In the southern part of the Carpathian Foredeep basement, between Bochnia and Ropczyce, the Upper Jurassic (Oxfordian, Kimmeridgian and Tithonian) carbonate complex plays an important role as a hydrocarbon-bearing formation. It consists of shallow marine carbonates deposited in outer-carbonate-ramp environments as reef limestones (dolomites), microbial-sponge or coral biostromes, and marly or micritic limestones. The inner pore space system of these rocks was affected by different diagenetic processes, such as calcite cementation, dissolution, dolomitization and, most probably, tectonic fracturing. These phenomena have modified pore space systems within the limestone/dolomite series, forming more or less developed reservoir zones (horizons). According to the interpretation of DST results (analysis of pressure build-up curves by the log-log method) for 11 intervals (marked out previously by well logging due to increased porosity readings) within the Upper Jurassic formation, 3 types of pore/fracture space systems were distinguished: type I, a fracture-vuggy porosity system in which fractures connecting voids and vugs within organogenic carbonates are of great importance for fluid flow; type II, a vuggy-fracture porosity system whose pore space consists of weakly connected voids and intergranular/intercrystalline pores with minor influence of fractures; and type III, a cavern porosity system in which secondary porosity developed due to dolomitization and cement/grain dissolution processes.
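Log-log analysis of pressure build-up data rests on the logarithmic pressure derivative d(Δp)/d ln t, whose shape on a log-log plot distinguishes flow regimes (a flat derivative for radial flow, characteristic dips for dual-porosity fracture-vug systems). A simplified sketch of the derivative computation; production DST software uses smoothed Bourdet derivatives, so this two-point version is illustrative only:

```python
import math

def log_derivative(times, delta_p):
    # Two-point estimate of d(dp)/d(ln t), the derivative examined on
    # log-log diagnostic plots. Real analysis smooths this (Bourdet
    # derivative); this version is an illustrative sketch.
    return [
        (delta_p[i] - delta_p[i - 1]) / (math.log(times[i]) - math.log(times[i - 1]))
        for i in range(1, len(times))
    ]

# Infinite-acting radial flow: dp = m * ln(t) gives a flat derivative at m.
t = [1.0, 2.0, 4.0, 8.0]
dp = [5.0 * math.log(x) for x in t]
deriv = log_derivative(t, dp)
```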

  15. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution, so that possible deviations in solutions caused by unrealistic distributional assumptions are avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
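Under a log-normal random variable, a chance constraint of the form Pr(demand ≤ capacity) ≥ p has a deterministic equivalent: set capacity at the log-normal p-quantile exp(μ + σΦ⁻¹(p)). A minimal sketch; the parameters μ, σ and p below are illustrative assumptions, not values from the Erhai Lake case study:

```python
import math
from statistics import NormalDist

def lognormal_quantile(mu, sigma, p):
    # p-quantile of a log-normal variable whose log has mean mu and
    # standard deviation sigma: exp(mu + sigma * z_p). This is the
    # deterministic equivalent used in chance-constrained models.
    return math.exp(mu + sigma * NormalDist().inv_cdf(p))

# Illustrative sizing: meet Pr(runoff <= capacity) >= 0.95 for a
# log-normal runoff with assumed parameters mu = 2.0, sigma = 0.5.
capacity = lognormal_quantile(2.0, 0.5, 0.95)
```

Replacing a normal with a log-normal quantile is exactly where the two model variants diverge: for skewed variables like runoff, the normal assumption understates the upper tail.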

  16. Constructing compact and effective graphs for recommender systems via node and edge aggregations

    DOE PAGES

    Lee, Sangkeun; Kahng, Minsuk; Lee, Sang-goo

    2014-12-10

Exploiting graphs for recommender systems has great potential to flexibly incorporate heterogeneous information for producing better recommendation results. As our baseline approach, we first introduce a naive graph-based recommendation method, which operates with a heterogeneous log-metadata graph constructed from user log and content metadata databases. Although the naive graph-based recommendation method is simple, it allows us to take advantage of heterogeneous information and shows promising flexibility and recommendation accuracy. However, it often leads to extensive processing time due to the sheer size of the graphs constructed from entire user log and content metadata databases. In this paper, we propose node and edge aggregation approaches to constructing compact and effective graphs, called Factor-Item bipartite graphs, by aggregating nodes and edges of a log-metadata graph. Furthermore, experimental results using real world datasets indicate that our approach can significantly reduce the size of graphs exploited for recommender systems without sacrificing the recommendation quality.
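The node and edge aggregation idea can be illustrated by collapsing raw (user, item) log edges through (item, factor) metadata into weighted factor-item edges. The construction below is a simplified sketch that borrows the paper's "Factor-Item" terminology; the count-based weighting and the toy data are assumptions:

```python
from collections import defaultdict

def factor_item_graph(log_edges, item_factors):
    # Collapse raw (user, item) log edges through (item, factor)
    # metadata into weighted factor-item bipartite edges, dropping
    # individual user nodes from the aggregated graph.
    weights = defaultdict(int)
    for _user, item in log_edges:
        for factor in item_factors.get(item, ()):
            weights[(factor, item)] += 1
    return dict(weights)

# Hypothetical toy data: play logs plus genre metadata.
plays = [("u1", "song_a"), ("u2", "song_a"), ("u1", "song_b")]
meta = {"song_a": ["rock"], "song_b": ["rock", "jazz"]}
graph = factor_item_graph(plays, meta)
```

The aggregated graph has one node per factor and item rather than one per log entry, which is the source of the size reduction the abstract reports.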

  17. Comparison of planted soil infiltration systems for treatment of log yard runoff.

    PubMed

    Hedmark, Asa; Scholz, Miklas; Aronsson, Par; Elowson, Torbjorn

    2010-07-01

    Treatment of log yard runoff is required to avoid contamination of receiving watercourses. The research aim was to assess if infiltration of log yard runoff through planted soil systems is successful and if different plant species affect the treatment performance at a field-scale experimental site in Sweden (2005 to 2007). Contaminated runoff from the log yard of a sawmill was infiltrated through soil planted with Alnus glutinosa (L.) Gärtner (common alder), Salix schwerinii X viminalis (willow variety "Gudrun"), Lolium perenne (L.) (rye grass), and Phalaris arundinacea (L.) (reed canary grass). The study concluded that there were no treatment differences when comparing the four different plants with each other, and there also were no differences between the tree and the grass species. Furthermore, the infiltration treatment was effective in reducing total organic carbon (55%) and total phosphorus (45%) concentrations in the runoff, even when the loads on the infiltration system increased from year to year.

  18. Logarithmic amplifiers.

    PubMed

    Gandler, W; Shapiro, H

    1990-01-01

    Logarithmic amplifiers (log amps), which produce an output signal proportional to the logarithm of the input signal, are widely used in cytometry for measurements of parameters that vary over a wide dynamic range, e.g., cell surface immunofluorescence. Existing log amp circuits all deviate to some extent from ideal performance with respect to dynamic range and fidelity to the logarithmic curve; accuracy in quantitative analysis using log amps therefore requires that log amps be individually calibrated. However, accuracy and precision may be limited by photon statistics and system noise when very low level input signals are encountered.
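The per-device calibration the passage calls for can be illustrated with the ideal log-amp model v_out = k·log10(v_in/v_ref): two known input levels suffice to fit k and v_ref. A minimal sketch with assumed numbers (idealized; real log amps deviate from this curve, which is exactly why individual calibration is needed):

```python
import math

def calibrate_log_amp(v_in1, v_out1, v_in2, v_out2):
    # Fit the ideal model v_out = k * log10(v_in / v_ref) from two
    # calibration points: k from the output span per decade, then
    # v_ref by back-substitution.
    k = (v_out2 - v_out1) / (math.log10(v_in2) - math.log10(v_in1))
    v_ref = v_in1 / 10 ** (v_out1 / k)
    return k, v_ref

# Assumed decade response: 1 V of output per decade of input,
# reference level at 1 mV.
k, v_ref = calibrate_log_amp(0.01, 1.0, 1.0, 3.0)
```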

  19. Nonblocking and orphan free message logging protocols

    NASA Technical Reports Server (NTRS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-01-01

Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

  20. Nonblocking and orphan free message logging protocols

    NASA Astrophysics Data System (ADS)

    Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith

    1992-12-01

Currently existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery; however, it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.

  1. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of a MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  2. Descriptive study of relationship between cardio-ankle vascular index and biomarkers in vascular-related diseases.

    PubMed

    Liu, Jinbo; Liu, Huan; Zhao, Hongwei; Shang, Guangyun; Zhou, Yingyan; Li, Lihong; Wang, Hongyu

    2017-01-01

    Cardio-ankle vascular index (CAVI) has been proposed as an independent predictor of vascular-related events. Biomarkers such as homocysteine (Hcy), N-terminal pro-brain natriuretic peptide (NT-proBNP), and urine albumin (microalbumin) (UAE) are involved in the pathophysiological development of arteriosclerosis. The present study investigated the relationship between CAVI and biomarkers in vascular-related diseases. A total of 656 subjects (M/F 272/384) from the department of Vascular Medicine were enrolled in our study. They were divided into four groups according to the number of coexisting diseases: healthy group (group 0: subjects without hypertension, diabetes mellitus (DM), or coronary heart disease (CHD); n = 186), group 1 (with one of hypertension, CHD, or DM; n = 237), group 2 (with two of hypertension, CHD, DM; n = 174), and group 3 (with all of hypertension, CHD, and DM; n = 59). CAVI was measured with a VS-1000 apparatus. CAVI increased with the number of vascular-related diseases. Similar results were found for biomarkers such as Hcy, log NT-proBNP, and log UAE. There were positive correlations between log NT-proBNP, Hcy, log UAE, and CAVI in the entire study group and the nonhealthy group. A positive correlation between log UAE and CAVI was found in the entire study group after adjusting for age, body mass index (BMI), blood pressure, uric acid, and lipids. Multivariate analysis showed that log UAE was an independent associating factor of CAVI in all subjects. CAVI was significantly higher in subjects with hypertension, CHD, and DM. There was a correlation between arterial stiffness and biomarkers such as NT-proBNP, Hcy, and UAE.

  3. Contribution of hydrological data to the understanding of the spatio-temporal dynamics of F-specific RNA bacteriophages in river water during rainfall-runoff events.

    PubMed

    Fauvel, Blandine; Cauchie, Henry-Michel; Gantzer, Christophe; Ogorzaly, Leslie

    2016-05-01

    Heavy rainfall events were previously reported to bring large amounts of microorganisms into surface water, including viruses. However, little information is available on the origin and transport of viral particles in water during such rain events. In this study, an integrative approach combining microbiological and hydrological measurements was used to assess the dynamics and origins of F-specific RNA bacteriophage fluxes during two distinct rainfall-runoff events. High-frequency sampling (automatic sampler) was set up to monitor the F-specific RNA bacteriophage fluxes at a fine temporal scale during the whole course of the rainfall-runoff events. A total of 276 rainfall-runoff samples were collected and analysed using both infectivity and RT-qPCR assays. The results highlight increases of 2.5 log10 and 1.8 log10 in infectious F-specific RNA bacteriophage fluxes in parallel with an increase in water flow levels for the two events. Faecal pollution was characterised as being mainly of anthropogenic origin, with a significant flux of phage particles belonging to genogroup II. At the temporal scale, two successive distinct waves of phage pollution were established and identified through the hydrological measurements. The first arrival of phages in the water column was likely linked to the resuspension of riverbed sediments, which was responsible for a high input of genogroup II. Surface runoff contributed further to the second input of phages, and more particularly of genogroup I. In addition, an important contribution of infectious phage particles was highlighted. These findings imply a close relationship between the risk to human health and the viral contamination of flood water. Copyright © 2016 Luxembourg Institute of Science and Technology. Published by Elsevier Ltd. All rights reserved.

  4. The effects of a low international normalized ratio on thromboembolic and bleeding complications in patients with mechanical mitral valve replacement

    PubMed Central

    2014-01-01

    Background Mechanical heart valve replacement has an inherent risk of thromboembolic events (TEs). Current guidelines recommend an international normalized ratio (INR) of at least 2.5 after mechanical mitral valve replacement (MVR). This study aimed to evaluate the effects of a low INR (2.0–2.5) on thromboembolic and bleeding complications in patients with mechanical MVR on warfarin therapy. Methods One hundred and thirty-five patients who underwent mechanical MVR were enrolled in this study. The end points of this study were defined as TEs (valve thrombosis, transient ischemic attack, stroke) and bleeding (all minor and major bleeding) complications. Patients were followed up for a mean of 39.6 months and the mean INR of the patients was calculated. After data collection, patients were divided into 3 groups according to their mean INR, as follows: group 1 (n = 34), INR <2.0; group 2 (n = 49), INR 2.0–2.5; and group 3 (n = 52), INR >2.5. Results A total of 22 events (10 [7.4%] thromboembolic and 12 [8.8%] bleeding events) occurred in the follow-up period. The mean INR was an independent risk factor for the development of TEs. Mean INR and neurological dysfunction were independent risk factors for the development of bleeding events. A statistically significant positive correlation was found between the log mean INR and all bleeding events, and a negative correlation was found between the log mean INR and all TEs. The total number of events was significantly lower in group 2 than in groups 1 and 3 (P = 0.036). Conclusions This study showed that a target INR of 2.0–2.5 is acceptable for preventing TEs and safe in terms of bleeding complications in patients with mechanical MVR. PMID:24885719

  5. Systems Biology Approach Reveals a Calcium-Dependent Mechanism for Basal Toxicity in Daphnia magna.

    PubMed

    Antczak, Philipp; White, Thomas A; Giri, Anirudha; Michelangeli, Francesco; Viant, Mark R; Cronin, Mark T D; Vulpe, Chris; Falciani, Francesco

    2015-09-15

    The expanding diversity and ever-increasing amounts of man-made chemicals discharged into the environment pose largely unknown hazards to ecosystem and human health. The concept of adverse outcome pathways (AOPs) emerged as a comprehensive framework for risk assessment. However, the limited mechanistic information available for most chemicals and a lack of biological pathway annotation in many species represent significant challenges to effective implementation of this approach. Here, a systems-level, multistep modeling strategy demonstrates how to integrate information on chemical structure with mechanistic insight from genomic studies and phenotypic effects to define a putative adverse outcome pathway. Results indicated that transcriptional changes indicative of intracellular calcium mobilization were significantly overrepresented in Daphnia magna (DM) exposed to sublethal doses of presumed narcotic chemicals with log Kow ≥ 1.8. Treatment of DM with a calcium ATPase pump inhibitor substantially recapitulated the common transcriptional changes. We hypothesize that calcium mobilization is a potential key molecular initiating event in DM basal (narcosis) toxicity. Heartbeat rate analysis and metabolome analysis indicated sublethal effects consistent with perturbations of calcium preceding overt acute toxicity. Together, the results indicate that altered calcium homeostasis may be a key early event in basal toxicity or narcosis induced by lipophilic compounds.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Ho, M; Chen, C

    Purpose: The use of log files to perform patient-specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, before direct measurements are performed. This approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian, and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (RayStation). While the dual Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%/3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA in an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter, and will be investigated further.
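    The 5%/3 mm gamma criterion mentioned above can be illustrated with a brute-force 1D gamma index. This is a simplified sketch with global dose normalization and an exhaustive search, not the clinical implementation used in the study; the function name and sample profiles are invented for the example.

```python
# Minimal 1D gamma-index sketch: for each evaluated point, find the reference
# point minimizing the combined dose-difference / distance-to-agreement
# metric. A point passes if its gamma value is <= 1.
import math

def gamma_1d(ref, evl, spacing_mm, dose_crit=0.05, dist_crit_mm=3.0):
    """ref/evl: dose samples on the same grid; returns per-point gamma values."""
    dmax = max(ref)  # global normalization to the reference maximum
    gammas = []
    for i, de in enumerate(evl):
        best = math.inf
        for j, dr in enumerate(ref):
            dd = (de - dr) / (dose_crit * dmax)       # dose-difference term
            dx = (i - j) * spacing_mm / dist_crit_mm  # distance term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

ref = [0.0, 0.5, 1.0, 0.5, 0.0]
evl = [0.0, 0.52, 0.98, 0.5, 0.0]
g = gamma_1d(ref, evl, spacing_mm=1.0)
pass_rate = sum(x <= 1.0 for x in g) / len(g)  # fraction of points with gamma <= 1
```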

  7. A kinetic energy model of two-vehicle crash injury severity.

    PubMed

    Sobhani, Amir; Young, William; Logan, David; Bahrololoom, Sareh

    2011-05-01

    An important part of any model of vehicle crashes is the development of a procedure to estimate crash injury severity. After reviewing existing models of crash severity, this paper outlines the development of a modelling approach aimed at measuring the injury severity of people in two-vehicle road crashes. This model can be incorporated into a discrete event traffic simulation model, using simulation model outputs as its input. The model can then serve as an integral part of a simulation model estimating the crash potential of components of the traffic system. The model is developed using Newtonian Mechanics and Generalised Linear Regression. The factors contributing to the speed change (ΔV(s)) of a subject vehicle are identified using the law of conservation of momentum. A Log-Gamma regression model is fitted to measure speed change (ΔV(s)) of the subject vehicle based on the identified crash characteristics. The kinetic energy applied to the subject vehicle is calculated by the model, which in turn uses a Log-Gamma Regression Model to estimate the Injury Severity Score of the crash from the calculated kinetic energy, crash impact type, presence of airbag and/or seat belt and occupant age. Copyright © 2010 Elsevier Ltd. All rights reserved.
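    The momentum-conservation step the model builds on can be illustrated directly. This sketch assumes a perfectly inelastic 1D head-on collision (an assumption of the sketch, not necessarily the paper's formulation) to compute the subject vehicle's speed change ΔV(s) and the associated kinetic-energy measure; all names and numbers are illustrative.

```python
# Illustrative sketch of the Newtonian-mechanics step: conservation of
# momentum gives the subject vehicle's speed change in a perfectly plastic
# 1D collision, from which a kinetic-energy measure can be computed.

def delta_v_subject(m_subject, v_subject, m_other, v_other):
    """Speed change (m/s) of the subject vehicle if both vehicles reach a
    common post-impact velocity (perfectly inelastic head-on collision)."""
    v_common = (m_subject * v_subject + m_other * v_other) / (m_subject + m_other)
    return abs(v_common - v_subject)

def kinetic_energy(mass_kg, delta_v_ms):
    """Kinetic energy (J) associated with the speed change."""
    return 0.5 * mass_kg * delta_v_ms ** 2

# Two vehicles closing at ~50 km/h (13.9 m/s) each:
dv = delta_v_subject(1500, 13.9, 2000, -13.9)
ke = kinetic_energy(1500, dv)
```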

  8. Giving perspective to cliff exposures with ground penetrating radar: Devonian lacustrine shore zone architecture

    NASA Astrophysics Data System (ADS)

    Andrews, Steven; Moreau, Julien; Archer, Stuart

    2015-04-01

    The orbitally-controlled cyclic lacustrine successions of the Middle Devonian in Northern Scotland contain repeated developments of shore zone sandstones. However, due to the cliff-forming nature of the succession and the attitude of the sections through these sandstones, interpretation of this facies has been problematic. To better understand the shore zone systems, we carried out very high resolution sedimentary logging and constructed photo-panels which were combined with high resolution GPR profiling (250 MHz). To ensure close ties between the sedimentary logs and the GPR data, the cliffs were accessed using rope access techniques while GPR grids were shot directly above. The profiles were shot mainly in the strike direction of what was thought to be the shore elongation every 5-10 m and every 20-30 m in the dip direction. Shore zone systems of 3 different sequences have been imaged for a total of 1155 m of GPR profile collected. This configuration has allowed 3D visualisation of the architecture of the shore zone systems and, in combination with detailed sedimentology, provided insights into the generation of the dynamic shore zone environments. The coastal cliffs of northern Scotland expose sedimentary cycles on average 16 m thick which record deep lake, perennial lake and playa environments. The shore zone deposits reach 2 to 3.5 m in thickness. Loading and discrete channel forms are recognised in both the GPR data and sedimentary logs through the lower portion of the lake shore zone successions. Up-section the sandstone beds appear to become amalgamated, forming subtle low angle accretionary bar complexes which, although visible in outcrop only after careful investigation, can be fully visualised and examined in the GPR data. The 3D visualisation allowed the architecture and distribution of the bars to be mapped. The orientation of these features, recognised from the survey, is consistent with extensive palaeocurrent measurements from oscillation ripples.
Further loaded sandstone beds and sand-filled shallow channel features overlie the bar forms. The channels are well imaged in the radargrams, where their wider context can be gained. Through the combination of high resolution GPR data and detailed sedimentological analysis, it has been possible to determine the processes through which the previously enigmatic lake shore zone sandstones formed. The shore zone sandstones overlie playa facies which contain abundant desiccation horizons, reflecting the most arid phase in the climatically-controlled lacustrine cycle. As climatic conditions ameliorated, the rejuvenation of fluvial systems resulted in the transport of sand out into the basin. Initial deposition was limited to intermittent events where sediment was laid down on a water saturated substrate. Some of these may have occurred subaqueously as small scale turbidity flows. High resolution fluctuations in lake level resulted in periodic short-lived reworking events along the lake margin which produced amalgamated sands, forming low relief bars. Shore zone reworking is likely to have occurred over a wide area as the lake margin migrated back and forth, and gradually transgressed. Continued transgression forced fluvial systems back towards the basin margin.

  9. ETV REPORT - PHYSICAL REMOVAL OF CRYPTOSPORIDIUM OOCYSTS AND GIARDIA CYSTS IN DRINKING WATER AQUASOURCE NORTH AMERICA ULTRAFILTRATION SYSTEM A35 AT PITTSBURGH, PA. - NSF00/07/EPADW395

    EPA Science Inventory

    Verification testing of the Aquasource Ultrafiltration Treatment System Model A35 was conducted from 12/1 - 12/31/98. The treatment system underwent microbial challenge testing on 1/22/99 and demonstrated a 5.5 log10 removal of Giardia cysts and a 6.5 log10 removal of Cryptospori...
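    The reported 5.5 log10 and 6.5 log10 removals follow the standard log reduction value (LRV) calculation, sketched below; the influent/effluent numbers are illustrative, not the verification test's data.

```python
# Sketch of how a log10 removal value (LRV), such as the reported 5.5-log
# Giardia removal, is computed from influent and effluent concentrations.
import math

def log_removal(influent, effluent):
    """log10 reduction between influent and effluent concentrations."""
    return math.log10(influent / effluent)

# A ~5.5-log removal corresponds to roughly a 316,000-fold reduction:
lrv = log_removal(3.16e5, 1.0)
```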

  10. Satellite freeze forecast system. Operating/troubleshooting manual

    NASA Technical Reports Server (NTRS)

    Martsolf, J. D. (Principal Investigator)

    1983-01-01

    Examples of operational procedures are given to assist users of the satellite freeze forecast system (SFFS) in logging on to the computer, executing the programs in the menu, logging off the computer, and setting up the automatic system. Directions are also given for displaying, acquiring, and listing satellite maps; for communicating via terminal and monitor displays; and for what to do when the SFFS doesn't work. Administrative procedures are included.

  11. Mass Storage Performance Information System

    NASA Technical Reports Server (NTRS)

    Scheuermann, Peter

    2000-01-01

    The purpose of this task is to develop a data warehouse to enable system administrators and their managers to gather information by querying the data logs of the MDSDS. Currently, detailed logs capture the activity of the MDSDS internal to the different systems. The elements to be included in the data warehouse are requirements analysis, data cleansing, database design, database population, hardware/software acquisition, data transformation, query and report generation, and data mining.

  12. Pharmacokinetic profiles of repaglinide in elderly subjects with type 2 diabetes.

    PubMed

    Hatorp, V; Huang, W C; Strange, P

    1999-04-01

    Pharmacokinetic profiles of single- and multiple-dose regimens of repaglinide were evaluated in 12 elderly subjects with type 2 diabetes. On day 1, following a 10-hour fast, subjects received a single 2-mg dose of repaglinide. Starting on day 2 and continuing for 7 days, each subject received a 2-mg dose of repaglinide 15 minutes before each of the three main meals. On day 9, subjects received a single 2-mg dose of repaglinide. Pharmacokinetic profiles, including area under the curve (AUC), log(AUC), maximal concentration (Cmax), log(Cmax), time to maximal concentration (Tmax), and half-life (T(1/2)), were determined at completion of the single- and multiple-dose regimens (days 1 and 9, respectively). Trough repaglinide values were collected on days 2 through 7. The mean log(AUC) values after multiple dosing were significantly higher than the values obtained after a single dose. The mean values for log(Cmax) and Tmax were comparable after each dosing regimen. The T(1/2) of repaglinide after multiple dosing was 1.7 hours. The trough values for repaglinide were low. No hypoglycemic events were reported. The pharmacokinetic profiles of repaglinide after single- and multiple-dose regimens were similar, and repaglinide was well tolerated by elderly subjects with type 2 diabetes.

  13. Impacts of extreme flooding on riverbank filtration water quality.

    PubMed

    Ascott, M J; Lapworth, D J; Gooddy, D C; Sage, R C; Karapanos, I

    2016-06-01

    Riverbank filtration schemes form a significant component of public water treatment processes on a global level. Understanding the resilience and water quality recovery of these systems following severe flooding is critical for effective water resources management under potential future climate change. This paper assesses the impact of floodplain inundation on the water quality of a shallow aquifer riverbank filtration system and how water quality recovers following an extreme (1 in 17 year, duration >70 days, 7 day inundation) flood event. During the inundation event, riverbank filtrate water quality is dominated by rapid direct recharge and floodwater infiltration (high fraction of surface water, dissolved organic carbon (DOC) >140% baseline values, >1 log increase in micro-organic contaminants, microbial detects and turbidity, low specific electrical conductivity (SEC) <90% baseline, high dissolved oxygen (DO) >400% baseline). A rapid recovery is observed in water quality with most floodwater impacts only observed for 2-3 weeks after the flooding event and a return to normal groundwater conditions within 6 weeks (lower fraction of surface water, higher SEC, lower DOC, organic and microbial detects, DO). Recovery rates are constrained by the hydrogeological site setting, the abstraction regime and the water quality trends at site boundary conditions. In this case, increased abstraction rates and a high transmissivity aquifer facilitate rapid water quality recoveries, with longer term trends controlled by background river and groundwater qualities. Temporary reductions in abstraction rates appear to slow water quality recoveries. Flexible operating regimes such as the one implemented at this study site are likely to be required if shallow aquifer riverbank filtration systems are to be resilient to future inundation events. 
Development of a conceptual understanding of hydrochemical boundaries and site hydrogeology through monitoring is required to assess the suitability of a prospective riverbank filtration site. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  14. Software Description for the O’Hare Runway Configuration Management System. Volume II. Low-Level Pseudocode,

    DTIC Science & Technology

    1982-10-01


  15. An interactive machine-learning approach for defect detection in computed tomography (CT) images of hardwood logs

    Treesearch

    Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt; Philip A. Araman

    2005-01-01

    This paper describes recent progress in the analysis of computed tomography (CT) images of hardwood logs. The long-term goal of the work is to develop a system that is capable of autonomous (or semiautonomous) detection of internal defects, so that log breakdown decisions can be optimized based on defect locations. The problem is difficult because wood exhibits large...

  16. The interactive impact of forest site and stand attributes and logging technology on stand management

    Treesearch

    C.B. LeDoux; J.E. Baumgras

    1991-01-01

    The impact of selected site and stand attributes on stand management is demonstrated using actual forest model plot data and a complete systems simulation model called MANAGE. The influence of terrain on the type of logging technology required to log a stand and the resulting impact on stand management is also illustrated. The results can be used by managers and...

  17. Sediment pathways in a tropical forest: effects of logging roads and skid trails

    NASA Astrophysics Data System (ADS)

    Sidle, Roy C.; Sasaki, Shozo; Otsuki, Mieko; Noguchi, Shoji; Rahim Nik, Abdul

    2004-03-01

    Significant erosion occurred from recently constructed forest logging roads and skid trails in a small headwater catchment in Peninsular Malaysia. Soil loss was estimated by measuring dimensions of all significant rills and gullies along the road, as well as by measuring height of preserved soil pedestals in sidecast and fill material and on skid trails. Estimates of surface erosion from logging roads and skid trails were 272 +/- 20 t ha-1 year-1 and 275 +/- 20 t ha-1 year-1 respectively. However, owing to lack of connectivity of skid trails to the stream, much of the sediment mobilized on skid trails was stored either on adjacent hillslopes or the trails themselves, rather than being transported to the stream system, as was the case for the road. Steeper skid trails (>20% gradient) had slightly higher erosion rates (320 +/- 24 t ha-1 year-1) than trails with gentler gradients (245-264 t ha-1 year-1). Some 60% of the soil loss on logging roads comes from erosion of the running surface. Disturbed cut and fill material along the road supplied the remaining 40% of the soil loss from roads. Roads and skid trails had no designed drainage systems; runoff discharged onto the hillslope at 25 major discharge nodes from the logging road (690 m total length) and at 34 nodes from skid trails (2300 m). Sediment pathways were either fully or moderately connected to headwater channels at 64% of the logging road nodes, but at only 26% of the nodes emanating from skid trails. A detailed sediment budget revealed that 78% of the soil loss from the road system (including log landings) was delivered to the stream in the first 16 months after logging began. Most (90%) of the deposition from skid trails occurred below just three discharge nodes. Runoff from and onto skid trails often exacerbated the sediment connectivity to channels. Clearly, sediment discharge from logging roads was more highly connected to the stream than discharge from skid trails. 
Once in the channel, much of this sediment was temporarily stored in the floodplain and behind woody debris.

  18. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  19. Role of Passive Capturing in a Ubiquitous Learning Environment

    ERIC Educational Resources Information Center

    Ogata, Hiroaki; Hou, Bin; Li, MengMeng; Uosaki, Noriko; Mouri, Kousuke

    2013-01-01

    Ubiquitous Learning Log (ULL) is defined as a digital record of what you have learned in the daily life using ubiquitous technologies. This paper focuses on how to capture learning experiences in our daily life for vocabulary learning. In our previous works, we developed a system named SCROLL (System for Capturing and Reminding Of Learning Log) in…

  20. Financial and ecological indicators of reduced impact logging performance in the eastern Amazon

    Treesearch

    Thomas P. Holmes; Geoffrey M. Blate; Johan C. Zweede; Rodrigo Pereira; Paulo Barreto; Frederick Boltz; Roberto Bauch

    2002-01-01

    Reduced impact logging (RIL) systems are currently being promoted in Brazil and other tropical countries in response to domestic and international concern over the ecological and economic sustainability of harvesting natural tropical forests. RIL systems are necessary, but not sufficient, for sustainable forest management because they reduce damage to the forest...

  1. Diameter sensors for tree-length harvesting systems

    Treesearch

    T.P. McDonald; Robert B. Rummer; T.E. Grift

    2003-01-01

    Most cut-to-length (CTL) harvesters provide sensors for measuring diameter of trees as they are cut and processed. Among other uses, this capability provides a data collection tool for marketing of logs in real time. Logs can be sorted and stacked based on up-to-date market information, then transportation systems optimized to route wood to proper destinations at...

  2. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
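    A minimal sketch of the store-with-checksum / verify-on-read scheme described above, using CRC32 as the checksum; the `ChunkStore` class is invented for illustration and is not the PLFS implementation.

```python
# Minimal sketch of per-chunk checksumming: a client checksums each data
# chunk before storing it, and the checksum is re-evaluated when the chunk
# is read back to verify integrity.
import zlib

class ChunkStore:
    def __init__(self):
        self._chunks = {}  # offset -> (data, checksum); stand-in for storage

    def write(self, offset, data: bytes):
        # Store the chunk together with its checksum, as in the shared object.
        self._chunks[offset] = (data, zlib.crc32(data))

    def read(self, offset) -> bytes:
        # Re-evaluate the checksum on read to detect corruption.
        data, stored = self._chunks[offset]
        if zlib.crc32(data) != stored:
            raise IOError(f"checksum mismatch for chunk at offset {offset}")
        return data

store = ChunkStore()
store.write(0, b"log-structured chunk")
assert store.read(0) == b"log-structured chunk"
```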

  3. Chapter 12: The variable-density thinning study at Stanislaus-Tuolumne Experimental Forest

    Treesearch

    E. Knapp; M. North; M. Benech; B. Estes

    2012-01-01

    Prior to historical logging and fire suppression, forests of the Sierra Nevada were extremely heterogeneous. Frequent low- to moderate-intensity fire was partly responsible for this heterogeneity, which in turn helped make forests resilient to high-severity stand-replacing events. Early observers of forests on the west slope of the Sierra Nevada noted the...

  4. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  5. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  6. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  7. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  8. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  9. Longterm Hydroxychloroquine Therapy and Low-dose Aspirin May Have an Additive Effectiveness in the Primary Prevention of Cardiovascular Events in Patients with Systemic Lupus Erythematosus.

    PubMed

    Fasano, Serena; Pierro, Luciana; Pantano, Ilenia; Iudici, Michele; Valentini, Gabriele

    2017-07-01

    Systemic lupus erythematosus (SLE) is associated with an increased risk of cardiovascular disease (CVD). Thromboprophylaxis with low-dose aspirin (ASA) and hydroxychloroquine (HCQ) seems promising in SLE. We investigated the effects of HCQ cumulative dosages (c-HCQ) and the possible synergistic efficacy of ASA and HCQ in preventing a first CV event (CVE) in patients with SLE. Patients consecutively admitted to our center who, at admission, satisfied the 1997 American College of Rheumatology and/or 2012 Systemic Lupus Collaborating Clinics classification criteria for SLE, and had not experienced any CVE, were enrolled. The occurrence of a thrombotic event, use of ASA, and c-HCQ were recorded. Kaplan-Meier analysis was performed to determine the c-HCQ associated with a lower incidence of CVE. Cox regression analysis served to identify factors associated with a first CVE. For the study, 189 patients with SLE were enrolled and monitored for a median of 13 years. Ten CVEs occurred during followup. In Kaplan-Meier analysis, the CVE-free rate was higher in ASA-treated patients administered a c-HCQ > 600 g (standard HCQ dose for at least 5 yrs) than in patients receiving ASA alone, or with a c-HCQ dose < 600 g (log-rank test chi-square = 4.01, p = 0.04). Multivariate analysis showed that antimalarials plus ASA protected against thrombosis (HR 0.041 and HR 0.047, respectively), while antiphospholipid antibodies (HR 17.965) and hypertension (HR 18.054) increased the risk of a first CVE. Our results suggest that prolonged use of HCQ plus ASA is thromboprotective in SLE and provides additional evidence for its continued use in patients with SLE.

  10. Assessing Impacts of Selective Logging on Water, Energy, and Carbon Fluxes in Amazon Forests Using the Functionally Assembled Terrestrial Ecosystem Simulator (FATES)

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Huang, M.; Keller, M. M.; Longo, M.; Knox, R. G.; Koven, C.; Fisher, R.

    2016-12-01

    As a key component of the climate system, old-growth tropical forests act as carbon sinks that remove CO2 from the atmosphere. However, these forests can easily turn into C sources when disturbed. In fact, over half of tropical forests have been cleared or logged, and almost half of standing primary tropical forests are designated for timber production. Existing literature suggests that timber harvests alone could contribute up to 25% as much C loss as deforestation in the Amazon. Yet the spatial extent and recovery trajectory of disturbed forests in a changing climate are highly uncertain. This study constitutes our first attempt to quantify the impacts of selective logging on water, energy, and carbon budgets in Amazon forests using the Functionally Assembled Terrestrial Ecosystem Simulator (FATES). The Community Land Model version 4.5 (CLM4.5), with and without FATES enabled, is configured to run at two flux towers established in the Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA). One tower is located in an old-growth forest (KM67) and the other in a selectively logged site (KM83). The three CLM4.5 options, (1) Satellite Phenology (CLM4.5-SP), (2) Century-based biogeochemical cycling with prognostic phenology (CLM4.5-BGC), and (3) CLM4.5-FATES, are each spun up to equilibrium by recycling the observed meteorology at the towers. The simulated fluxes (sensible heat, latent heat, and net ecosystem exchange) are then compared to observations at KM67 to evaluate the models' ability to capture water and carbon dynamics in old-growth tropical forests. Our results suggest that all three models perform reasonably well in capturing the fluxes, but demographic features simulated by FATES, such as distributions of diameter at breast height (DBH) and stem density (SD), are skewed heavily toward extremely large trees (e.g., > 100 cm in DBH) when compared to site surveys at the forest plots. Efforts are underway to evaluate parametric sensitivity in FATES to improve simulations in old-growth forests, to implement parameterizations representing pulse disturbances to carbon pools created by logging events at different intensities, and to represent follow-up recovery, which is closely tied to gap-phase regeneration and competition for light within the gaps.

  11. The origins of multifractality in financial time series and the effect of extreme events

    NASA Astrophysics Data System (ADS)

    Green, Elena; Hanan, William; Heffernan, Daniel

    2014-06-01

    This paper presents the results of multifractal testing of two sets of financial data: daily data of the Dow Jones Industrial Average (DJIA) index and one-minute data of the Euro Stoxx 50 index. Where multifractal scaling is found, the spectrum of scaling exponents is calculated via Multifractal Detrended Fluctuation Analysis. In both cases, further investigation reveals that the temporal correlations in the data are a more significant source of the multifractal scaling than are the distributions of the returns. It is also shown that the extreme events which make up the heavy tails of the distribution of the Euro Stoxx 50 log returns distort the scaling in the data set. The most extreme events are inimical to the scaling regime. This result is in contrast to previous findings that extreme events contribute to multifractality.
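
    The standard Multifractal Detrended Fluctuation Analysis recipe referenced here (profile, piecewise polynomial detrending, q-th order fluctuation functions) can be sketched in a few lines of numpy. This is a generic illustration, not the authors' implementation, and for brevity it segments the profile from one end only:

```python
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Multifractal DFA: returns h(q), the generalized Hurst exponents.

    x      : 1-D series (e.g. log returns)
    scales : segment sizes s
    qs     : moment orders q
    """
    y = np.cumsum(x - np.mean(x))                  # profile of the series
    h = []
    for q in qs:
        logF = []
        for s in scales:
            n_seg = len(y) // s
            f2 = []
            for v in range(n_seg):                 # local detrended variance per segment
                seg = y[v*s:(v+1)*s]
                t = np.arange(s)
                coef = np.polyfit(t, seg, order)   # order-1: linear local trend
                f2.append(np.mean((seg - np.polyval(coef, t))**2))
            f2 = np.asarray(f2)
            if q == 0:                             # logarithmic average for q = 0
                logF.append(0.5 * np.mean(np.log(f2)))
            else:
                logF.append(np.log(np.mean(f2**(q / 2.0))) / q)
        # h(q) is the slope of log Fq(s) versus log s
        h.append(np.polyfit(np.log(scales), logF, 1)[0])
    return np.asarray(h)
```

For uncorrelated Gaussian noise, h(2) is close to 0.5; multifractality shows up as h(q) varying with q.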

  12. Sonic logging for detecting the excavation disturbed and fracture zones

    NASA Astrophysics Data System (ADS)

    Lin, Y. C.; Chang, Y. F.; Liu, J. W.; Tseng, C. W.

    2017-12-01

    This study presents a new sonic logging method to detect the excavation disturbed zone (EDZ) and fracture zones in a tunnel. The EDZ is a weak rock zone whose properties and conditions have been changed by excavation, resulting in fracturing, stress redistribution and desaturation. The EDZ is therefore considered physically less stable and could form a continuous, highly permeable pathway for groundwater flow. Since the EDZ and fracture zones can affect the safety of underground openings and repository performance, many studies have characterized them by different methods, such as rock mass displacement and strain measurements, seismic refraction surveys, seismic tomography and hydraulic tests. In this study, we designed a new sonic logging method to explore the EDZ and fracture zones in a tunnel in eastern Taiwan. A high-power, high-frequency sonic system was set up, consisting of a two-hydrophone pitch-catch arrangement with a common offset, immersed in water-filled uncased wells and producing a 20 kHz signal to scan the well rock. Four dominant sonic events were observed in the measurements: refracted P- and S-waves along the well rock, the direct water wave, and the reverberation in the well water. The measured P- and S-wave velocities, the signal-to-noise ratio of the refractions, and the amplitudes of the reverberation along the well rock were used as indexes to delineate the EDZ and fracture zones. Comparison of these indexes with core samples shows that significant changes in the indexes are consistent with the EDZ and fracture zones, demonstrating that both can be detected conclusively by this new sonic method.

  13. Life cycle performances of log wood applied for soil bioengineering constructions

    NASA Astrophysics Data System (ADS)

    Kalny, Gerda; Strauss-Sieberth, Alexandra; Strauss, Alfred; Rauch, Hans Peter

    2016-04-01

    Nowadays there is high demand for engineering solutions that consider not only technical aspects but also ecological and aesthetic values. Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions. Soil bioengineering solutions are based on the application of living plants and other auxiliary materials, including log wood. This kind of construction material supports the soil bioengineering system until the plants, as living construction material, take over the stability function. It is therefore important to understand the durability and degradation process of the wooden logs in order to retain the integral performance of a soil bioengineering system. These aspects are considered within the framework of the interdisciplinary research project "ELWIRA: Plants, wood, steel and concrete - life cycle performances as construction materials". Field investigations were conducted on soil bioengineering construction material, specifically European larch logs, from different soil bioengineering structures at the river Wien. The drilling resistance of selected logs, as a parameter for particular material characteristics, was measured and analysed with a Rinntech Resistograph instrument at different positions on the wooden logs, covering three different exposures: fully surrounded by air, with earth contact on one side, and near the water surface in wet-dry conditions. The age of the logs ranges from one year up to 20 years. Results show the progression of drilling resistance throughout the whole cross section as an indicator for assessing soil bioengineering construction material. Logs surrounded by air showed a higher drilling resistance than logs with earth contact and those exposed to wet-dry conditions. The functional capability of the wooden logs was then analysed and discussed in terms of different levels of degradation. The results contribute to sustainable and resource-conserving handling of building materials in the construction and maintenance of soil bioengineering structures.

  14. Trend in frequency of extreme precipitation events over Ontario from ensembles of multiple GCMs

    NASA Astrophysics Data System (ADS)

    Deng, Ziwang; Qiu, Xin; Liu, Jinliang; Madras, Neal; Wang, Xiaogang; Zhu, Huaiping

    2016-05-01

    As one of the most important types of extreme weather event, extreme precipitation events have significant impacts on the human and natural environment. This study assesses the projected long-term trends in the frequency of extreme precipitation events, represented by heavy precipitation days, very heavy precipitation days, very wet days and extreme wet days, over Ontario, based on results of 21 CMIP3 GCM runs. To achieve this goal, first, all model data are linearly interpolated onto 682 grid points (0.45° × 0.45°) in Ontario. Next, biases in model daily precipitation amount are corrected with a local intensity scaling method so that the total wet days and total wet-day precipitation from each of the GCMs are consistent with those from the Climate Forecast System Reanalysis data, and the four indices are then estimated for each of the 21 GCM runs for 1968-2000, 2046-2065 and 2081-2100. After that, with the assumption that the rate parameter of the Poisson process for the occurrence of extreme precipitation events may vary with time as the climate changes, a Poisson regression model that expresses the log rate as a linear function of time is used to detect the trend in frequency of extreme events in the GCM simulations. Finally, the trends and their uncertainty are estimated. The results show that in the twenty-first century, annual heavy precipitation days, very heavy precipitation days, very wet days and extreme wet days are likely to increase significantly over major parts of Ontario; in particular, heavy precipitation days and very wet days are very likely to increase significantly in some sub-regions of eastern Ontario. However, trends in the seasonal indices are not significant.
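
    A Poisson regression of the kind described, with log λ(t) = a + b·t, can be fit by Newton-Raphson on the log-likelihood. The sketch below is a generic illustration with invented counts, not the study's model code; a positive fitted b indicates an increasing event frequency:

```python
import math

def poisson_trend(years, counts, iters=50):
    """Fit log(rate_t) = a + b*t to yearly event counts by maximum likelihood.

    Returns (a, b); b > 0 indicates an increasing trend in event frequency.
    """
    # Centre time to keep the Newton iteration well conditioned.
    t0 = sum(years) / len(years)
    ts = [t - t0 for t in years]
    a, b = math.log(max(sum(counts) / len(counts), 1e-9)), 0.0
    for _ in range(iters):
        mu = [math.exp(a + b * t) for t in ts]
        ga = sum(y - m for y, m in zip(counts, mu))                # dL/da
        gb = sum((y - m) * t for y, m, t in zip(counts, mu, ts))   # dL/db
        haa = -sum(mu)                                             # Hessian entries
        hab = -sum(m * t for m, t in zip(mu, ts))
        hbb = -sum(m * t * t for m, t in zip(mu, ts))
        det = haa * hbb - hab * hab
        # Newton step: theta <- theta - H^{-1} g
        a -= (hbb * ga - hab * gb) / det
        b -= (-hab * ga + haa * gb) / det
    return a - b * t0, b   # shift intercept back to the original time axis
```

At the maximum-likelihood solution the fitted rates reproduce the total observed count, which is a useful convergence check.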

  15. 47 CFR 73.1820 - Station log.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the... functions may be utilized to record entries in the station log Provided: (1) The recording devices do not...

  16. The 1993 Mississippi river flood: A one hundred or a one thousand year event?

    USGS Publications Warehouse

    Malamud, B.D.; Turcotte, D.L.; Barton, C.C.

    1996-01-01

    Power-law (fractal) extreme-value statistics are applicable to many natural phenomena under a wide variety of circumstances. Data from a hydrologic station in Keokuk, Iowa, show that the great flood of the Mississippi River in 1993 has a recurrence interval on the order of 100 years using power-law statistics applied to a partial-duration flood series, and on the order of 1,000 years using a log-Pearson type 3 (LP3) distribution applied to an annual series. The LP3 analysis is the federally adopted probability distribution for flood-frequency estimation of extreme events. We suggest that power-law statistics are preferable to LP3 analysis. As a further test of the power-law approach we consider paleoflood data from the Colorado River. We compare power-law and LP3 extrapolations of historical data with these paleofloods. The results are remarkably similar to those obtained for the Mississippi River: recurrence intervals from power-law statistics applied to Lees Ferry discharge data are generally consistent with inferred 100- and 1,000-year paleofloods, whereas LP3 analysis gives recurrence intervals that are orders of magnitude longer. For both the Keokuk and Lees Ferry gauges, the use of an annual series introduces an artificial curvature in log-log space that leads to an underestimate of severe floods. Power-law statistics predict much shorter recurrence intervals than the federally adopted LP3 statistics. We suggest that if power-law behavior is applicable, then the likelihood of severe floods is much higher, and more conservative dam designs and land-use restrictions may be required.
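
    The power-law treatment of a partial-duration series amounts to fitting Q = C·T^α in log-log space, where T is the recurrence interval assigned to each ranked flood. The sketch below is a schematic illustration using a simple rank-based plotting position (T = record length / rank), not the authors' exact procedure:

```python
import math

def power_law_recurrence(peaks, n_years):
    """Fit Q = C * T**alpha to a partial-duration flood series.

    peaks   : flood discharges observed over the record
    n_years : record length in years
    Returns (C, alpha); the T-year flood is then estimated as C * T**alpha.
    """
    ranked = sorted(peaks, reverse=True)
    # Rank-based plotting position: largest flood gets T = n_years / 1, etc.
    pts = [(n_years / (i + 1), q) for i, q in enumerate(ranked)]
    xs = [math.log(t) for t, _ in pts]
    ys = [math.log(q) for _, q in pts]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    # Ordinary least squares in log-log space gives the power-law exponent.
    alpha = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    C = math.exp(ybar - alpha * xbar)
    return C, alpha
```

Extrapolating along the fitted line (rather than along a curved LP3 quantile function) is what yields the shorter recurrence intervals for severe floods described above.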

  17. Logging concessions enable illegal logging crisis in the Peruvian Amazon.

    PubMed

    Finer, Matt; Jenkins, Clinton N; Sky, Melissa A Blue; Pine, Justin

    2014-04-17

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  18. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    PubMed Central

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-01-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US–Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552

  19. Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon

    NASA Astrophysics Data System (ADS)

    Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin

    2014-04-01

    The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.

  20. Comparison of adsorption coefficient (K(oc)) for soils and HPLC retention factors of aromatic hydrocarbons using a chemically immobilized humic acid column in RP-HPLC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szabo, G.; Bulman, R.A.

    The determination of soil adsorption coefficients (K(oc)) via HPLC capacity factors (k') has been studied, including the effect of column type and mobile phase composition on the correlation between log K(oc) and log k'. K(oc) values obtained by procedures other than HPLC correlate well with HPLC capacity factors determined on a chemically immobilized humic acid stationary phase, and it is suggested that this phase is a better model for sorption onto soil or sediment than the octadecyl-, phenyl- and ethylsilica phases. By using log k'(w), a theoretical capacity factor has been obtained by extrapolation of the retention data in a binary solvent system to pure aqueous eluent. There is a better correlation between log K(oc) and log k'(w) than between log K(oc) and log k'.

  1. Current and future trends in marine image annotation software

    NASA Astrophysics Data System (ADS)

    Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.

    2016-12-01

    Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level - may involve human interaction and computer-assisted solutions. Marine image annotation software (MIAS) has enabled over 500 publications to date. We review functioning, application trends and developments by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input are basically a graphical user interface with a video player or image browser that recognizes a specific time code or image code, allowing users to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software in their capability to integrate data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS have three main characteristics: annotating events in real time, annotating after acquisition, and interacting with a database. These range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow input and display of data from multiple sensors or multiple annotators via intranet or internet. Post-acquisition human-mediated annotation often includes tools for data display and image analysis, e.g. length, area, image segmentation and point counts, and in a few cases the possibility of browsing and editing previous dive logs or analyzing the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data. Progress in the field of automated annotation is mostly in post-processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated processes of pixel recognition through computer-vision modules that compile expert-based knowledge. Important topics aiding the choice of a specific software are outlined, the ideal software is discussed, and future trends are presented.

  2. Phase II Trials for Heterogeneous Patient Populations with a Time-to-Event Endpoint.

    PubMed

    Jung, Sin-Ho

    2017-07-01

    In this paper, we consider a single-arm phase II trial with a time-to-event endpoint. We assume that the study population comprises multiple subpopulations with different prognoses, but that the study treatment is expected to be similarly efficacious across the subpopulations. We review a stratified one-sample log-rank test and present its sample size calculation method under some practical design settings. Our sample size method requires specification of the prevalence of the subpopulations. We observe that the power achieved with the resulting sample size is not very sensitive to misspecification of the prevalence.
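
    The one-sample log-rank statistic underlying this design compares observed events with those expected under a reference cumulative hazard; the stratified version sums the same observed-minus-expected quantities within each stratum. The sketch below is a generic illustration (the exponential null in the test is a hypothetical example), not the paper's sample size method:

```python
import math

def one_sample_logrank(times, events, null_cumhaz):
    """One-sample log-rank Z statistic against a reference survival model.

    times       : follow-up times
    events      : 1 = event observed, 0 = censored
    null_cumhaz : function t -> Lambda0(t), the null cumulative hazard
    Under the null, Z is approximately N(0,1); Z < 0 suggests fewer
    events than expected (i.e. better-than-reference survival).
    """
    observed = sum(events)
    expected = sum(null_cumhaz(t) for t in times)   # sum of Lambda0 at follow-up times
    return (observed - expected) / math.sqrt(expected)
```

A stratified test would compute `observed - expected` and `expected` per stratum and pool them before standardizing.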

  3. Carrier Mediated Distribution System (CAMDIS): a new approach for the measurement of octanol/water distribution coefficients.

    PubMed

    Wagner, Bjoern; Fischer, Holger; Kansy, Manfred; Seelig, Anna; Assmus, Frauke

    2015-02-20

    Here we present a miniaturized assay, referred to as Carrier-Mediated Distribution System (CAMDIS), for fast and reliable measurement of octanol/water distribution coefficients, log D(oct). By introducing a filter support for octanol, phase separation from water is facilitated and the tendency of emulsion formation at the interface is reduced. A guideline for the best practice of CAMDIS is given, describing a strategy to manage drug adsorption at the filter-supported octanol/buffer interface. We validated the assay on a set of 52 structurally diverse drugs with known shake-flask log D(oct) values. Excellent agreement with literature data (r(2) = 0.996, standard error of estimate, SEE = 0.111), high reproducibility (standard deviation, SD < 0.1 log D(oct) units), minimal sample consumption (10 μL of 100 μM DMSO stock solution) and a broad analytical range (log D(oct) range = -0.5 to 4.2) make CAMDIS a valuable tool for the high-throughput assessment of log D(oct). Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Decision support using anesthesia information management system records and accreditation council for graduate medical education case logs for resident operating room assignments.

    PubMed

    Wanderer, Jonathan P; Charnin, Jonathan; Driscoll, William D; Bailin, Michael T; Baker, Keith

    2013-08-01

    Our goal in this study was to develop decision support systems for resident operating room (OR) assignments using anesthesia information management system (AIMS) records and Accreditation Council for Graduate Medical Education (ACGME) case logs and evaluate the implementations. We developed 2 Web-based systems: an ACGME case-log visualization tool, and Residents Helping in Navigating OR Scheduling (Rhinos), an interactive system that solicits OR assignment requests from residents and creates resident profiles. Resident profiles are snapshots of the cases and procedures each resident has done and were derived from AIMS records and ACGME case logs. A Rhinos pilot was performed for 6 weeks on 2 clinical services. One hundred sixty-five requests were entered and used in OR assignment decisions by a single attending anesthesiologist. Each request consisted of a rank ordered list of up to 3 ORs. Residents had access to detailed information about these cases including surgeon and patient name, age, procedure type, and admission status. Success rates at matching resident requests were determined by comparing requests with AIMS records. Of the 165 requests, 87 first-choice matches (52.7%), 27 second-choice matches (16.4%), and 8 third-choice matches (4.8%) were made. Forty-three requests were unmatched (26.1%). Thirty-nine first-choice requests overlapped (23.6%). Full implementation followed on 8 clinical services for 8 weeks. Seven hundred fifty-four requests were reviewed by 15 attending anesthesiologists, with 339 first-choice matches (45.0%), 122 second-choice matches (16.2%), 55 third-choice matches (7.3%), and 238 unmatched (31.5%). There were 279 overlapping first-choice requests (37.0%). The overall combined match success rate was 69.4%. Separately, we developed an ACGME case-log visualization tool that allows individual resident experiences to be compared against case minimums as well as resident peer groups. 
We conclude that it is feasible to use ACGME case-log data in decision support systems for informing resident OR assignments. Additional analysis will be necessary to assess the educational impact of these systems.
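
    The ranked-choice matching evaluated in this record can be sketched as a simple greedy matcher over first, second and third choices. This is a hypothetical illustration (resident names, room labels and the capacity parameter are invented), not the Rhinos implementation, which kept attending anesthesiologists in the decision loop:

```python
def match_requests(requests, capacity=1):
    """Greedy first/second/third-choice matching of OR assignment requests.

    requests : dict resident -> ranked list of ORs (up to 3 choices)
    capacity : number of residents an OR can absorb
    Returns (assignment dict, match counts as [1st, 2nd, 3rd, unmatched]).
    """
    load = {}          # residents assigned to each OR so far
    assignment = {}
    counts = [0, 0, 0, 0]
    for resident, choices in requests.items():
        for rank, room in enumerate(choices[:3]):
            if load.get(room, 0) < capacity:
                load[room] = load.get(room, 0) + 1
                assignment[resident] = room
                counts[rank] += 1
                break
        else:
            counts[3] += 1   # no requested OR had room left
    return assignment, counts
```

Because the matcher is order-dependent, the overlap statistics reported above (competing first-choice requests) directly limit the achievable first-choice match rate.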

  5. Generation of Vulcanian activity and long-period seismicity at Volcán de Colima, Mexico

    NASA Astrophysics Data System (ADS)

    Varley, Nick; Arámbula-Mendoza, Raúl; Reyes-Dávila, Gabriel; Sanderson, Richard; Stevenson, John

    2010-12-01

    During the current episode, which commenced in 1998, activity at Volcán de Colima has been characterised by daily Vulcanian events, several effusive phases and a number of larger dome-destroying explosions. The upper edifice comprises an intricate array of fractures, and within this system variations in magma ascent rate, rheology and volatile content combine to control the style of activity. Subtle variations in one or more of these factors can trigger a transition. A model is presented of the Vulcanian explosion mechanism, which is reflected in the associated seismicity: first the breaching of an impermeable cap and the initial gas loss after the rupture (low-frequency signal), followed by fragmentation (high-frequency signal). In 2005, a series of larger Vulcanian explosions associated with ascending magma represented the period of activity with the highest production rate in recent years. Pyroclastic flows were produced by column collapse, with the absence of vesicularity amongst the products pointing to a deep source of gas driving the eruption. The appearance of swarms of long-period (LP) events associated with the explosions provided a great opportunity for analysis and an insight into the processes within the fracture system which control the eruptive style. Cross-correlation of the LP waveforms produced a series of ten families which reappeared in subsequent swarms, suggesting a consistent source. Brittle fracture associated with the enhanced strain rate found along the conduit margins is suggested as the source of this seismicity. This is supported by a linear relationship between log event rate and the relative amplitude of the events within each swarm. An analysis of the temporal distribution of LP events revealed a variation in conditions between swarms. For some swarms, the conditions within the fracture system revealed a conflict between the processes associated with magma ascent, whilst for others it was a situation of failure and backup, which can be interpreted as the failing and activation of different fractures within the upper edifice.

  6. Conversion of Questionnaire Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powell, Danny H; Elwood Jr, Robert H

    During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness, so the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near-zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log-normal scale typically used in human error analysis (see A.D. Swain and H.E. Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk-of-failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data are absent.
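
    The conversion described above can be sketched as a lookup on a log-spaced scale feeding AND/OR fault tree gates. The probability values below are invented for illustration (roughly decade steps, in the spirit of log-normal human-error scales); the actual survey mapping is not reproduced here:

```python
# Hypothetical adjectival-rating -> failure-probability mapping on a log
# (decade-step) scale. These numbers are illustrative, not the report's.
RATING_TO_PFAIL = {
    "perfect":           1e-4,
    "well":              1e-3,
    "adequate":          1e-2,
    "needs improvement": 1e-1,
    "not performed":     1.0,   # a task not being done is in a failed state
}

def and_gate(probs):
    """AND gate: fails only if every input event fails (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: fails if any input event fails (independence assumed)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Example: two redundant checks backing one detection function,
# combined with a single accounting task at the system level.
p_redundant = and_gate([RATING_TO_PFAIL["adequate"], RATING_TO_PFAIL["well"]])
p_system = or_gate([p_redundant, RATING_TO_PFAIL["well"]])
```

Propagating these basic-event probabilities up the tree yields the numeric system-effectiveness estimate the PRA requires.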

  7. Logging while fishing technique results in substantial savings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tollefsen, E.; Everett, M.

    1996-12-01

    During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1.5 to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, the associated wiper trip, and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.

  8. Automatically Log Off Upon Disappearance of Facial Image

    DTIC Science & Technology

    2005-03-01

    log off a PC when the user’s face disappears for an adjustable time interval. Among the fundamental technologies of biometrics, facial recognition is... facial recognition products. In this report, a brief overview of face detection technologies is provided. The particular neural network-based face...ensure that the user logging onto the system is the same person. Among the fundamental technologies of biometrics, facial recognition is the only

  9. Seasonal pathogen removal by alternative on-site wastewater treatment systems.

    PubMed

    Pundsack, J; Axler, R; Hicks, R; Henneck, J; Nordman, D; McCarthy, B

    2001-01-01

    Subsurface-flow constructed wetlands, sand filters, and peat filters near Duluth, Minnesota, were studied to determine their seasonal performance for removing pathogens from wastewater. Influent was a high-strength septic tank effluent (mean values of 5-day biochemical oxygen demand, total nitrogen, and total phosphorus were 294, 96, and 15 mg/L, respectively) at the Natural Resources Research Institute's alternative treatment system test facility in northern Minnesota. Each treatment system was inoculated with cultures of Salmonella choleraesuis (serotype typhimurium) for 5 to 7 consecutive days in summer and winter during 1998 to 1999. After the seeding, outflow samples were taken until Salmonella counts were sustained at background levels. The removal of Salmonella was calculated for each system, although the exact removal mechanisms were not determined. During the summer, the wetlands removed 99.6 to 99.9994% (2.4 to 5.3 log10 reduction) of the culturable Salmonella. The sand filters demonstrated a greater than 7 log10 removal of Salmonella cells, whereas the peat filters were responsible for a greater than 8 log10 loss of cells. Fewer Salmonella cells were removed by all of these systems during the winter, although the pattern of removal was similar to their summer operation. During the winter, the wetlands and sand filters removed greater than 1 log10 of culturable cells, but the peat filters were responsible for a greater than 5 log10 loss of cells. Fecal coliform removal patterns reflected those for Salmonella by treatment system for summer and winter periods. Based on Salmonella and fecal coliform removal, the peat filters operated most effectively, followed by the sand filters and the constructed wetlands.
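
    The log10 reductions quoted in studies like this convert directly to percent removal. A small helper (a generic sketch, not the study's code) makes the relationship explicit:

```python
import math

def log10_removal(influent, effluent):
    """Log-reduction value: log10(influent count / effluent count)."""
    return math.log10(influent / effluent)

def percent_removed(log_reduction):
    """Convert a log-reduction value back to a percentage removed.

    E.g. a 2-log reduction corresponds to 99% removal.
    """
    return 100.0 * (1.0 - 10.0 ** (-log_reduction))
```

So a 2.4-log reduction corresponds to about 99.6% removal, matching the wetland figures reported above.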

  10. WinHPC System User Basics | High-Performance Computing | NREL

    Science.gov Websites

    guidance for starting to use this high-performance computing (HPC) system at NREL. Also see WinHPC policies ... when you are finished. Simply quitting Remote Desktop will keep your session active and using resources ... 2. Log in with your NREL.gov username/password. Remember to log out when finished. Mac 1. If you

  11. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  12. Infrared Drying as a Potential Alternative to Convective Drying for Biltong Production.

    PubMed

    Cherono, Kipchumba; Mwithiga, Gikuru; Schmidt, Stefan

    2016-06-03

    Two infrared systems set at an intensity of 4777 W/m2 with peak emission wavelengths of 2.5 and 3.5 µm were used to produce biltong by drying differently pre-treated meat. In addition to meat texture and colour, the microbial quality of the biltong produced was assessed by quantifying viable heterotrophic microorganisms using a most probable number (MPN) method and by verifying the presence of presumptive Escherichia coli in samples produced using infrared and conventional convective drying. The two infrared drying systems reduced the heterotrophic microbial burden from 5.11 log10 MPN/g to 2.89 log10 MPN/g (2.5 µm) and 3.42 log10 MPN/g (3.5 µm), respectively. The infrared systems achieved a reduction up to one log10 MPN/g greater than the convective system. In biltong samples produced by short-wavelength (2.5 µm) infrared drying, E. coli was not detectable. This study demonstrates that short-wavelength infrared drying is a potential alternative to conventional convective drying, improving the microbiological quality of biltong while delivering products of satisfactory quality.

  13. Practical life log video indexing based on content and context

    NASA Astrophysics Data System (ADS)

    Tancharoen, Datchakorn; Yamasaki, Toshihiko; Aizawa, Kiyoharu

    2006-01-01

    Today, multimedia information has gained an important role in daily life, and people can use imaging devices to capture their visual experiences. In this paper, we present our personal Life Log system to record personal experiences in the form of wearable video and environmental data; in addition, an efficient retrieval system is demonstrated to recall the desired media. We summarize practical video indexing techniques based on Life Log content and context to detect talking scenes using audio/visual cues and semantic key frames from GPS data. Voice annotation is also demonstrated as a practical indexing method. Moreover, we apply body media sensors to record continuous lifestyle data and use body media data to index the semantic key frames. In the experiments, we demonstrate various video indexing results that provide semantic content and show Life Log visualizations for examining personal life effectively.

  14. Health information and communication system for emergency management in a developing country, Iran.

    PubMed

    Seyedin, Seyed Hesam; Jamali, Hamid R

    2011-08-01

    Disasters are fortunately rare occurrences. However, accurate and timely information and communication are vital to adequately prepare individual health organizations for such events. The current article investigates the health-related communication and information systems for emergency management in Iran. A mixed qualitative and quantitative methodology was used in this study. A sample of 230 health service managers was surveyed using a questionnaire, and 65 semi-structured interviews were also conducted with public health and therapeutic affairs managers who were responsible for emergency management. A range of problems was identified, including fragmentation of information, lack of local databases, lack of a clear information strategy and lack of a formal system for logging disaster-related information at the regional or local level. Recommendations were made for improving the national emergency management information and communication system. The findings have implications for health organizations in developing and developed countries, especially in the Middle East. Creating disaster-related information databases, creating protocols and standards, setting an information strategy, training staff and hosting a center for the information system in the Ministry of Health to centrally manage and share the data could improve the current information system.

  15. UNIX security in a supercomputing environment

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1989-01-01

    The author critiques some security mechanisms in most versions of the Unix operating system and suggests more effective tools that either have working prototypes or have been implemented, for example in secure Unix systems. Although no computer (not even a secure one) is impenetrable, breaking into systems with these alternate mechanisms will cost more, require more skill, and be more easily detected than penetrations of systems without these mechanisms. The mechanisms described fall into four classes (with considerable overlap). User authentication at the local host affirms the identity of the person using the computer. The principle of least privilege dictates that properly authenticated users should have rights precisely sufficient to perform their tasks, and system administration functions should be compartmentalized; to this end, access control lists or capabilities should either replace or augment the default Unix protection system, and mandatory access controls implementing multilevel security models and integrity mechanisms should be available. Since most users access supercomputing environments using networks, the third class of mechanisms augments authentication (where feasible). As no security is perfect, the fourth class of mechanism logs events that may indicate possible security violations; this will allow the reconstruction of a successful penetration (if discovered), or possibly the detection of an attempted penetration.

  16. An examination of scale of assessment, logging and ENSO-induced fires on butterfly diversity in Borneo.

    PubMed

    Cleary, Daniel F R

    2003-04-01

    The impact of disturbance on species diversity may be related to the spatial scales over which it occurs. Here I assess the impact of logging and ENSO (El Niño Southern Oscillation)-induced burning and forest isolation on the species richness (477 species out of more than 28,000 individuals) and community composition of butterflies and butterfly guilds using small (0.9 ha) plots nested within large (450 ha) landscapes. The landscapes were located in three habitat classes: (1) continuous, unburned forest; (2) unburned isolates surrounded by burned forest; and (3) burned forest. Plots with different logging histories were sampled within the two unburned habitat classes, allowing for independent assessment of the two disturbance factors (logging and burning). Disturbance within habitat classes (logging) had a very different impact on butterfly diversity than disturbance among habitat classes (due to ENSO-induced burning and isolation). Logging increased species richness, increased evenness, and lowered dominance. Among guilds based on larval food plants, the species richness of tree and herb specialists was higher in logged areas but their abundance was lower. Both generalist species richness and abundance were higher in logged areas. Among habitat classes, species richness was lower in burned forest and isolates than continuous forest but there was no overall difference in evenness or dominance. Among guilds, generalist species richness was significantly lower in burned forest and isolates than continuous forest. Generalist abundance was also very low in the isolates. There was no difference among disturbance classes in herb specialist species richness but abundance was significantly higher in the isolates and burned forest than in continuous forest. Tree specialist species richness was lower in burned forest than continuous forest but did not differ between continuous forest and isolates.
The scale of assessment proved important in estimating the impact of disturbance on species richness. Within disturbance classes, the difference in species richness between primary and logged forest was more pronounced at the smaller spatial scale. Among disturbance classes, the difference in species richness between continuous forest and isolates or burned forest was more pronounced at the larger spatial scale. The lower levels of species richness in ENSO-affected areas and at the larger (landscape) spatial scale indicate that future severe ENSO events may prove one of the most serious threats to extant biodiversity.

  17. Fatal injuries caused by logs rolling off trucks: Kentucky 1994-1998.

    PubMed

    Struttmann, T W; Scheerer, A L

    2001-02-01

    Logging is one of the most hazardous occupations and fatality rates are consistently among the highest of all industries. A review of fatalities caused by logs rolling off trucks is presented. The Kentucky Fatality Assessment and Control Evaluation Project is a statewide surveillance system for occupational fatalities. Investigations are conducted on selected injuries with an emphasis on prevention strategy development. Logging was an area of high priority for case investigation. During 1994-1998, we identified seven incidents in which a worker was killed by a log rolling off a truck at a sawmill, accounting for 15% of the 45 deaths related to logging activities. These cases were reviewed to identify similar characteristics and risk factors. Investigations led to recommendations for behavioral, administrative, and engineering controls. Potential interventions include limiting load height on trucks, installing unloading cages at sawmills and prohibiting overloaded trucks on public roadways. Copyright 2001 Wiley-Liss, Inc.

  18. What's new in well logging and formation evaluation

    USGS Publications Warehouse

    Prensky, S.

    2011-01-01

    A number of significant new developments are emerging in well logging and formation evaluation. Some of the new developments include an ultrasonic wireline imager, an electromagnetic free-point indicator, wired and fiber-optic coiled tubing systems, and extreme-temperature logging-while-drilling (LWD) tools. The continued consolidation of logging and petrophysical service providers in 2010 means that these innovations are increasingly being provided by a few large companies. Weatherford International has launched a slimhole cross-dipole tool as part of the company's line of compact logging tools. The 26-ft-long Compact Cross-Dipole Sonic (CXD) tool can be run as part of a quad-combo compact logging string. Halliburton has introduced a version of its circumferential acoustic scanning tool (CAST) that runs on monoconductor cable (CAST-M) to provide high-resolution images in open hole and in cased hole for casing and cement evaluation.

  19. Geophysical, stratigraphic, and flow-zone logs of selected test, monitor, and water-supply wells in Cayuga County, New York

    USGS Publications Warehouse

    Anderson, J. Alton; Williams, John H.; Eckhardt, David A.V.; Miller, Todd S.

    2003-01-01

    Volatile-organic compounds have been detected in water sampled from more than 50 supply wells between the City of Auburn and Village of Union Springs in Cayuga County, New York, and the area was declared a Superfund site in 2002. In 2001-04, geophysical logs were collected from 37 test, monitor, and water-supply wells as a preliminary part of the investigation of volatile-organic compound contamination in the carbonate-bedrock aquifer system. The geophysical logs included gamma, induction, caliper, wellbore image, deviation, fluid resistivity and temperature, and flowmeter. The geophysical logs were analyzed along with core samples and outcrops of the bedrock to define the stratigraphic units and flow zones penetrated by the wells. This report describes the logging methods used in the study and presents the geophysical, stratigraphic, and flow-zone logs.

  20. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administrating a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on-demand report generation. The main challenges for such a system are coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers on a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. With flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACLs). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.
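The Data Mining layer's job, stitching log lines scattered across servers into a single transaction context, can be sketched as a grouping pass keyed on a transaction ID. The field names and log records below are hypothetical, not CASTOR's actual log schema:

```python
from collections import defaultdict

# Hypothetical parsed log lines collected from several tape servers
log_lines = [
    {"tx": "req-001", "ts": 1, "host": "tps01", "msg": "mount requested"},
    {"tx": "req-002", "ts": 2, "host": "tps02", "msg": "mount requested"},
    {"tx": "req-001", "ts": 5, "host": "tps01", "msg": "positioning done"},
    {"tx": "req-001", "ts": 9, "host": "tps01", "msg": "transfer complete"},
    {"tx": "req-002", "ts": 7, "host": "tps02", "msg": "transfer complete"},
]

def build_transactions(lines):
    """Group log lines by transaction ID and derive per-transaction context."""
    groups = defaultdict(list)
    for line in lines:
        groups[line["tx"]].append(line)
    transactions = {}
    for tx, events in groups.items():
        events.sort(key=lambda e: e["ts"])  # chronological order within a transaction
        transactions[tx] = {
            "events": [e["msg"] for e in events],
            "duration": events[-1]["ts"] - events[0]["ts"],
        }
    return transactions

txs = build_transactions(log_lines)
print(txs["req-001"]["duration"])  # 8
```

Once lines are assembled per transaction, per-transaction metrics such as duration or event counts fall out directly, which is what makes on-demand reporting cheap.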

  1. The ALICE DAQ infoLogger

    NASA Astrophysics Data System (ADS)

    Chapeland, S.; Carena, F.; Carena, W.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Grigore, A.; Ionita, C.; Delort, C.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; Von Haller, B.; Alice Collaboration

    2014-04-01

    ALICE (A Large Ion Collider Experiment) is a heavy-ion experiment studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The ALICE DAQ (Data Acquisition System) is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches). The DAQ reads the data transferred from the detectors through 500 dedicated optical links at an aggregated and sustained rate of up to 10 gigabytes per second and stores at up to 2.5 gigabytes per second. The infoLogger is the log system which centrally collects the messages issued by the thousands of processes running on the DAQ machines. It allows errors to be reported on the fly and keeps a trace of runtime execution for later investigation. More than 500,000 messages are stored every day in a MySQL database, in a structured table recording 16 indexing fields for each message (e.g. time, host, user). The total amount of logs for 2012 exceeds 75 GB of data and 150 million rows. In this paper we present the architecture and implementation of this distributed logging system, consisting of a client programming API, local data collector processes, a central server, and interactive human interfaces. We review the operational experience during the 2012 run, in particular the actions taken to ensure shifters receive manageable and relevant content from the main log stream. Finally, we present the performance of this log system and future evolutions.
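Storing messages in a structured table with indexed fields, rather than as raw text, is what makes queries like "only errors, newest first" fast at this volume. A minimal sketch using SQLite as a stand-in for MySQL; the column names and sample rows are illustrative, not the infoLogger's actual 16-field schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE messages (
        ts       REAL,   -- timestamp
        host     TEXT,
        username TEXT,
        severity TEXT,
        facility TEXT,
        message  TEXT
    )
""")
# Index on the fields most queries filter/sort by
conn.execute("CREATE INDEX idx_host_ts ON messages(host, ts)")

rows = [
    (1000.0, "daq01", "shifter", "INFO",  "readout", "run 1234 started"),
    (1001.5, "daq02", "shifter", "ERROR", "hlt",     "link timeout on DDL 42"),
    (1002.0, "daq01", "shifter", "INFO",  "readout", "event rate nominal"),
]
conn.executemany("INSERT INTO messages VALUES (?, ?, ?, ?, ?, ?)", rows)

# Typical shifter query: only errors, newest first
errors = conn.execute(
    "SELECT ts, host, message FROM messages WHERE severity = 'ERROR' ORDER BY ts DESC"
).fetchall()
print(errors)  # [(1001.5, 'daq02', 'link timeout on DDL 42')]
```

Filtering at the database layer is also what allows the main log stream to be trimmed down to manageable, relevant content for shifters.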

  2. Expert systems for automated correlation and interpretation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    1994-01-01

    CORRELATOR is an interactive computer program for lithostratigraphic correlation of wireline logs, able to store correlations in a database with a consistency, accuracy, speed, and resolution that are difficult to obtain manually. The automatic determination of correlations is based on the maximization of a weighted correlation coefficient using two wireline logs per well. CORRELATOR has an expert system to scan and flag incongruous correlations in the database. The user has the option to accept or disregard the advice offered by the system. The expert system represents knowledge through production rules. The inference system is goal-driven and uses backward chaining to scan through the rules. Work in progress is used to illustrate the potential of a second expert system with a similar architecture for interpreting dip diagrams, which could identify episodes, such as those of interest in sequence stratigraphy and fault detection, and annotate them in the stratigraphic column. Several examples illustrate the presentation. © 1994 International Association for Mathematical Geology.
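The core of automated lithostratigraphic correlation, sliding an interval of one well's log against another and keeping the depth shift that maximizes a correlation coefficient, can be sketched as follows. This is a simplified single-log version, not CORRELATOR's two-log weighted coefficient, and the log values are synthetic:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def best_shift(reference, target, window):
    """Slide a window over the target log; return the depth shift maximizing correlation."""
    best = (None, -2.0)
    for s in range(len(target) - window + 1):
        r = pearson(reference[:window], target[s:s + window])
        if r > best[1]:
            best = (s, r)
    return best

# Synthetic gamma-ray log; the target repeats the signature 3 samples deeper
ref = [80, 82, 95, 120, 110, 90, 85, 84]
tgt = [70, 72, 71] + ref
shift, r = best_shift(ref, tgt, window=8)
print(shift, r)  # 3 1.0
```

A real system additionally weights the coefficient, combines two log types per well, and constrains shifts so correlations cannot cross each other; the maximization step itself is the same idea.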

  3. A comparison of different turbidite plays in the Yinggehai and Qiongdongnan Basins of the South China Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardwell, R.K.; Norris, J.W.

    1996-12-31

    Three different types of turbidite plays have been drilled in the Yinggehai and Qiongdongnan basins of the South China Sea: slope fan turbidites, bottomset turbidites, and channel fill turbidites. Each play type has a distinctive well log signature, lithology, seismic reflector geometry, and reservoir character. Slope fan turbidites are encountered in the YA 21-1-3 well. Well logs are characterized by a ratty SP curve, and mud logs indicate that the turbidites are composed of up to 80 m of sands and silts. Seismic profiles show that these turbidites are found in a distributary channel and levee system on the shelf. Bottomset turbidites are encountered in the LD 15-1-1 well. Well logs are characterized by an upward coarsening SP curve, and mud logs indicate that the turbidites are composed of up to 10 m of silty sand. Seismic profiles show these turbidites are deposited by the slumping of shelf sands during a continuous lowstand progradation. Channel fill turbidites are encountered in the LD 30-1-1 well. Well logs are characterized by a blocky SP curve, and mud logs indicate that the turbidites are composed of up to 100 m of massive sand. Seismic profiles show that these turbidites are associated with channel systems that trend parallel to the local basin axis. Distinct cut and fill geometries indicate that the turbidite sands were deposited in a preexisting channel cut.

  4. A comparison of different turbidite plays in the Yinggehai and Qiongdongnan Basins of the South China Sea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardwell, R.K.; Norris, J.W.

    1996-01-01

    Three different types of turbidite plays have been drilled in the Yinggehai and Qiongdongnan basins of the South China Sea: slope fan turbidites, bottomset turbidites, and channel fill turbidites. Each play type has a distinctive well log signature, lithology, seismic reflector geometry, and reservoir character. Slope fan turbidites are encountered in the YA 21-1-3 well. Well logs are characterized by a ratty SP curve, and mud logs indicate that the turbidites are composed of up to 80 m of sands and silts. Seismic profiles show that these turbidites are found in a distributary channel and levee system on the shelf. Bottomset turbidites are encountered in the LD 15-1-1 well. Well logs are characterized by an upward coarsening SP curve, and mud logs indicate that the turbidites are composed of up to 10 m of silty sand. Seismic profiles show these turbidites are deposited by the slumping of shelf sands during a continuous lowstand progradation. Channel fill turbidites are encountered in the LD 30-1-1 well. Well logs are characterized by a blocky SP curve, and mud logs indicate that the turbidites are composed of up to 100 m of massive sand. Seismic profiles show that these turbidites are associated with channel systems that trend parallel to the local basin axis. Distinct cut and fill geometries indicate that the turbidite sands were deposited in a preexisting channel cut.

  5. Unified picture of strong-coupling stochastic thermodynamics and time reversals

    NASA Astrophysics Data System (ADS)

    Aurell, Erik

    2018-04-01

    Strong-coupling statistical thermodynamics is formulated as the Hamiltonian dynamics of an observed system interacting with another unobserved system (a bath). It is shown that the entropy production functional of stochastic thermodynamics, defined as the log ratio of forward and backward system path probabilities, is in a one-to-one relation with the log ratios of the joint initial conditions of the system and the bath. A version of strong-coupling statistical thermodynamics where the system-bath interaction vanishes at the beginning and at the end of a process is, like weak-coupling stochastic thermodynamics, related to a bath initially in equilibrium by itself. The heat is then the change of bath energy over the process, and it is discussed when this heat is a functional of the system history alone. The version of strong-coupling statistical thermodynamics introduced by Seifert and Jarzynski is related to a bath initially in conditional equilibrium with respect to the system. This leads to heat as another functional of the system history, which needs to be determined by thermodynamic integration. The log ratio of forward and backward system path probabilities in a stochastic process is finally related to log ratios of the initial conditions of a combined system and bath. It is shown that the entropy production formulas of stochastic processes under a general class of time reversals are given by the differences of bath energies in a larger underlying Hamiltonian system. The paper highlights the centrality of time reversal in stochastic thermodynamics, also in the case of strong coupling.
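The central quantity here, entropy production as a log ratio of path probabilities, is conventionally written as follows. This is the standard textbook form from stochastic thermodynamics, not an equation reproduced from this particular paper:

```latex
\Delta S_{\mathrm{tot}}[x(\cdot)] \;=\; \ln \frac{P[x(\cdot)]}{\tilde{P}[\tilde{x}(\cdot)]}
```

where P[x(·)] is the probability of the forward path x(·) and the tilded quantities refer to the time-reversed path under the time-reversed dynamics. The abstract's result is that this functional is in one-to-one relation with log ratios of joint system-bath initial conditions.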

  6. Outcomes of cataract surgery with residents as primary surgeons in the Veterans Affairs Healthcare System.

    PubMed

    Payal, Abhishek R; Gonzalez-Gonzalez, Luis A; Chen, Xi; Cakiner-Egilmez, Tulay; Chomsky, Amy; Baze, Elizabeth; Vollman, David; Lawrence, Mary G; Daly, Mary K

    2016-03-01

    To explore visual outcomes, functional visual improvement, and events in resident-operated cataract surgery cases. Veterans Affairs Ophthalmic Surgery Outcomes Database Project across 5 Veterans Affairs Medical Centers. Retrospective data analysis of deidentified data. Cataract surgery cases with residents as primary surgeons were analyzed for logMAR corrected distance visual acuity (CDVA) and vision-related quality of life (VRQL) measured by the modified National Eye Institute Vision Function Questionnaire and 30 intraoperative and postoperative events. In some analyses, cases without events (Group A) were compared with cases with events (Group B). The study included 4221 cataract surgery cases. Preoperative to postoperative CDVA improved significantly in both groups (P < .0001), although the level of improvement was less in Group B (P = .03). A CDVA of 20/40 or better was achieved in 96.64% in Group A and 88.25% in Group B (P < .0001); however, Group B had a higher prevalence of preoperative ocular comorbidities (P < .0001). Cases with 1 or more events were associated with a higher likelihood of a postoperative CDVA worse than 20/40 (odds ratio, 3.82; 95% confidence interval, 2.92-5.05; P < .0001) than those who did not experience an event. Both groups had a significant increase in VRQL from preoperative levels (both P < .0001); however, the level of preoperative to postoperative VRQL improvement was significantly less in Group B (P < .0001). Resident-operated cases with and without events had an overall significant improvement in visual acuity and visual function compared with preoperatively, although this improvement was less marked in those that had an event. None of the authors has a financial or proprietary interest in any material or method mentioned. Copyright © 2016 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  7. Woodstove Emission Sampling Methods Comparability Analysis and In-situ Evaluation of New Technology Woodstoves.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, Carl A.

    1988-06-01

    One major objective of this study was to compare several woodstove particulate emission sampling methods under laboratory and in-situ conditions. The laboratory work compared the EPA Method 5H, EPA Method 5G, and OMNI Automated Woodstove Emission Sampler (AWES)/Data LOG'r particulate emission sampling systems. A second major objective of the study was to evaluate the performance of two integral catalytic, two low-emission non-catalytic, and two conventional technology woodstoves under in-situ conditions with the AWES/Data LOG'r system. The AWES/Data LOG'r and EPA Method 5G sampling systems were also compared in an in-situ test on one of the integral catalytic woodstove models. 7 figs., 12 tabs.

  8. VALORATE: fast and accurate log-rank test in balanced and unbalanced comparisons of survival curves and cancer genomics.

    PubMed

    Treviño, Victor; Tamez-Pena, Jose

    2017-06-15

    The association of genomic alterations to outcomes in cancer is affected by a problem of unbalanced groups generated by the low frequency of alterations. For this, an R package (VALORATE) that estimates the null distribution and the P-value of the log-rank statistic based on a recent reformulation is presented. For a given number of alterations that defines the size of the survival groups, the log-rank density is estimated by a weighted sum of conditional distributions depending on a co-occurrence term of mutations and events. The estimations are accurately accelerated by sampling across co-occurrences, allowing the analysis of large genomic datasets in a few minutes. In conclusion, the proposed VALORATE R package is a valuable tool for survival analysis. The R package is available in CRAN at https://cran.r-project.org and at http://bioinformatica.mty.itesm.mx/valorateR . vtrevino@itesm.mx. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
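The statistic whose null distribution VALORATE reformulates is the classical two-sample log-rank test. A plain sketch of that classical form (not the VALORATE algorithm itself, and with made-up survival times):

```python
def logrank_chi2(times1, events1, times2, events2):
    """Classical two-sample log-rank chi-square statistic.

    times*: event/censoring times; events*: 1 = event observed, 0 = censored.
    Assumes at least one event so the variance is nonzero.
    """
    data = ([(t, e, 0) for t, e in zip(times1, events1)]
            + [(t, e, 1) for t, e in zip(times2, events2)])
    event_times = sorted({t for t, e, _ in data if e == 1})
    obs_minus_exp = 0.0
    variance = 0.0
    for t in event_times:
        n1 = sum(1 for tt, _, g in data if tt >= t and g == 0)  # at risk, group 1
        n = sum(1 for tt, _, _ in data if tt >= t)              # at risk, overall
        d = sum(1 for tt, e, _ in data if tt == t and e == 1)   # events at t
        d1 = sum(1 for tt, e, g in data if tt == t and e == 1 and g == 0)
        obs_minus_exp += d1 - d * n1 / n                        # observed - expected
        if n > 1:
            variance += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return obs_minus_exp ** 2 / variance

# Two identical groups: observed equals expected, so the statistic is 0
print(logrank_chi2([1, 2, 3], [1, 1, 1], [1, 2, 3], [1, 1, 1]))  # 0.0
```

The unbalanced-groups problem the abstract addresses arises because when one group contains only a handful of altered samples, the chi-square approximation to this statistic's null distribution becomes unreliable, which is what motivates estimating the null distribution directly.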

  9. Time-dependent analysis of dosage delivery information for patient-controlled analgesia services.

    PubMed

    Kuo, I-Ting; Chang, Kuang-Yi; Juan, De-Fong; Hsu, Steen J; Chan, Chia-Tai; Tsou, Mei-Yung

    2018-01-01

    Pain relief always plays an essential part of perioperative care and an important role in medical quality improvement. Patient-controlled analgesia (PCA) is a method that allows a patient to self-administer small boluses of analgesic to relieve subjective pain. PCA logs from the infusion pump consist of many text messages that record all events during therapy. Dosage information can be extracted from PCA logs to provide easily understood features. Analyzing dosage information over time helps reveal changes in a patient's pain relief condition. To explore the trend of pain relief requirements, we developed a PCA dosage information generator (PCA DIG) to extract meaningful messages from PCA logs during the first 48 hours of therapy. PCA dosage information, including consumption, delivery, infusion rate, and the ratio between demand and delivery, is presented with corresponding values in 4 successive time frames. Time-dependent statistical analysis demonstrated that analgesia requirements decreased gradually with time. These findings are compatible with clinical observations and provide valuable information about strategies to customize postoperative pain management.
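Collapsing pump events into successive time frames over the first 48 hours, as the PCA DIG does, can be sketched like this. The event list and the 12-hour frame width are hypothetical, chosen only to give 4 successive frames:

```python
def frame_summary(events, frame_hours=12, total_hours=48):
    """Aggregate demand/delivery events into successive time frames.

    events: list of (hour, kind) with kind 'demand' or 'delivery'.
    Returns one dict per frame with counts and the demand/delivery ratio.
    """
    n_frames = total_hours // frame_hours
    frames = [{"demand": 0, "delivery": 0} for _ in range(n_frames)]
    for hour, kind in events:
        if 0 <= hour < total_hours:
            frames[int(hour // frame_hours)][kind] += 1
    for f in frames:
        f["ratio"] = f["demand"] / f["delivery"] if f["delivery"] else None
    return frames

# Hypothetical events: demands taper off, suggesting improving pain control
events = [(1, "demand"), (1, "delivery"), (3, "demand"), (5, "demand"),
          (5, "delivery"), (14, "demand"), (14, "delivery"), (30, "demand"),
          (30, "delivery"), (40, "demand"), (40, "delivery")]
print([f["demand"] for f in frame_summary(events)])  # [3, 1, 1, 1]
```

A falling demand count (or a demand/delivery ratio approaching 1) across frames is the kind of time-dependent trend the abstract describes as decreasing analgesia requirements.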

  10. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    PubMed

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
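A nominal-model approach of this kind, learning normal PLC signal transitions from fault-free log data and flagging transitions never seen before, can be sketched as follows. This is a generic illustration using a hash set of state pairs, not PLAT's actual indexing scheme, and the state names are invented:

```python
def learn_nominal_model(normal_log):
    """Build a hash-based nominal model: the set of transitions seen in normal runs."""
    return {(a, b) for a, b in zip(normal_log, normal_log[1:])}

def detect_anomalies(model, log):
    """Flag any transition absent from the nominal model, with its log position."""
    return [(i, a, b) for i, (a, b) in enumerate(zip(log, log[1:]))
            if (a, b) not in model]

# Hypothetical PLC signal states from a fault-free reference run
normal = ["idle", "clamp", "drill", "retract", "idle",
          "clamp", "drill", "retract", "idle"]
model = learn_nominal_model(normal)

# Monitored run: the machine skips 'retract' after 'drill'
observed = ["idle", "clamp", "drill", "idle", "clamp"]
print(detect_anomalies(model, observed))  # [(2, 'drill', 'idle')]
```

Because membership checks against a hash set are constant time on average, each new log record can be checked as it arrives, which is what makes real-time operation with a small memory footprint plausible.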

  11. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    PubMed Central

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes often take place in a manufacturing system. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme for this purpose. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively. PMID:27974882

  12. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive (figure). A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/−0.10 mm: 57% within +/−0.01 mm; 89% within +/−0.05 mm. The mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
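The planned-vs-treated fluence comparison mentioned above typically uses the gamma index, which blends a dose-difference criterion with a distance-to-agreement criterion. A 1-D sketch of the standard global gamma calculation, with hypothetical tolerances rather than the authors' settings:

```python
def gamma_1d(ref, measured, spacing=1.0, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index per reference point.

    ref/measured: dose samples on the same grid; spacing in mm;
    dose_tol as a fraction of the reference maximum; dist_tol in mm.
    """
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(measured):
            dd = (dm - dr) / (dose_tol * d_max)   # dose-difference term
            dx = (j - i) * spacing / dist_tol     # distance-to-agreement term
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        gammas.append(best)
    return gammas

# Identical distributions: gamma is 0 everywhere, so the pass rate (gamma <= 1) is 100%
ref = [0.0, 0.2, 0.8, 1.0, 0.8, 0.2, 0.0]
g = gamma_1d(ref, ref)
pass_rate = sum(x <= 1.0 for x in g) / len(g)
print(pass_rate)  # 1.0
```

A point passes if some nearby measured point agrees in dose within tolerance, with closeness in dose traded off against closeness in position; the fraction of points with gamma ≤ 1 is the usual pass-rate summary.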

  13. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    Help > About)  Environment details ( operating system )  metronome.log file, located in your MAT 7.1.0 installation folder  Any log file that...requirements to run the Model Analyst’s Toolkit:  Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed...application icon on your desktop  Create a Quick Launch icon – Creates a MAT application icon on the taskbar for operating systems released

  14. Prediction of Compressional, Shear, and Stoneley Wave Velocities from Conventional Well Log Data Using a Committee Machine with Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2012-01-01

    Measurement of compressional, shear, and Stoneley wave velocities, carried out by dipole sonic imager (DSI) logs, provides invaluable data for geophysical interpretation, geomechanical studies and hydrocarbon reservoir characterization. The present study proposes an improved methodology for constructing a quantitative formulation between conventional well logs and sonic wave velocities. First, sonic wave velocities were predicted from conventional well logs using artificial neural network, fuzzy logic, and neuro-fuzzy algorithms. Subsequently, a committee machine with intelligent systems was constructed by virtue of a hybrid genetic algorithm-pattern search technique, with the outputs of the artificial neural network, fuzzy logic and neuro-fuzzy models used as inputs to the committee machine. The committee machine improves the accuracy of the final prediction by integrating the outputs of the aforementioned intelligent systems. The hybrid genetic algorithm-pattern search tool, embodied in the structure of the committee machine, assigns a weight factor to each individual intelligent system, indicating its contribution to the overall prediction of the DSI parameters. This methodology was implemented in the Asmari formation, the major carbonate reservoir rock of Iranian oil fields. A group of 1,640 data points was used to construct the intelligent model, and a group of 800 data points was employed to assess the reliability of the proposed model. The results showed that the committee machine with intelligent systems performed more effectively than the individual intelligent systems alone.
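
The committee-machine idea reduces to a weighted average of the individual models' predictions, with the weights chosen to minimise prediction error. The sketch below is illustrative only: the predictions are invented, and a brute-force grid search stands in for the paper's hybrid genetic algorithm-pattern search optimiser.

```python
from itertools import product

def committee(predictions, weights):
    """Weighted average of per-model predictions (one list per model)."""
    return [sum(w * p[i] for w, p in zip(weights, predictions))
            for i in range(len(predictions[0]))]

def grid_search_weights(predictions, target, step=0.1):
    """Brute-force weights summing to 1 that minimise squared error."""
    best_w, best_err = None, float("inf")
    candidates = [round(i * step, 10) for i in range(int(1 / step) + 1)]
    for w1, w2 in product(candidates, repeat=2):
        w3 = 1.0 - w1 - w2
        if w3 < -1e-9:
            continue  # weights must be non-negative
        w = (w1, w2, w3)
        err = sum((y - p) ** 2
                  for y, p in zip(target, committee(predictions, w)))
        if err < best_err:
            best_w, best_err = w, err
    return best_w, best_err

ann   = [2.1, 3.0, 4.2]   # hypothetical sonic-velocity predictions (km/s)
fuzzy = [1.9, 3.1, 3.8]
nfz   = [2.0, 2.9, 4.0]
truth = [2.0, 3.0, 4.0]
weights, err = grid_search_weights([ann, fuzzy, nfz], truth)
print(weights, err)
```

Each weight indicates how much the corresponding model contributes to the combined prediction, mirroring the role of the weight factors the hybrid optimiser assigns in the paper.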

  15. Education Department Begins Process to Implement HEA Reauthorization with New Campus Safety Provisions

    ERIC Educational Resources Information Center

    Phillips, Lisa

    2008-01-01

    The U.S. Department of Education has announced the beginning of the process to develop rules for new requirements in the recently passed Higher Education Act (HEA). Highlights of the HEA that affect campus public safety departments include measures that: (1) Require a fire log be maintained at an institution of higher education for events that…

  16. 25 CFR 542.43 - What are the minimum internal control standards for surveillance for a Tier C gaming operation?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... recordings and/or digital records shall be provided to the Commission upon request. (x) Video library log. A... events on video and/or digital recordings. The displayed date and time shall not significantly obstruct... each gaming machine change booth. (w) Video recording and/or digital record retention. (1) All video...

  17. 25 CFR 542.43 - What are the minimum internal control standards for surveillance for a Tier C gaming operation?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... recordings and/or digital records shall be provided to the Commission upon request. (x) Video library log. A... events on video and/or digital recordings. The displayed date and time shall not significantly obstruct... each gaming machine change booth. (w) Video recording and/or digital record retention. (1) All video...

  18. 25 CFR 542.43 - What are the minimum internal control standards for surveillance for a Tier C gaming operation?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... recordings and/or digital records shall be provided to the Commission upon request. (x) Video library log. A... events on video and/or digital recordings. The displayed date and time shall not significantly obstruct... each gaming machine change booth. (w) Video recording and/or digital record retention. (1) All video...

  19. Log Analysis Using Splunk Hadoop Connect

    DTIC Science & Technology

    2017-06-01

    running a logging service puts a performance tax on the system and may cause the degradation of performance. More thorough logging will cause a...several nodes. For example, a disk failure would affect all the tasks running on a particular node and generate an alert message not only for the disk...the commands that were executed from the "Run" command. The keylogger installation did not create any registry keys for the program itself. However

  20. System Connection via SSH Gateway | High-Performance Computing | NREL

    Science.gov Websites

    …@peregrine.hpc.nrel.gov First time logging in? If this is the first time you've logged in with your new account, you will be asked to set a password. You will be prompted to enter it a second time, then you will be logged off. Just reconnect with your new password. To change your HPC password at any time, you can simply use the passwd command. Remote Users: If you're connecting

  1. The application of PGNAA borehole logging for copper grade estimation at Chuquicamata mine.

    PubMed

    Charbucinski, J; Duran, O; Freraut, R; Heresi, N; Pineyro, I

    2004-05-01

    The field trials of a prompt gamma neutron activation analysis (PGNAA) spectrometric logging method and instrumentation (SIROLOG) for copper grade estimation in production holes of a porphyry-type copper ore mine, Chuquicamata in Chile, are described. Examples of data analysis, calibration procedures and copper grade profiles are provided. The field tests have proved the suitability of the PGNAA logging system for in situ quality control of copper ore.

  2. Maintaining ecosystem resilience: functional responses of tree cavity nesters to logging in temperate forests of the Americas.

    PubMed

    Ibarra, José Tomás; Martin, Michaela; Cockle, Kristina L; Martin, Kathy

    2017-06-30

    Logging often reduces taxonomic diversity in forest communities, but little is known about how this biodiversity loss affects the resilience of ecosystem functions. We examined how partial logging and clearcutting of temperate forests influenced functional diversity of birds that nest in tree cavities. We used point-counts in a before-after-control-impact design to examine the effects of logging on the value, range, and density of functional traits in bird communities in Canada (21 species) and Chile (16 species). Clearcutting, but not partial logging, reduced diversity in both systems. The effect was much more pronounced in Chile, where logging operations removed critical nesting resources (large decaying trees), than in Canada, where decaying aspen Populus tremuloides were retained on site. In Chile, logging was accompanied by declines in species richness, functional richness (amount of functional niche occupied by species), community-weighted body mass (average mass, weighted by species densities), and functional divergence (degree of maximization of divergence in occupied functional niche). In Canada, clearcutting did not affect species richness but nevertheless reduced functional richness and community-weighted body mass. Although some cavity-nesting birds can persist under intensive logging operations, their ecosystem functions may be severely compromised unless future nest trees can be retained on logged sites.

  3. Impact of stent length on clinical outcomes of first-generation and new-generation drug-eluting stents.

    PubMed

    Konishi, Hirokazu; Miyauchi, Katsumi; Dohi, Tomotaka; Tsuboi, Shuta; Ogita, Manabu; Naito, Ryo; Kasai, Takatoshi; Tamura, Hiroshi; Okazaki, Shinya; Isoda, Kikuo; Daida, Hiroyuki

    2016-04-01

    The aim of this study was to compare first- and new-generation drug-eluting stents (DESs) implanted in long lesions. Stent length is known to be a predictor of adverse events after percutaneous coronary intervention (PCI), even with the first-generation DESs. The introduction of new-generation DESs has reduced the rates of adverse clinical events. However, the impact of stent length on long-term clinical outcomes is not well known. A total of 1181 consecutive patients who underwent PCI using either a first-generation DES (n = 885) or a new-generation DES (n = 296) between 2004 and 2011 were investigated. In each of the stent groups, the patients were divided into two groups by stent length (>32 and ≤32 mm) and compared. During the follow-up period, the incidence of major adverse cardiac events (MACEs) was significantly higher for patients with long stents than with short stents (P < 0.01; log-rank test) in the first-generation DES group. However, there was no difference in the incidence of MACEs between the long- and short-stent groups in the new-generation DES group (P = 0.24; log-rank test). On multivariate Cox regression analysis, stent length was not associated with adverse events in the new-generation DES group [hazard ratio (HR) 0.87; 95 % confidence interval (95 % CI) 0.71-1.04; P = 0.14]. Implanted stent length was significantly associated with a higher risk of MACEs in patients who received first-generation DESs, but not in patients who received new-generation DESs.

  4. Comparing of Cox model and parametric models in analysis of effective factors on event time of neuropathy in patients with type 2 diabetes.

    PubMed

    Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj

    2017-01-01

    The Cox proportional hazards model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates for survival data than Cox. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at the Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy from 2006 to March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and the area under the ROC curve were used to evaluate the relative goodness of fit of the models and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using Kaplan-Meier, the survival time to neuropathy was computed as 76.6 ± 5 months after the initial diagnosis of diabetes. After multivariate analysis of the Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of the event time of neuropathy (P < 0.05). According to AIC, the log-normal model, with the lowest AIC, was the best-fitted model among the Cox and parametric models. According to the comparison of survival receiver operating characteristic curves, the log-normal model was considered the most efficient and best-fitted model.
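
The selection rule the abstract applies is simply AIC = 2k − 2 ln(L), preferring the fitted model with the lowest value. A worked illustration, with made-up log-likelihoods and parameter counts (not the study's numbers):

```python
def aic(k, log_likelihood):
    """Akaike information criterion: 2*(number of parameters) - 2*ln(L)."""
    return 2 * k - 2 * log_likelihood

# hypothetical fitted models: (parameter count, maximised log-likelihood)
models = {
    "Cox":        aic(k=5, log_likelihood=-412.3),
    "Weibull":    aic(k=6, log_likelihood=-409.8),
    "log-normal": aic(k=6, log_likelihood=-405.1),
}
best = min(models, key=models.get)
print(best, models[best])  # the lowest-AIC model is preferred
```

Note that AIC penalises extra parameters, so a model only "wins" if its likelihood gain outweighs its added complexity.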

  5. Search engines, news wires and digital epidemiology: Presumptions and facts.

    PubMed

    Kaveh-Yazdy, Fatemeh; Zareh-Bidoki, Ali-Mohammad

    2018-07-01

    Digital epidemiology tries to identify diseases dynamics and spread behaviors using digital traces collected via search engines logs and social media posts. However, the impacts of news on information-seeking behaviors have been remained unknown. Data employed in this research provided from two sources, (1) Parsijoo search engine query logs of 48 months, and (2) a set of documents of 28 months of Parsijoo's news service. Two classes of topics, i.e. macro-topics and micro-topics were selected to be tracked in query logs and news. Keywords of the macro-topics were automatically generated using web provided resources and exceeded 10k. Keyword set of micro-topics were limited to a numerable list including terms related to diseases and health-related activities. The tests are established in the form of three studies. Study A includes temporal analyses of 7 macro-topics in query logs. Study B considers analyzing seasonality of searching patterns of 9 micro-topics, and Study C assesses the impact of news media coverage on users' health-related information-seeking behaviors. Study A showed that the hourly distribution of various macro-topics followed the changes in social activity level. Conversely, the interestingness of macro-topics did not follow the regulation of topic distributions. Among macro-topics, "Pharmacotherapy" has highest interestingness level and wider time-window of popularity. In Study B, seasonality of a limited number of diseases and health-related activities were analyzed. Trends of infectious diseases, such as flu, mumps and chicken pox were seasonal. Due to seasonality of most of diseases covered in national vaccination plans, the trend belonging to "Immunization and Vaccination" was seasonal, as well. Cancer awareness events caused peaks in search trends of "Cancer" and "Screening" micro-topics in specific days of each year that mimic repeated patterns which may mistakenly be identified as seasonality. 
In study C, we assessed the co-integration and correlation between news and query trends. Our results demonstrated that micro-topics sparsely covered in news media had lowest level of impressiveness and, subsequently, the lowest impact on users' intents. Our results can reveal public reaction to social events, diseases and prevention procedures. Furthermore, we found that news trends are co-integrated with search queries and are able to reveal health-related events; however, they cannot be used interchangeably. It is recommended that the user-generated contents and news documents are analyzed mutually and interactively. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Testing the causality of Hawkes processes with time reversal

    NASA Astrophysics Data System (ADS)

    Cordi, Marcus; Challet, Damien; Muni Toke, Ioane

    2018-03-01

    We show that univariate and symmetric multivariate Hawkes processes are only weakly causal: the true log-likelihoods of real and reversed event time vectors are almost equal, thus parameter estimation via maximum likelihood only weakly depends on the direction of the arrow of time. In ideal (synthetic) conditions, tests of goodness of parametric fit unambiguously reject backward event times, which implies that inferring kernels from time-symmetric quantities, such as the autocovariance of the event rate, only rarely produces statistically significant fits. Finally, we find that fitting financial data with many-parameter kernels may yield significant fits for both arrows of time for the same event time vector, sometimes favouring the backward time direction. This shows that a significant fit of Hawkes processes to real data with flexible kernels does not imply a definite arrow of time unless one tests it.
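
The core comparison can be reproduced in miniature. The sketch below evaluates the exact log-likelihood of an exponential-kernel Hawkes process, with intensity λ(t) = μ + α Σ exp(−β(t − tᵢ)), on an event-time vector and on its time-reversed copy; the parameters and event times are illustrative, not taken from the paper.

```python
import math

def hawkes_loglik(times, mu, alpha, beta, T):
    """Exact log-likelihood for intensity mu + alpha*sum(exp(-beta*(t - t_i)))."""
    ll, a, prev = 0.0, 0.0, None
    for t in times:
        if prev is not None:
            # recursive update of the excitation sum over past events
            a = math.exp(-beta * (t - prev)) * (1.0 + a)
        ll += math.log(mu + alpha * a)
        prev = t
    # subtract the compensator: integral of the intensity over [0, T]
    ll -= mu * T
    ll -= (alpha / beta) * sum(1.0 - math.exp(-beta * (T - t)) for t in times)
    return ll

events = [0.5, 0.9, 1.0, 2.7, 3.1, 3.15, 4.8]
T = 5.0
reversed_events = sorted(T - t for t in events)
fwd = hawkes_loglik(events, mu=0.8, alpha=0.6, beta=2.0, T=T)
bwd = hawkes_loglik(reversed_events, mu=0.8, alpha=0.6, beta=2.0, T=T)
print(fwd, bwd)  # the two values are typically close, as the paper reports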

  7. Evaluation of statistical distributions to analyze the pollution of Cd and Pb in urban runoff.

    PubMed

    Toranjian, Amin; Marofi, Safar

    2017-05-01

    Heavy metal pollution in urban runoff causes severe environmental damage. Identification of these pollutants and their statistical analysis is necessary to provide management guidelines. In this study, 45 continuous probability distribution functions were selected to fit the Cd and Pb data from the runoff events of an urban area during October 2014-May 2015. The sampling was conducted at the outlet of the city basin during seven precipitation events. For evaluation and ranking of the functions, we used the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests. The results of the Cd analysis showed that the Hyperbolic Secant, Wakeby and Log-Pearson 3 (LP3) distributions are suitable for frequency analysis of the event mean concentration (EMC), the instantaneous concentration series (ICS) and the instantaneous concentration of each event (ICEE), respectively. In addition, the LP3, Wakeby and Generalized Extreme Value functions were chosen for the EMC, ICS and ICEE related to Pb contamination.
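
The Kolmogorov-Smirnov statistic used to rank candidate distributions is the maximum distance between the empirical CDF and the fitted CDF. A minimal sketch for one candidate family (a log-normal fit, via a normal fit to the log-concentrations); the concentration values are invented, not the study's data:

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic_lognormal(data):
    """Max distance between the empirical CDF and a fitted log-normal CDF."""
    logs = sorted(math.log(x) for x in data)
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
    d = 0.0
    for i, v in enumerate(logs):
        cdf = normal_cdf(v, mu, sigma)
        # compare against the empirical CDF just before and after each point
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

concentrations = [0.8, 1.1, 1.5, 2.2, 3.0, 4.1, 6.5]  # hypothetical Cd, ug/L
print(ks_statistic_lognormal(concentrations))
```

Ranking 45 candidate distributions, as the study does, amounts to computing this statistic (and the Anderson-Darling analogue) for each fitted family and sorting by it.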

  8. Flexible mating system in a logged population of Swietenia macrophylla King (Meliaceae): implications for the management of a threatened neotropical tree species

    Treesearch

    Maristerra R. Lemes; Dario Grattapaglia; James Grogan; John Proctor; Rogério Gribel

    2007-01-01

    Microsatellites were used to evaluate the mating system of the remaining trees in a logged population of Swietenia macrophylla, a highly valuable and threatened hardwood species, in the Brazilian Amazon. A total of 25 open pollinated progeny arrays of 16 individuals, with their mother trees, were genotyped using eight highly polymorphic...

  9. A case study assessing opportunity costs and ecological benefits of streamside management zones and logging systems for eastern hardwood forests

    Treesearch

    Chris B. LeDoux; Ethel Wilkerson

    2006-01-01

    Forest landowners, managers, loggers, land-use planners, and other decision and policy-makers need to understand the opportunity costs and ecological benefits associated with different widths of streamside management zones (SMZs). In this paper, a simulation model was used to assess the opportunity costs of SMZ retention for four different logging systems, two mature...

  10. Audit Log for Forensic Photography

    NASA Astrophysics Data System (ADS)

    Neville, Timothy; Sorell, Matthew

    We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
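
The paper specifies requirements for such an audit log rather than publishing code; the scheme below is NOT the authors' design, just a common way to make a log tamper-evident, which is the property a chain of evidence needs: each entry's digest covers its predecessor's digest, so altering any entry breaks every later link.

```python
import hashlib

GENESIS = "0" * 64  # digest used before the first entry

def append_entry(log, record):
    """Append a record whose digest chains to the previous entry's digest."""
    prev = log[-1]["digest"] if log else GENESIS
    digest = hashlib.sha256((prev + record).encode()).hexdigest()
    log.append({"record": record, "digest": digest})

def verify_chain(log):
    """Recompute every digest; any tampering makes verification fail."""
    prev = GENESIS
    for entry in log:
        expected = hashlib.sha256((prev + entry["record"]).encode()).hexdigest()
        if expected != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

log = []
append_entry(log, "2008-05-01T10:02 capture IMG_0042 camera=K7 officer=J.Doe")
append_entry(log, "2008-05-01T10:15 transfer IMG_0042 to evidence server")
print(verify_chain(log))       # True
log[0]["record"] = "tampered"  # any edit invalidates the chain
print(verify_chain(log))       # False
```

A production system would additionally sign the digests and anchor them in trusted storage, so that the whole chain cannot simply be rebuilt by an attacker.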

  11. Analyse dendroecologique et dendroclimatique des gisements de bois de lacs de la taiga de l'est de l'Amerique du Nord

    NASA Astrophysics Data System (ADS)

    Gennaretti, Fabio

    The aim of this thesis was to reconstruct ecological processes and climate change in the taiga of Quebec over the last two millennia to understand factors that have strongly influenced the evolution of this majestic region. To obtain the finest spatial and temporal resolution in our analysis, we used annual growth rings of subfossil logs collected in six lakes as paleoecological and paleoclimatic proxies. Deposits of subfossil logs determine the structure of lake littoral ecosystems and support their food webs. Moreover, they may represent long-term carbon sinks. In the first chapter of the thesis, we described present-day stocks of subfossil logs in the selected littoral zones and established log residence time in the lakes by tree-ring or radiocarbon dating. Dating also allowed precise identification of each fire that burned the riparian forests during the last millennium. This chapter showed that interactions between terrestrial and aquatic ecosystems in the taiga are strongly influenced by wildfires whose effects can persist for centuries because of strong postfire reductions of log recruitment in lakes. At a local scale, the amount of logs and carbon preserved in littoral stocks depends on the fire history of the last millennium that is specific to each site. At a regional scale, wildfires significantly limit the amount of carbon sequestered in littoral stocks of logs. These stocks represent a negligible fraction of the total taiga carbon storage despite the abundance of lakes and the long residence time of littoral logs (up to five millennia for buried logs). In the second chapter, we combined a detailed inventory of the present-day riparian forest situated along the shoreline of two lakes with the tree-ring dating of the subfossil logs accumulated in the littoral zones facing these shores. 
    Our objective was to determine whether changes in current riparian forest structure and composition within a given site could be attributed to different fire histories over the last millennium and to show the impacts of past fires on tree mortality, density and growth. Using our impressive paleoecological dataset (n = 1037 logs) in combination with our present-day forest inventory, we were able to reconstruct millennial forest dynamics with an unprecedented high spatial (a few hundred square meters) and temporal (annual) resolution. Our findings help explain how the present-day landscape diversity in the taiga reflects the fire history of the last millennium, which varies from site to site. Fires have caused persistent and cumulative impacts resulting in a progressive opening of the forest cover along with the exclusion of balsam fir, a fire-sensitive tree species. The taiga landscape is a mosaic of forest stands characterized by different times since fire and different postfire forest structure trajectories. In the third chapter, we used our network of millennial tree-ring chronologies developed from the collected subfossil logs to produce a regional reconstruction of July-August temperatures over the last 1100 years. Our network filled a wide gap in the Northern Hemisphere network of annually resolved paleoclimate proxies used for temperature reconstructions of the last millennium (see IPCC report). Moreover, our reconstruction provided direct field evidence that the climate of Northeastern North America is particularly sensitive to volcanic forcing. Indeed, successive large eruptions triggered the beginning of cold episodes in the study area that persisted for decades. In particular, two series of eruptions, centered around the Samalas event in 1257 and the Tambora event in 1815, coincided with two abrupt temperature regime shifts. 
In Northeastern North America, these shifts marked the onset of the Little Ice Age and the beginning of its coldest phase, respectively. Our reconstruction also showed a well-expressed Medieval Climate Anomaly, which included a few decades significantly warmer than the last 10 years. Keywords : fire ecology; forest-lake interactions; large woody debris; Little Ice Age; Medieval Climate Anomaly; millennial tree-ring chronologies; plant-climate interactions; temperature regime shifts; trajectories of forest structure and composition; volcanic forcing.

  12. Parameterization of an empirical model for the prediction of n-octanol, alkane and cyclohexane/water as well as brain/blood partition coefficients.

    PubMed

    Zerara, Mohamed; Brickmann, Jürgen; Kretschmer, Robert; Exner, Thomas E

    2009-02-01

    Quantitative information on solvation and transfer free energies is often needed for the understanding of many physicochemical processes, e.g. molecular recognition phenomena, transport and diffusion processes through biological membranes, and the tertiary structure of proteins. Recently, a concept for the localization and quantification of hydrophobicity has been introduced (Jäger et al. J Chem Inf Comput Sci 43:237-247, 2003). This model is based on the assumption that the overall hydrophobicity can be obtained as a superposition of fragment contributions. To date, nearly all predictive models for logP have been parameterized for the n-octanol/water (logP(oct)) system, while only a few models, with poor predictive ability, are available for other solvents. In this work, we propose a parameterization of an empirical model for the n-octanol/water, alkane/water (logP(alk)) and cyclohexane/water (logP(cyc)) systems. Comparison of both logP(alk) and logP(cyc) with the logarithms of brain/blood ratios (logBB) for a set of structurally diverse compounds revealed a high correlation, showing their superiority over the logP(oct) measure in this context.
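
The superposition assumption means a molecule's logP is modelled as a sum of per-fragment increments. A toy illustration; the fragment values and the crude decomposition below are invented for illustration, not the paper's parameters:

```python
# hypothetical per-fragment logP increments (NOT fitted values)
FRAGMENT_LOGP = {
    "CH3": 0.55,
    "CH2": 0.49,
    "OH": -1.12,
    "C6H5": 1.90,
}

def logp(fragments):
    """Additive logP estimate from a molecule's fragment list."""
    return sum(FRAGMENT_LOGP[f] for f in fragments)

# e.g. a crude decomposition of 2-phenylethanol: C6H5-CH2-CH2-OH
print(logp(["C6H5", "CH2", "CH2", "OH"]))  # 1.90 + 0.49 + 0.49 - 1.12 = 1.76
```

Parameterizing the model for a new solvent system, as the paper does for alkane/water and cyclohexane/water, amounts to refitting the increment table against measured partition coefficients in that system.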

  13. Preliminary report on geophysical well-logging activity on the Salton Sea Scientific Drilling Project, Imperial Valley, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.; Hodges, H.E.

    1986-01-01

    The Salton Sea Scientific Drilling Project has culminated in a 10,564-ft deep test well, State 2-14 well, in the Imperial Valley of southern California. A comprehensive scientific program of drilling, coring, and downhole measurements, which was conducted for about 5 months, has obtained much scientific information concerning the physical and chemical processes associated with an active hydrothermal system. This report primarily focuses on the geophysical logging activities at the State 2-14 well and provides early dissemination of geophysical data to other investigators working on complementary studies. Geophysical-log data were obtained by a commercial logging company and by the U.S. Geological Survey (USGS). Most of the commercial logs were obtained during three visits to the site; only one commercial log was obtained below a depth of 6,000 ft. The commercial logs obtained were dual induction, natural gamma, compensated neutron formation density, caliper and sonic. The USGS logging effort consisted of four primary periods, with many logs extending below a depth of 6,000 ft. The USGS logs obtained were temperature, caliper, natural gamma, gamma spectral, epithermal neutron, acoustic velocity, full-waveform, and acoustic televiewer. Various problems occurred throughout the drilling phase of the Salton Sea Scientific Drilling Project that made successful logging difficult: (1) borehole constrictions, possibly resulting from mud coagulation, (2) maximum temperatures of about 300 C, and (3) borehole conditions unfavorable for logging because of numerous zones of fluid loss, cement plugs, and damage caused by repeated trips in and out of the hole. These factors hampered and compromised logging quality at several open-hole intervals. The quality of the logs was dependent on the degree of probe sophistication and sensitivity to borehole-wall conditions. Digitized logs presented were processed on site and are presented in increments of 1,000 ft. 
A summary of the numerous factors that may be relevant to this interpretation is also presented.

  14. SHORT-TERM SAFETY PROFILE OF INTRAVITREAL ZIV-AFLIBERCEPT.

    PubMed

    Chhablani, Jay; Narayanan, Raja; Mathai, Annie; Yogi, Rohit; Stewart, Michael

    2016-06-01

    To evaluate the safety of intravitreal ziv-aflibercept (Zaltrap) in the treatment of choroidal neovascularization secondary to age-related macular degeneration. Eligible eyes with choroidal neovascularization secondary to age-related macular degeneration each received a single intravitreal injection of ziv-aflibercept. Comprehensive ophthalmic examinations and detailed systemic evaluations were performed at baseline and Days 1, 7, and 30 after injection, and International Society for Clinical Electrophysiology of Vision standard electroretinography was performed at baseline and Day 30. Primary outcome measures were safety parameters that included signs of clinical and electroretinographic toxicity. Secondary outcome measures included changes in best-corrected visual acuity and central subfield thickness. Twelve eyes of 12 patients were treated. None of the patients complained of blurred vision, ocular pain, or bulbar injection at any of the follow-up visits, nor was intraocular inflammation noted. There were no significant differences in implicit times, "a" and "b" wave amplitudes, or b/a ratios at 1 month when compared with baseline (P = 0.4). None of the patients experienced serious ocular or systemic adverse events. Mean best-corrected visual acuity improved only slightly at 30 days (LogMAR 0.45 ± 0.31 [Snellen equivalent: 20/60]) compared with baseline (LogMAR 0.37 ± 0.24 [Snellen equivalent: 20/50]; P = 0.51). Single intravitreal injections of ziv-aflibercept into eyes with neovascular age-related macular degeneration appear to be safe through 1 month. Ziv-aflibercept could become a safe, low-cost therapy for macular diseases in developing countries and in those where intravitreal aflibercept (Eylea) is not available.

  15. SEC proton prediction model: verification and analysis.

    PubMed

    Balch, C C

    1999-06-01

    This paper describes a model that has been used at the NOAA Space Environment Center since the early 1970s as a guide for the prediction of solar energetic particle events. The algorithms for proton event probability, peak flux, and rise time are described. The predictions are compared with observations. The current model shows some ability to distinguish between proton event associated flares and flares that are not associated with proton events. The comparisons of predicted and observed peak flux show considerable scatter, with an rms error of almost an order of magnitude. Rise time comparisons also show scatter, with an rms error of approximately 28 h. The model algorithms are analyzed using historical data and improvements are suggested. Implementation of the algorithm modifications reduces the rms error in the log10 of the flux prediction by 21%, and the rise time rms error by 31%. Improvements are also realized in the probability prediction by deriving the conditional climatology for proton event occurrence given flare characteristics.
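
The flux-error metric quoted in the abstract is an rms error computed on log10(peak flux). A short sketch of the metric and the percent-reduction comparison, on made-up numbers (not the SEC model's data):

```python
import math

def rms_log10_error(predicted, observed):
    """Root-mean-square error of the base-10 log of predicted vs observed flux."""
    resid = [math.log10(p) - math.log10(o) for p, o in zip(predicted, observed)]
    return math.sqrt(sum(r * r for r in resid) / len(resid))

obs = [12.0, 300.0, 45.0, 2100.0]   # hypothetical observed peak fluxes (pfu)
old = [100.0, 40.0, 500.0, 300.0]   # original-algorithm predictions
new = [30.0, 120.0, 150.0, 800.0]   # modified-algorithm predictions
e_old = rms_log10_error(old, obs)
e_new = rms_log10_error(new, obs)
print(e_old, e_new, 100 * (1 - e_new / e_old))  # percent reduction in rms log-error
```

An rms log10 error of 1.0 corresponds to predictions off by an order of magnitude on average, which is how the abstract's "almost an order of magnitude" scatter should be read.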

  16. CREM monitoring: a wireless RF application

    NASA Astrophysics Data System (ADS)

    Valencia, J. D.; Burghard, B. J.; Skorpik, J. R.; Silvers, K. L.; Schwartz, M. J.

    2005-05-01

    Recent security lapses within the Department of Energy laboratories prompted the establishment and implementation of additional procedures and training for operations involving classified removable electronic media (CREM) storage. In addition, the definition of CREM has been expanded and the number of CREM has increased significantly. Procedures now require that all CREM be inventoried and accounted for on a weekly basis. Weekly inventories consist of a physical comparison of each item against the reportable inventory listing. Securing and accounting for CREM is a continuous challenge for existing security systems. To address this challenge, an innovative framework, encompassing a suite of technologies, has been developed by Pacific Northwest National Laboratory (PNNL) to monitor, track, and locate CREM in safes, vaults, and storage areas. This Automated Removable Media Observation and Reporting (ARMOR) framework, described in this paper, is an extension of an existing PNNL program, SecureSafe. The key attributes of systems built around the ARMOR framework include improved accountability, reduced risk of human error, improved accuracy and timeliness of inventory data, and reduced costs. ARMOR solutions require each CREM to be tagged with a unique electronically readable ID code. Inventory data is collected from tagged CREM at regular intervals and upon detection of an access event. Automated inventory collection and report generation eliminates the need for hand-written inventory sheets and allows electronic transfer of the collected inventory data to a modern electronic reporting system. An electronic log of CREM access events is maintained, providing enhanced accountability for daily/weekly checks, routine audits, and follow-up investigations.

  17. Convective Mixing in Distal Pipes Exacerbates Legionella pneumophila Growth in Hot Water Plumbing.

    PubMed

    Rhoads, William J; Pruden, Amy; Edwards, Marc A

    2016-03-12

    Legionella pneumophila is known to proliferate in hot water plumbing systems, but little is known about the specific physicochemical factors that contribute to its regrowth. Here, L. pneumophila trends were examined in controlled, replicated pilot-scale hot water systems with continuous recirculation lines subject to two water heater settings (40 °C and 58 °C) and three distal tap water use frequencies (high, medium, and low) with two pipe configurations (oriented upward to promote convective mixing with the recirculating line and downward to prevent it). Water heater temperature setting determined where L. pneumophila regrowth occurred in each system, with an increase of up to 4.4 log gene copies/mL in the 40 °C system tank and recirculating line relative to influent water compared to only 2.5 log gene copies/mL regrowth in the 58 °C system. Distal pipes without convective mixing cooled to room temperature (23-24 °C) during periods of no water use, but pipes with convective mixing equilibrated to 30.5 °C in the 40 °C system and 38.8 °C in the 58 °C system. Corresponding with known temperature effects on L. pneumophila growth and enhanced delivery of nutrients, distal pipes with convective mixing had on average 0.2 log more gene copies/mL in the 40 °C system and 0.8 log more gene copies/mL in the 58 °C system. Importantly, this work demonstrated the potential for thermal control strategies to be undermined by distal taps in general, and convective mixing in particular.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, Anthony M.; Williams, Liliya L.R.; Hjorth, Jens, E-mail: amyoung@astro.umn.edu, E-mail: llrw@astro.umn.edu, E-mail: jens@dark-cosmology.dk

    One usually thinks of a radial density profile as having a monotonically changing logarithmic slope, as in NFW or Einasto profiles. However, in two different classes of commonly used systems, this is often not the case. These classes exhibit non-monotonic changes in their density profile slopes, which we call oscillations for short. We analyze these two unrelated classes separately. Class 1 consists of systems that have density oscillations and that are defined through their distribution function f(E), or differential energy distribution N(E), such as isothermal spheres, King profiles, or DARKexp, a theoretically derived model for relaxed collisionless systems. Systems defined through f(E) or N(E) generally have density slope oscillations. Class 1 system oscillations can be found at small, intermediate, or large radii, but we focus on a limited set of Class 1 systems that have oscillations in the central regions, usually at log(r/r₋₂) ≲ −2, where r₋₂ is the largest radius where d log(ρ)/d log(r) = −2. We show that the shape of their N(E) can roughly predict the amplitude of oscillations. Class 2 systems, which are a product of dynamical evolution, consist of observed and simulated galaxies and clusters, and pure dark matter halos. Oscillations in the density profile slope seem pervasive in the central regions of Class 2 systems. We argue that in these systems, slope oscillations are an indication that a system is not fully relaxed. We show that these oscillations can be reproduced by small modifications to the N(E) of DARKexp. These modifications affect a small fraction of the systems' mass and are confined to log(r/r₋₂) ≲ 0. The size of these modifications serves as a potential diagnostic for quantifying how far a system is from being relaxed.
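
    The oscillations described above are departures from the monotonic logarithmic slope of standard profiles. As a point of reference, here is a minimal sketch (assuming an NFW profile with scale radius r_s = 1, for which the slope runs monotonically from −1 to −3 and r₋₂ = r_s) that computes d log(ρ)/d log(r) numerically and locates r₋₂:

    ```python
    import numpy as np

    def nfw_density(r, rs=1.0, rho0=1.0):
        # NFW profile: rho ∝ 1 / (x (1 + x)^2), with x = r / rs
        x = r / rs
        return rho0 / (x * (1 + x) ** 2)

    def log_slope(r, rho):
        # d log(rho) / d log(r) by finite differences in log-log space
        return np.gradient(np.log(rho), np.log(r))

    r = np.logspace(-3, 2, 500)
    slope = log_slope(r, nfw_density(r))

    # For NFW the slope decreases monotonically: ~ -1 near the center, ~ -3 far out
    i = np.argmin(np.abs(slope + 2.0))
    print(r[i])  # r_{-2}, close to rs = 1 for NFW
    ```

    For a Class 1 or Class 2 system with slope oscillations, the same `log_slope` curve would wiggle non-monotonically instead of descending smoothly, which is the diagnostic the abstract exploits.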

  19. Predicting the Rate of River Bank Erosion Caused by Large Wood Log

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Rutherfurd, I.; Ghisalberti, M.

    2016-12-01

    When a single tree falls into a river channel, flow is deflected and accelerated between the tree roots and the bank face, increasing shear stress and scouring the bank. The scallop-shaped erosion increases the diversity of the channel morphology, but also causes concern for adjacent landholders. Concern about increased bank erosion is one of the main reasons large wood is still removed from channels in SE Australia. Further, the hydraulic effect of many logs in the channel can reduce overall bank erosion rates. Although both phenomena have been described before, this research develops a hydraulic model that estimates their magnitude, and tests and calibrates this model with flume and field measurements using logs of various configurations and sizes. Specifically, the model estimates the change in excess shear stress on the bank associated with the presence of the log. The model addresses the effects of log angle, distance from the bank, log size, and flow condition by solving mass continuity and energy conservation between the approach-flow and contracted-flow cross sections. We then evaluate the model against flume experiments performed with semi-realistic log models representing logs of different sizes and decay stages, comparing the measured and simulated velocity increase in the gap between the log and the bank. The log angle, distance from the bank, and flow condition are systematically varied for each log model during the experiments. Finally, the calibrated model is compared with field data collected in anabranching channels of the Murray River in SE Australia, where instream logs are abundant and regulated flows for irrigation are consistently high. Preliminary results suggest that a log can significantly increase the shear stress on the bank, especially when it is positioned perpendicular to the flow. The shear stress increases with log angle in a rising curve (the log angle is the angle between the log trunk and the flow direction; 0° means the log is parallel to the flow with the canopy pointing downstream). However, the shear stress shows insignificant change as the log is moved closer to the bank.
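
    The continuity part of such a model can be sketched simply: the log blocks a span of the channel equal to its projected length across the flow, the discharge squeezes through the remaining open width, and bank shear stress scales with velocity squared. This is only an illustrative reduction under assumed geometry (uniform depth, full blockage over the projected length); the authors' model also solves energy conservation between the two cross sections, which this sketch omits.

    ```python
    import math

    def excess_shear_ratio(channel_width, log_length, log_angle_deg):
        """Ratio of bank shear stress with vs. without the log (illustrative).

        Blocked span = log length projected across the flow direction.
        Continuity: U_gap / U_approach = channel_width / open_width.
        Shear stress ~ U^2, so the excess-shear ratio is that ratio squared.
        """
        blocked = log_length * math.sin(math.radians(log_angle_deg))
        open_width = max(channel_width - blocked, 1e-6)
        velocity_ratio = channel_width / open_width
        return velocity_ratio ** 2

    # A 6 m log in a 10 m channel: the ratio rises steeply with log angle,
    # consistent with the rising curve described in the abstract.
    for angle in (0, 30, 60, 90):
        print(angle, excess_shear_ratio(10.0, 6.0, angle))
    ```

    At 0° the log adds no blockage and the ratio is 1; at 90° the same log blocks 6 m of a 10 m channel and the sketch predicts a (10/4)² = 6.25-fold increase in bank shear stress.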

  20. Effects of stream-adjacent logging in fishless headwaters on downstream coastal cutthroat trout

    USGS Publications Warehouse

    Bateman, Douglas S.; Sloat, Matthew R.; Gresswell, Robert E.; Berger, Aaron M.; Hockman-Wert, David; Leer, David W.; Skaugset, Arne E.

    2016-01-01

    To investigate effects of headwater logging on downstream coastal cutthroat trout (Oncorhynchus clarkii clarkii) populations, we monitored stream habitat and biotic indicators including biomass, abundance, growth, movement, and survival over 8 years using a paired-watershed approach. Reference and logged catchments were located on private industrial forestland on a ∼60-year harvest rotation. Five clearcuts (14% of the logged catchment area) were adjacent to fishless portions of the headwater streams, and contemporary regulations did not require riparian forest buffers in the treatment catchment. Logging did not have significant negative effects on downstream coastal cutthroat trout populations for the duration of the sample period. Indeed, the only statistically significant response of fish populations following logging in fishless headwaters was an increase in late-summer biomass (g·m⁻²) of age-1+ coastal cutthroat trout in tributaries. Ultimately, the ability to make broad generalizations concerning effects of timber harvest is difficult because response to disturbance (anthropogenically influenced or not) in aquatic systems is complex and context-dependent, but our findings provide one example of environmentally compatible commercial logging in a regenerated forest setting.
