Sample records for activity log files

  1. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)

  2. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute power consumption and in the rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus, including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power were <1.25 and <200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and, if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
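
    The two cut-offs quoted above can be illustrated with a short sketch. This is a minimal illustration only: the log-file field names, sampling interval, and units are assumptions rather than the actual HVAD log format, and the 200%/1.25 values are simply the thresholds reported in the abstract.

      # Illustrative sketch of the two log-file parameters described above.
      # Field names, units, and the time base are assumptions, not the HVAD format.
      from typing import List

      def percent_of_expected_power(measured_w: float, expected_w: float) -> float:
          """Measured pump power as a percentage of the power expected at the set speed."""
          return 100.0 * measured_w / expected_w

      def rate_of_power_increase(powers_w: List[float], interval_h: float) -> float:
          """Simple end-to-end slope of logged power over the observation window."""
          if len(powers_w) < 2:
              return 0.0
          return (powers_w[-1] - powers_w[0]) / ((len(powers_w) - 1) * interval_h)

      def tpa_success_predicted(measured_w, expected_w, powers_w, interval_h) -> bool:
          # Abstract-reported cut-offs: percent of expected power < 200% and
          # rate of power increase < 1.25 (units as defined in the study).
          return (percent_of_expected_power(measured_w, expected_w) < 200.0
                  and rate_of_power_increase(powers_w, interval_h) < 1.25)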

  3. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta's Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). By comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, i.e., significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.
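
    The 2%/2mm gamma passing rate used in this record (and several below) can be sketched as follows. This is a deliberately simple brute-force version with an assumed grid spacing and low-dose cutoff, not a vendor implementation.

      # Simplified, brute-force global 2D gamma passing rate (e.g. 2%/2 mm).
      # Grid spacing, dose units, and the low-dose threshold are assumptions.
      import numpy as np

      def gamma_pass_rate(ref, evl, spacing_mm=2.0, dd=0.02, dta_mm=2.0, low_dose_cutoff=0.10):
          """Percent of reference points (above the cutoff) with gamma <= 1."""
          ref = np.asarray(ref, float)
          evl = np.asarray(evl, float)
          dmax = ref.max()
          ny, nx = ref.shape
          yy, xx = np.mgrid[0:ny, 0:nx]
          search = int(np.ceil(3 * dta_mm / spacing_mm))  # limit the spatial search window
          passed, total = 0, 0
          for iy in range(ny):
              for ix in range(nx):
                  if ref[iy, ix] < low_dose_cutoff * dmax:
                      continue
                  y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
                  x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
                  dist2 = ((yy[y0:y1, x0:x1] - iy) ** 2 + (xx[y0:y1, x0:x1] - ix) ** 2) * spacing_mm ** 2
                  dose2 = (evl[y0:y1, x0:x1] - ref[iy, ix]) ** 2
                  gamma2 = dist2 / dta_mm ** 2 + dose2 / (dd * dmax) ** 2
                  total += 1
                  passed += int(gamma2.min() <= 1.0)
          return 100.0 * passed / max(total, 1)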

  4. Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi

    2018-04-01

    The log file-based method cannot detect dosimetric changes due to linac component miscalibration because of the insensitivity of log files to such miscalibration. The purpose of this study was to quantify the dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases participated in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, regarding the planning target volume (PTV), the change from the TPS dose to the miscalibration-simulated log file dose in Dmean was 0.9 Gy, and that in tumor control probability was 1.4%. As for organs at risk (OARs), the change in Dmean was <0.7 Gy and that in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by Dmean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For the PTV and OARs, the log file-based estimate of patient dose for double-arc VMAT has accuracy comparable to that obtained for single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  6. Clinical impact of dosimetric changes for volumetric modulated arc therapy in log file-based patient dose calculations.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2017-10-01

    A log file-based method cannot detect dosimetric changes due to linac component miscalibration because log files are insensitive to miscalibration. Herein, the clinical impacts of dosimetric changes on a log file-based method were determined. Five head-and-neck and five prostate plans were applied. Miscalibration-simulated log files were generated by inducing a linac component miscalibration into the log file. Miscalibration magnitudes for leaf, gantry, and collimator at the general tolerance level were ±0.5mm, ±1°, and ±1°, respectively, and at a tighter tolerance level achievable on current linacs were ±0.3mm, ±0.5°, and ±0.5°, respectively. Re-calculations were performed on patient anatomy using log file data. The changes in tumor control probability/normal tissue complication probability from the treatment planning system dose to the re-calculated dose at the general tolerance level were 1.8% on the planning target volume (PTV) and 2.4% on organs at risk (OARs) in both plan types. These changes at the tighter tolerance level improved to 1.0% on the PTV and to 1.5% on OARs, with a statistically significant difference. We determined the clinical impacts of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration and found that the tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  7. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  8. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background: Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives: To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods: We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results: We found substantial differences between the generated metrics. Conclusions: None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
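
    As a toy illustration of the two metrics singled out above, sessions per hour and user penetration of pages might be computed as below. The session structure and the reading of "penetration" as the share of available pages reached in a session are assumptions, not the paper's definitions.

      # Toy computation of sessions-per-hour and page penetration.
      # The session records and page list are fabricated for illustration.
      from datetime import datetime

      sessions = [  # (session start, pages viewed)
          (datetime(2001, 5, 1, 9, 5),  {"home", "conditions", "contact"}),
          (datetime(2001, 5, 1, 9, 40), {"home", "conditions"}),
          (datetime(2001, 5, 1, 10, 15), {"home"}),
      ]
      site_pages = {"home", "conditions", "contact", "news", "links"}

      hours = {s[0].replace(minute=0, second=0, microsecond=0) for s in sessions}
      sessions_per_hour = len(sessions) / len(hours)

      # Penetration: share of the site's pages that a session actually reached.
      penetration = [100.0 * len(p & site_pages) / len(site_pages) for _, p in sessions]

      print(f"sessions/hour: {sessions_per_hour:.1f}")
      print(f"mean page penetration: {sum(penetration) / len(penetration):.0f}%")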

  9. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  10. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
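
    The instrumentation pattern described above, timestamped events emitted at critical points and appended in a common self-describing log format, can be sketched generically as follows. This is not the NetLogger API itself, only an illustration of the idea; the record fields and file name are assumptions.

      # Generic sketch of application event instrumentation (not the NetLogger API).
      import json
      import time

      def log_event(logfile, event, **fields):
          # One self-describing, timestamped record per "interesting" event.
          record = {"ts": time.time(), "event": event, **fields}
          logfile.write(json.dumps(record) + "\n")

      with open("events.log", "a") as f:
          log_event(f, "transfer.start", host="nodeA", nbytes=1048576)
          # ... application work happens here ...
          log_event(f, "transfer.end", host="nodeA", status="ok")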

  11. Analysis of the request patterns to the NSSDC on-line archive

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1994-01-01

    NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth of the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run their on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of their service, they have maintained log files which record all accesses to the archive. In this report, we present an analysis of the NDADS log files and discuss several issues, including caching, reference patterns, clustering, and system loading.

  12. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  13. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Mason, B; Kirsner, S

    2015-06-15

    Purpose: Ion chamber and film (ICAF) is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown as an alternative to measurement based QA. In this study, we delivered VMAT plans with and without errors to determine if ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture didn't move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria used to evaluate the plans were an ion chamber difference of less than 5% and at least 90% of film pixels passing a 3mm/3% gamma analysis (GA). For the log file analysis, 90% of voxels must pass a 3mm/3% 3D GA and the beam parameters must match what was in the plan. Results: Two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and what was planned. The 8 plans that didn't meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors.

  14. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights-no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.

  15. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    PubMed

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

    To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed detailed logging of all gimbals rotations applied during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) as the difference between the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect a systematic tracking error as small as 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling between gimbals and gantry rotation during dynamic arc dynamic tracking. A submillimetric agreement between the complementary clinical tracking error measurements was found. Redundancy of the internal gimbals log file and x-ray verification images with complementary, independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on the Vero SBRT system. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.

  16. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…

  17. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  18. Visual behavior characterization for intrusion and misuse detection

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah

    2001-05-01

    As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for visually analyzing network and computer log information based on the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing the users' behavior, however, is much more adaptable and difficult to counteract.

  19. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis, with recalculation of the daily recorded fluence and hence dose distribution, brings this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/−0.10mm: 57% within +/−0.01mm; 89% within 0.05mm. The mean leaf position deviation is 0.02mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, with a mean of 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.

  20. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  1. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
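
    The core comparison step described above, checking delivered beam parameters recorded in the log against the plan, can be sketched as below. The array layout and tolerance are illustrative only and do not reflect the Dynalog file format.

      # Sketch: flag leaf-position discrepancies between plan and machine log.
      # Data layout (control points x leaves, positions in mm) is an assumption.
      import numpy as np

      def flag_leaf_deviations(planned, delivered, tol_mm=1.0):
          """Return (control point, leaf index, deviation in mm) for deviations above tol_mm."""
          diff = np.abs(np.asarray(delivered, float) - np.asarray(planned, float))
          cp_idx, leaf_idx = np.where(diff > tol_mm)
          return [(int(c), int(l), float(diff[c, l])) for c, l in zip(cp_idx, leaf_idx)]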

  2. Navigating Streams of Paper.

    ERIC Educational Resources Information Center

    Bennett-Abney, Cheryl

    2001-01-01

    Three organizational tools for counselors are described: three-ring binder for notes, forms, and schedules; daily log of time and activities; and a tickler file with tasks arranged by days of the week. (SK)

  3. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  4. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Ho, M; Chen, C

    Purpose: The use of log files to perform patient specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, ahead of direct measurements. This approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (Raystation). While the dual Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA in an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter and will be investigated further.
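
    As a sketch of how a planar dose might be reconstructed from logged spot data with the single-Gaussian spot model mentioned above (the simplest of the three models), one could write something like the following. The spot record fields, units, and normalization are assumptions, not the proprietary log format.

      # Sketch: planar dose from logged spots using a single-Gaussian lateral model.
      # Spot fields (x0, y0, sigma, charge) and the charge-to-dose scaling are assumptions.
      import numpy as np

      def dose_plane(spots, x, y):
          """spots: iterable of (x0_mm, y0_mm, sigma_mm, charge); x, y: 2D coordinate grids in mm."""
          dose = np.zeros_like(x, dtype=float)
          for x0, y0, sigma, q in spots:
              r2 = (x - x0) ** 2 + (y - y0) ** 2
              dose += q / (2 * np.pi * sigma ** 2) * np.exp(-r2 / (2 * sigma ** 2))
          return dose  # arbitrary units; absolute calibration not modeled

      # Example usage on a 10 cm x 10 cm grid with 2 mm spacing:
      xs = np.arange(-50, 52, 2.0)
      x, y = np.meshgrid(xs, xs)
      plane = dose_plane([(0.0, 0.0, 5.0, 1.0), (10.0, 0.0, 5.0, 0.8)], x, y)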

  6. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  7. An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets

    NASA Astrophysics Data System (ADS)

    Özkaya, Sait Ismail

    1996-02-01

    An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line-by-line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy disk format for storage and transfer of log data as text files. LAS was proposed by the Canadian Well Logging Society. The present EXCEL macro decodes different sections of a LAS file, separates, and places the fields into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
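
    The line-by-line decoding described above can be sketched in outline as follows. This is a rough reader for the section/data layout of LAS 2.0 and ignores the many edge cases a production importer must handle.

      # Rough LAS 2.0 reader: split the file into ~ sections and parse the ~A data block.
      # Not a full LAS parser; mandatory-section checks and wrapped lines are not handled.
      def read_las(path):
          sections, current = {}, None
          with open(path) as fh:
              for line in fh:
                  line = line.rstrip("\n")
                  if not line or line.startswith("#"):       # blank or comment line
                      continue
                  if line.startswith("~"):                   # new section header
                      current = line[1].upper() if len(line) > 1 else "?"   # 'V', 'W', 'C', 'A', ...
                      sections[current] = []
                      continue
                  if current is not None:
                      sections[current].append(line)
          # The ~A (data) section is whitespace-delimited numeric rows.
          data = [[float(v) for v in row.split()] for row in sections.get("A", [])]
          return sections, data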

  8. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and providing a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
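
    The sorting and plan cross-referencing step can be sketched roughly as below. The log filename pattern and the naive pairing of logs to plan beams are placeholders, not Varian's trajectory-log conventions.

      # Sketch: sort a patient's trajectory logs by timestamp and pair them with
      # the beams in the exported DICOM-RT plan. Filename pattern and pairing
      # rule are illustrative assumptions.
      import glob
      import os
      import pydicom

      def sequential_report(log_dir, rtplan_path):
          plan = pydicom.dcmread(rtplan_path)
          beam_names = [b.BeamName for b in plan.BeamSequence]
          logs = sorted(glob.glob(os.path.join(log_dir, "*.bin")), key=os.path.getmtime)
          report = []
          for i, log_path in enumerate(logs):
              expected = beam_names[i % len(beam_names)]   # naive pairing for illustration
              report.append((os.path.basename(log_path), expected))
          return report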

  9. SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, H; Jacobson, T; Gu, X

    2015-06-15

    Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.

  10. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.

  11. Linking log files with dosimetric accuracy--A multi-institutional study on quality assurance of volumetric modulated arc therapy.

    PubMed

    Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar

    2015-12-01

    To systematically evaluate machine specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by the mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1SD) was 0.3±0.2 mm for all linacs. Large MLC maximum errors up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated generally accurate plan reproducibility, with γ(mean)(±1 SD)=0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations that corresponded well to the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm, as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images, and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  13. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  14. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, more elaborate tracking techniques to monitor components' integrity are required. ElektaLog files are generated every 40 milliseconds and can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, the aim of the study was to evaluate whether ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%–2mm criteria for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for the MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: ElektaLog files can be used for day-to-day evaluation of linac integrity and patient QA, allowing reliable analysis of system accuracy and performance.
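
    Two of the per-fraction quantities above follow directly from an already-parsed log, as in this small example: beam-on time from the 40 ms entry spacing, and the RMS positional error of a leaf bank. The entry count and error arrays are illustrative values, not real log data.

      # Beam-on time and RMS leaf error from an already-parsed 40 ms log.
      import numpy as np

      entries_per_fraction = 4350                      # number of 40 ms log entries (example value)
      beam_on_time_s = entries_per_fraction * 0.040    # = 174 s

      # Illustrative per-entry, per-leaf position errors (mm) for an 80-leaf bank.
      leaf_errors_mm = np.random.normal(0.0, 0.3, size=(entries_per_fraction, 80))
      rms_leaf_error_mm = np.sqrt(np.mean(leaf_errors_mm ** 2))
      print(f"beam-on time: {beam_on_time_s:.1f} s, RMS leaf error: {rms_leaf_error_mm:.2f} mm")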

  15. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage for achieving highly available metadata service, as well as better performance improvement in metadata processing. As the client file system backs up certain sent metadata requests in its memory, the overhead for handling these backup requests is much smaller than that brought by the metadata server, while it adopts logging or journaling to yield highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and render a better I/O data throughput, in contrast to conventional metadata management schemes, that is, logging or journaling on MDS. Besides, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients, when the metadata server has crashed or gone into nonoperational state exceptionally.
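
    The mechanism described above, client-side backup of already-handled metadata requests in place of server-side journaling, with replay of the cached requests during recovery, can be sketched as a toy protocol. Message fields, operations, and the recovery trigger are assumptions, not the paper's implementation.

      # Toy sketch: client-backed metadata requests replace the MDS journal.
      class MetadataServer:
          def __init__(self):
              self.namespace = {}                 # in-memory only; no log/journal kept

          def handle(self, msg):
              if msg["op"] == "create":
                  self.namespace[msg["path"]] = msg.get("attrs", {})
              elif msg["op"] == "unlink":
                  self.namespace.pop(msg["path"], None)

      class MetadataClient:
          def __init__(self, server):
              self.server = server
              self.sent_backlog = []              # backup of requests the MDS has handled

          def request(self, op, path, **kwargs):
              msg = {"op": op, "path": path, **kwargs}
              self.server.handle(msg)             # server applies the change without logging
              self.sent_backlog.append(msg)       # client-side backup stands in for the journal

          def replay(self):
              for msg in self.sent_backlog:       # called during MDS recovery after a crash
                  self.server.handle(msg)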

  16. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage for achieving highly available metadata service, as well as better performance improvement in metadata processing. As the client file system backs up certain sent metadata requests in its memory, the overhead for handling these backup requests is much smaller than that brought by the metadata server, while it adopts logging or journaling to yield highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and render a better I/O data throughput, in contrast to conventional metadata management schemes, that is, logging or journaling on MDS. Besides, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients, when the metadata server has crashed or gone into nonoperational state exceptionally. PMID:24892093

  17. Geophysical log database for the Floridan aquifer system and southeastern Coastal Plain aquifer system in Florida and parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.

    2013-04-04

    A database of borehole geophysical logs and other types of data files were compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.

  18. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites. They can help system administrators in improving system performance. Web logs provide invaluable help in creating adaptive web sites and also in network traffic analysis. This paper presents the design and implementation of a web usage mining agent for digging into web log files.
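
    A minimal example of the kind of log digging described above: parsing Common Log Format access-log lines and counting requests per page. The sample line is fabricated for illustration.

      # Parse Common Log Format lines and count requests per page.
      import re
      from collections import Counter

      CLF = re.compile(r'(\S+) \S+ \S+ \[(.*?)\] "(\S+) (\S+) \S+" (\d{3}) (\S+)')

      def page_counts(lines):
          counts = Counter()
          for line in lines:
              m = CLF.match(line)
              if m:
                  host, ts, method, path, status, size = m.groups()
                  counts[path] += 1
          return counts

      sample = ['192.0.2.1 - - [10/Mar/2002:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326']
      print(page_counts(sample))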

  19. Index map of cross sections through parts of the Appalachian basin (Kentucky, New York, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia): Chapter E.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.

  20. Monte Carlo based, patient-specific RapidArc QA using Linac log files.

    PubMed

    Teke, Tony; Bergman, Alanah M; Kwa, William; Gill, Bradford; Duzenli, Cheryl; Popescu, I Antoniu

    2010-01-01

    A Monte Carlo (MC) based QA process to validate the dynamic beam delivery accuracy for Varian RapidArc (Varian Medical Systems, Palo Alto, CA) using Linac delivery log files (DynaLog) is presented. Using DynaLog file analysis and MC simulations, the goal of this article is to (a) confirm that adequate sampling is used in the RapidArc optimization algorithm (177 static gantry angles) and (b) to assess the physical machine performance [gantry angle and monitor unit (MU) delivery accuracy]. Ten clinically acceptable RapidArc treatment plans were generated for various tumor sites and delivered to a water-equivalent cylindrical phantom on the treatment unit. Three Monte Carlo simulations were performed to calculate dose to the CT phantom image set: (a) One using a series of static gantry angles defined by 177 control points with treatment planning system (TPS) MLC control files (planning files), (b) one using continuous gantry rotation with TPS generated MLC control files, and (c) one using continuous gantry rotation with actual Linac delivery log files. Monte Carlo simulated dose distributions are compared to both ionization chamber point measurements and with RapidArc TPS calculated doses. The 3D dose distributions were compared using a 3D gamma-factor analysis, employing a 3%/3 mm distance-to-agreement criterion. The dose difference between MC simulations, TPS, and ionization chamber point measurements was less than 2.1%. For all plans, the MC calculated 3D dose distributions agreed well with the TPS calculated doses (gamma-factor values were less than 1 for more than 95% of the points considered). Machine performance QA was supplemented with an extensive DynaLog file analysis. A DynaLog file analysis showed that leaf position errors were less than 1 mm for 94% of the time and there were no leaf errors greater than 2.5 mm. The mean standard deviation in MU and gantry angle were 0.052 MU and 0.355 degrees, respectively, for the ten cases analyzed. The accuracy and flexibility of the Monte Carlo based RapidArc QA system were demonstrated. Good machine performance and accurate dose distribution delivery of RapidArc plans were observed. The sampling used in the TPS optimization algorithm was found to be adequate.

  1. A clinically observed discrepancy between image-based and log-based MLC positions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal

    2016-06-15

    Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured and log files were analyzed for the treatment in question, the prior day's treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trust log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.

  2. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

    The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5 mm in opposite directions and systematic leaf shifts: ±1.0 mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5 mm, the residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between non-modified and modified log file doses per unit leaf gap error, were 1.32±0.27% and 0.82±0.17 Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14 Gy, and 0.45±0.08 Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determined the residual dose estimation errors for VMAT delivery using the log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
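    A sketch of the regression step described above, under stated assumptions: given gEUD changes observed at several introduced leaf-gap errors, the residual dose error per millimetre of gap is the slope of a linear fit. The sample numbers below are placeholders, not study data.

    ```python
    import numpy as np

    # Placeholder data: introduced leaf-gap errors and the resulting gEUD changes.
    gap_errors_mm = np.array([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0])
    geud_change_pct = np.array([-2.7, -1.3, -0.7, 0.0, 0.6, 1.4, 2.6])  # hypothetical

    # Slope of the linear fit = residual dose estimation error per mm of gap error.
    slope_pct_per_mm, intercept = np.polyfit(gap_errors_mm, geud_change_pct, 1)
    print(f"residual gEUD error: {slope_pct_per_mm:.2f} % per mm of leaf-gap error")
    ```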

  3. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
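    A minimal sketch of the idea described above, not the patented implementation: each stored data portion carries an index entry recording its logical offset, physical offset, and length, and the holes are rebuilt as zero-fill when the file is read back.

    ```python
    from dataclasses import dataclass

    @dataclass
    class IndexEntry:
        logical_offset: int   # where the portion belongs in the sparse file
        physical_offset: int  # where it actually sits in the packed store
        length: int

    def read_sparse(entries, packed_data, logical_size):
        buf = bytearray(logical_size)  # holes default to zero bytes
        for e in entries:
            chunk = packed_data[e.physical_offset:e.physical_offset + e.length]
            buf[e.logical_offset:e.logical_offset + e.length] = chunk
        return bytes(buf)
    ```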

  4. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  5. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  6. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N

    2016-06-15

    Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct the relative fluence maps and perform gamma analysis to compare the planned and executed MLC movement. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLC movement, and calculates and compares the relative planned and executed fluences. The fluence maps were used to perform the gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and the executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference for the number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives similar results to EPID dosimetry results. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The author would like to thank King Fahd University of Petroleum and Minerals for the support.
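    An illustrative sketch of building a relative fluence map from per-snapshot MLC apertures, not the authors' program: inputs are assumed arrays of the two leaf-bank edge positions (mm) per leaf pair and the MU fraction delivered in each snapshot; the 1 mm grid and 400 mm field width are arbitrary choices.

    ```python
    import numpy as np

    def fluence_map(left_mm, right_mm, mu_fraction, width_mm=400.0, pixel_mm=1.0):
        # left_mm, right_mm: (n_snapshots, n_leaf_pairs) leaf edge positions
        # mu_fraction: MU fraction delivered in each snapshot
        left_mm, right_mm = np.asarray(left_mm), np.asarray(right_mm)
        mu_fraction = np.asarray(mu_fraction)
        n_snapshots, n_pairs = left_mm.shape
        n_cols = int(width_mm / pixel_mm)
        x = np.arange(n_cols) * pixel_mm - width_mm / 2.0
        fluence = np.zeros((n_pairs, n_cols))
        for s in range(n_snapshots):
            for p in range(n_pairs):
                open_px = (x >= left_mm[s, p]) & (x <= right_mm[s, p])
                fluence[p, open_px] += mu_fraction[s]  # accumulate open-aperture MU
        return fluence / fluence.max()  # relative fluence
    ```

    The same routine applied to the planned and the executed leaf positions yields the two maps that feed the gamma comparison.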

  7. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
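    Once converted to LAS, the curve data are plain text and can be read directly. The following is a minimal sketch of such a reader, not the USGS workflow: it pulls curve mnemonics from the ~Curve section and numeric rows from the ~ASCII section, and ignores the wrapped-line and null-value conventions a production reader would handle.

    ```python
    def read_las(path):
        # Toy LAS reader: returns curve names and the numeric data rows.
        curves, rows, section = [], [], None
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                if line.startswith("~"):
                    section = line[1].upper()  # V, W, C, P, O, A
                    continue
                if section == "C":        # curve definition, e.g. "GR .GAPI : Gamma Ray"
                    curves.append(line.split(".")[0].strip())
                elif section == "A":      # ASCII data rows
                    rows.append([float(v) for v in line.split()])
        return curves, rows
    ```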

  8. SU-E-T-325: The New Evaluation Method of the VMAT Plan Delivery Using Varian DynaLog Files and Modulation Complexity Score (MCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tateoka, K; Graduate School of Medicine, Sapporo Medical University, Sapporo, JP; Fujimomo, K

    2014-06-01

    Purpose: The aim of the study is to evaluate the use of Varian DynaLog files to verify VMAT plan delivery and the modulation complexity score (MCS) of VMAT plans. Methods: Delivery accuracy of machine performance was quantified by multileaf collimator (MLC) position errors, gantry angle errors and fluence delivery accuracy for volumetric modulated arc therapy (VMAT). The relationship between machine performance and plan complexity was also investigated using the modulation complexity score (MCS). Planned and actual MLC positions, gantry angles and delivered fractions of monitor units were extracted from Varian DynaLog files. These factors were taken from the record-and-verify system of the MLC control file. Planned and delivered beam data were compared to determine leaf position errors and gantry angle errors. Analysis was also performed on planned and actual fluence maps reconstructed from the DynaLog files. This analysis was performed for all treatment fractions of 5 prostate VMAT plans. The analysis of the DynaLog files was carried out by in-house programming in Visual C++. Results: The root mean square of the leaf position and gantry angle errors were about 0.12 and 0.15, respectively. The gamma passing rate between planned and actual fluence maps at the 3%/3 mm criterion was about 99.21%. The leaf position errors were not directly related to plan complexity as determined by the MCS, whereas the gantry angle errors were. Conclusion: This study shows that Varian DynaLog files for VMAT plans can be used to diagnose delivery errors not detectable with phantom-based quality assurance. Furthermore, the MCS of a VMAT plan can be used to evaluate delivery accuracy for patients receiving VMAT. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.

  9. 20 CFR 401.85 - Exempt systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... subsection (k)(2) of the Privacy Act: (A) The General Criminal Investigation Files, SSA; (B) The Criminal Investigations File, SSA; and, (C) The Program Integrity Case Files, SSA. (D) Civil and Administrative Investigative Files of the Inspector General, SSA/OIG. (E) Complaint Files and Log. SSA/OGC. (iii) Pursuant to...

  10. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of the trajectory log files from the linear accelerator on Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA) was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive; a 0.2 mm systematic error produced a 0.7% dose deviation on average. The MLC random errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. From the view of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
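    A sketch of the error-injection step described above, with illustrative names only: the actual MLC positions taken from a trajectory log are perturbed with a systematic shift and/or zero-mean random noise before re-running the independent dose check.

    ```python
    import numpy as np

    def perturb_mlc(positions_mm, systematic_mm=0.0, random_sd_mm=0.0, seed=0):
        # Add a systematic offset and optional Gaussian noise to logged MLC positions.
        rng = np.random.default_rng(seed)
        positions_mm = np.asarray(positions_mm, dtype=float)
        noise = rng.normal(0.0, random_sd_mm, size=positions_mm.shape) if random_sd_mm else 0.0
        return positions_mm + systematic_mm + noise
    ```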

  11. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2010-10-01 2010-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  12. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2011-10-01 2011-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  13. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  14. Replication in the Harp File System

    DTIC Science & Technology

    1981-07-01

    Shrira, Michael Williams. 1991. © Massachusetts Institute of Technology. (To appear in the Proceedings of the Thirteenth ACM Symposium on Operating... S., Spector, A. Z., and Thompson, D. S. Distributed Logging for Transaction Processing. ACM Special Interest Group on Management of Data 1987 Annual... System. USENIX Conference Proceedings, June 1990, pp. 63-71. 15. Hagmann, R. Reimplementing the Cedar File System Using Logging and Group Commit

  15. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  16. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  17. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.

  18. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  19. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) The transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) User access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
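    An illustrative sketch of the distribution-fitting step mentioned above: fit a log-normal distribution to observed transfer sizes in bytes. The sample data are synthetic, and since SciPy has no built-in truncated Pareto, only the log-normal fit is shown.

    ```python
    import numpy as np
    from scipy import stats

    # Synthetic stand-in for observed article transfer sizes (bytes).
    sizes = stats.lognorm.rvs(s=1.2, scale=20_000, size=5_000, random_state=1)

    # Fit a log-normal with the location fixed at zero.
    shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
    print(f"log-normal fit: sigma={shape:.2f}, median={scale:.0f} bytes")
    ```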

  20. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as reference and all records for subsequent days were compared against the reference. In-house MATLAB software was used for the comparisons. Each MLC log file was converted to a fluence map (FM) and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with a signal of 10% or lower of the maximum value were excluded from the comparisons. Results: The γ passing rate between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (range 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and at the same time reduce the time for processing the images. We found that the comparison of images with the highest resolution (768×1024) yielded on average a lower γ passing rate (99.1%) than the ones with low resolution (192×256) (99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such a tool can be valuable for assessing the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
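    A simplified sketch of the comparison described above: a brute-force global 2D gamma (2% dose difference, 2 mm distance-to-agreement, 10% low-dose threshold) between a daily fluence map and the reference map. Clinical tools are considerably more careful; wrap-around at the edges and other shortcuts are accepted here for brevity.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, test, pixel_mm=1.0, dd=0.02, dta_mm=2.0, threshold=0.10):
        ref = np.asarray(ref, dtype=float)
        test = np.asarray(test, dtype=float)
        norm = ref.max()
        mask = ref > threshold * norm                 # exclude low-signal points
        reach = int(np.ceil(dta_mm / pixel_mm))       # search radius in pixels
        gammas = np.full(ref.shape, np.inf)
        for dy in range(-reach, reach + 1):
            for dx in range(-reach, reach + 1):
                dist2 = (dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2
                shifted = np.roll(np.roll(test, dy, axis=0), dx, axis=1)
                dose2 = ((shifted - ref) / (dd * norm)) ** 2
                gammas = np.minimum(gammas, np.sqrt(dose2 + dist2 / dta_mm ** 2))
        return float(np.mean(gammas[mask] <= 1.0))    # fraction of points passing
    ```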

  1. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.

  2. Geologic cross section E-E' through the Appalachian basin from the Findlay arch, Wood County, Ohio, to the Valley and Ridge province, Pendleton County, West Virginia: Chapter E.4.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.

  3. INSPIRE and SPIRES Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought as well as the expected trend of user transition to INSPIRE.

  4. 78 FR 40474 - Sustaining Power Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  5. 78 FR 34371 - Longfellow Wind, LLC: Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  6. The new idea of transporting tailings-logs in tailings slurry pipeline and the innovation of technology of mining waste-fill method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin Yu; Wang Fuji; Tao Yan

    2000-07-01

    This paper introduces a new idea of transporting mine tailings-logs in a mine tailings-slurry pipeline and a new technology of cemented mine filling using tailings-logs with tailings-slurry. The hydraulic principles, the compaction of the tailings-logs, and the mechanical function of the fill body of tailings-logs cemented by tailings-slurry are discussed.

  7. Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.

    PubMed

    Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene

    2012-04-21

    Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion could result in unintuitive dose changes. We have developed a treatment reconstruction simulation computer code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans, a concave target and integrated boost were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%, 2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.

  8. Analysis of Student Activity in Web-Supported Courses as a Tool for Predicting Dropout

    ERIC Educational Resources Information Center

    Cohen, Anat

    2017-01-01

    Persistence in learning processes is perceived as a central value; therefore, dropouts from studies are a prime concern for educators. This study focuses on the quantitative analysis of data accumulated on 362 students in three academic course website log files in the disciplines of mathematics and statistics, in order to examine whether student…

  9. Digging Deeper into Learners' Experiences in MOOCs: Participation in Social Networks outside of MOOCs, Notetaking and Contexts Surrounding Content Consumption

    ERIC Educational Resources Information Center

    Veletsianos, George; Collier, Amy; Schneider, Emily

    2015-01-01

    Researchers describe with increasing confidence "what" they observe participants doing in massive open online courses (MOOCs). However, our understanding of learner activities in open courses is limited by researchers' extensive dependence on log file analyses and clickstream data to make inferences about learner behaviors. Further, the…

  10. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from 1. digitized suspended-sediment-concentration traces, 2. linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and 3. nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
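    A sketch of a log-log transport relation in the spirit of the SEDCALC option described above: regress the base-10 logarithm of concentration on the base-10 logarithm of streamflow and use the fit to estimate concentrations at unmeasured flows. The sample values are invented, and bias-correction of the back-transform is omitted.

    ```python
    import numpy as np

    # Hypothetical paired observations of streamflow and suspended-sediment concentration.
    flow_cfs = np.array([120.0, 300.0, 650.0, 1200.0, 2500.0])
    conc_mg_l = np.array([15.0, 42.0, 95.0, 180.0, 410.0])

    # Fit log10(concentration) = a + b * log10(flow).
    b, a = np.polyfit(np.log10(flow_cfs), np.log10(conc_mg_l), 1)  # slope, intercept

    def predict(q_cfs):
        return 10 ** (a + b * np.log10(q_cfs))

    print(f"estimated concentration at 900 cfs: {predict(900.0):.0f} mg/L")
    ```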

  11. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  12. 78 FR 54888 - Guzman Power Markets, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... the eFiling link to log on and submit the intervention or protests. Persons unable to file... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for...

  13. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  14. 78 FR 28835 - Salton Sea Power Generation Company; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  15. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and other readers were at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.
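    A minimal sketch of the "searchable master log" idea mentioned above, under assumed conventions: concatenate the per-shift log files for one console position (the file pattern is an assumption, not the POIC's layout) and pull out every entry mentioning a topic of interest.

    ```python
    import glob

    def build_master_log(pattern="console_logs/*.txt"):
        # Merge all per-shift log files matching the (assumed) pattern, in name order.
        entries = []
        for path in sorted(glob.glob(pattern)):
            with open(path) as fh:
                entries.extend(line.rstrip("\n") for line in fh if line.strip())
        return entries

    def search(entries, topic):
        # Case-insensitive keyword search over the merged entries.
        return [e for e in entries if topic.lower() in e.lower()]
    ```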

  16. 77 FR 55817 - Delek Crude Logistics, LLC; Notice of Petition for Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests... number. eFiling is encouraged. More detailed information relating to filing requirements, interventions...'') grant a temporary waiver of the filing and reporting requirements of sections 6 and 201 of the...

  17. Geologic cross section D-D' through the Appalachian basin from the Findlay arch, Sandusky County, Ohio, to the Valley and Ridge province, Hardy County, West Virginia: Chapter E.4.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.

  18. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
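    A minimal client-side sketch of the idea described above, not the PLFS implementation: compute a checksum for each data chunk before it is written, store it alongside the chunk, and verify it on read.

    ```python
    import hashlib

    def write_chunk(store, offset, chunk):
        # Store the chunk together with its SHA-256 digest.
        store[offset] = (chunk, hashlib.sha256(chunk).hexdigest())

    def read_chunk(store, offset):
        # Recompute the digest on read and fail loudly on a mismatch.
        chunk, recorded = store[offset]
        if hashlib.sha256(chunk).hexdigest() != recorded:
            raise IOError(f"checksum mismatch for chunk at offset {offset}")
        return chunk

    store = {}
    write_chunk(store, 0, b"example data block")
    assert read_chunk(store, 0) == b"example data block"
    ```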

  19. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  20. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  1. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
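    A sketch of the block-size rule quoted above: each of N parallel processes targets a block of roughly total_bytes / N, exchanging data with neighbours so every writer ends up with a full block. Only the size calculation is shown; the exchange and write steps are omitted.

    ```python
    def dynamic_block_size(total_bytes: int, n_processes: int) -> int:
        # Ceiling division so the last block is never short-changed.
        if n_processes <= 0:
            raise ValueError("need at least one process")
        return -(-total_bytes // n_processes)

    print(dynamic_block_size(10 * 2**20, 48))  # e.g. 10 MiB shared across 48 processes
    ```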

  2. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  3. 78 FR 70299 - Capacity Markets Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  4. 78 FR 59923 - Buffalo Dunes Wind Project, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  5. 78 FR 28833 - Lighthouse Energy Group, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  6. 78 FR 29366 - Wheelabrator Baltimore, LP; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  7. 77 FR 64978 - Sunbury Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  8. 78 FR 62300 - Burgess Biopower LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  9. 78 FR 75561 - South Bay Energy Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 78 FR 28833 - Ebensburg Power Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  11. 78 FR 72673 - Yellow Jacket Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 78 FR 44557 - Guttman Energy Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 78 FR 68052 - Covanta Haverhill Association, LP; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  14. 78 FR 49506 - Source Power & Gas LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 77 FR 64980 - Noble Americas Energy Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE...://www.ferc.gov . To facilitate electronic service, persons with Internet access who will eFile a... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests...

  16. 78 FR 46939 - DWP Energy Holdings, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 78 FR 28833 - CE Leathers Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 78 FR 59014 - Lakeswind Power Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  19. 78 FR 75560 - Green Current Solutions, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 77 FR 64980 - Collegiate Clean Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  1. 77 FR 64977 - Frontier Utilities New York LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  2. 78 FR 62299 - West Deptford Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  3. 78 FR 52913 - Allegany Generating Station LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  4. SedMob: A mobile application for creating sedimentary logs in the field

    NASA Astrophysics Data System (ADS)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source, mobile software package for creating sedimentary logs, targeted for use on tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in the log, and export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog: a free multiplatform package for drawing graphic logs that runs on PC computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.

  5. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  6. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    Help > About)  Environment details (operating system)  metronome.log file, located in your MAT 7.1.0 installation folder  Any log file that...requirements to run the Model Analyst’s Toolkit:  Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed...application icon on your desktop  Create a Quick Launch icon – Creates a MAT application icon on the taskbar for operating systems released

  7. Users' information-seeking behavior on a medical library Website

    PubMed Central

    Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco

    2002-01-01

    The Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources in 1997. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in the visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contained information on all user accesses to the Website and provided a great opportunity to learn more about the behavior of visitors to the Website. The majority of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to the reports mentioned above, the authors wanted to be able to perform interactive exploration and ad hoc analysis and discover trends in a user-friendly way. Because of that, we developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179
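
    The kind of dimensional aggregation described above can be illustrated with a short, hedged sketch: parse Web server entries in the Apache Common Log Format and count requests per (day, path) pair. The file name is a placeholder, and this toy example only stands in for the data-warehousing/OLAP solution the authors built.

        # Minimal sketch of dimensional aggregation over a Web server log:
        # parse Common Log Format lines and count requests per (day, path).
        import re
        from collections import Counter

        CLF = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

        hits = Counter()
        with open("access.log") as f:          # placeholder file name
            for line in f:
                m = CLF.match(line)
                if not m:
                    continue                   # skip malformed lines
                host, ts, method, path, status, size = m.groups()
                day = ts.split(":")[0]         # e.g. 10/Oct/2000
                hits[(day, path)] += 1

        for (day, path), n in hits.most_common(10):
            print(day, path, n)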

  8. 18 CFR 270.304 - Tight formation gas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...

  9. SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Clasie, B; Lu, H

    Purpose: In pencil beam scanning (PBS), the delivered spot MU, position and size are slightly different at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree gantry angle range were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MU in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to an excellent agreement in dose calculations, with an average γ pass rate for all fields of approximately 99.7%. When each calculation is compared to the measurement, a high correlation in γ was also found. Conclusion: Using machine log files, we verified that deviations in PBS beam delivery at different gantry angles are sufficiently small relative to the planned spot positions and MU. This study brings us one step closer to simplifying our patient-specific QA.
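
    The spot-by-spot statistics quoted above can be reproduced from per-spot data with a few lines of NumPy. The sketch below is illustrative only: the CSV files, their column layout (x_mm, y_mm, mu), and their names are assumptions, not the proprietary machine log format.

        # Illustrative sketch: compare per-spot (x, y, MU) extracted from the logs of
        # two deliveries of the same field and report spot-by-spot statistics.
        import numpy as np

        ref = np.loadtxt("field01_gantry000.csv", delimiter=",", skiprows=1)   # x_mm, y_mm, mu
        rot = np.loadtxt("field01_gantry270.csv", delimiter=",", skiprows=1)

        dx, dy = rot[:, 0] - ref[:, 0], rot[:, 1] - ref[:, 1]
        dmu_pct = 100.0 * (rot[:, 2] - ref[:, 2]) / ref[:, 2]

        print(f"x:  mean {dx.mean():.2f} mm, sd {dx.std():.2f} mm")
        print(f"y:  mean {dy.mean():.2f} mm, sd {dy.std():.2f} mm")
        print(f"MU: mean {dmu_pct.mean():.2f} %, sd {dmu_pct.std():.2f} %")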

  10. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events have been calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
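
    The grouping step is what makes the per-procedure runtime practical, so a hedged sketch of it may help: pulses with similar technique and geometry are binned, and each bin is later submitted to the Monte Carlo code once with its summed mAs. The field names, bin widths, and file name below are illustrative assumptions, not the actual DTS log schema.

        # Hedged sketch of grouping exposure pulses before Monte Carlo calculation.
        # Field names and bin widths are illustrative assumptions.
        import csv
        from collections import defaultdict

        groups = defaultdict(lambda: {"pulses": 0, "mas": 0.0})

        with open("dts_log.csv") as f:
            for row in csv.DictReader(f):
                key = (
                    round(float(row["kvp"]) / 5) * 5,                 # 5 kVp bins
                    round(float(row["gantry_lao_rao"]) / 10) * 10,    # 10 degree bins
                    round(float(row["gantry_cran_caud"]) / 10) * 10,
                    row["mode"],                                      # fluoro / DA / DSA
                )
                groups[key]["pulses"] += 1
                groups[key]["mas"] += float(row["mas"])

        # Each group would become a single definition file with the summed mAs.
        for key, g in groups.items():
            print(key, g["pulses"], "pulses,", round(g["mas"], 1), "mAs total")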

  11. 78 FR 28834 - Salton Sea Power L.L.C.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 78 FR 28835 - Del Ranch Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 78 FR 28835 - Patua Project LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 78 FR 75561 - Great Bay Energy V, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 77 FR 64981 - Homer City Generation, L.P.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  16. 77 FR 69819 - Cirrus Wind 1, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 77 FR 64979 - Great Bay Energy IV, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 77 FR 53195 - H.A. Wagner LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  19. 78 FR 59923 - Mammoth Three LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 78 FR 61945 - Tuscola Wind II, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-07

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  1. 77 FR 69819 - QC Power Strategies Fund LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  2. 78 FR 75561 - Astral Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  3. Archive of single-beam bathymetry data collected during USGS cruise 07CCT01 nearshore of Fort Massachusetts and within Camille Cut, West and East Ship Islands, Gulf Islands National Seashore, Mississippi, July 2007

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Reynolds, B.J.; Hansen, Mark

    2012-01-01

    The Gulf Islands National Seashore (GUIS) is composed of a series of barrier islands along the Mississippi - Alabama coastline. Historically these islands have undergone long-term shoreline change. The devastation of Hurricane Katrina in 2005 prompted questions about the stability of the barrier islands and their potential response to future storm impacts. Additionally, there was concern from the National Park Service (NPS) about the preservation of the historical Fort Massachusetts, located on West Ship Island. During the early 1900s, Ship Island was an individual island. In 1969 Hurricane Camille breached Ship Island, widening the cut and splitting it into what is now known as West Ship Island and East Ship Island. In July of 2007, the U.S. Geological Survey (USGS) was able to provide the NPS with a small bathymetric survey of Camille Cut using high-resolution single-beam bathymetry. This provided GUIS with a post-Katrina assessment of the bathymetry in Camille Cut and along the northern shoreline directly in front of Fort Massachusetts. Ultimately, this survey became an initial bathymetry dataset toward a larger USGS effort included in the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project (http://ngom.usgs.gov/gomsc/mscip/). This report serves as an archive of the processed single-beam bathymetry. Data products herein include gridded and interpolated digital depth surfaces and x,y,z data products. Additional files include trackline maps, navigation files, geographic information system (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for description of acronyms and abbreviations used in this report or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 07CCT01 tells us the data were collected in 2007 for the Coastal Change and Transport (CCT) study and the data were collected during the first (01) field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay catamaran. The single-beam transducers were sled mounted on a rail attached between the catamaran hulls. Navigation was acquired using HYPACK, Inc., Hypack version 4.3a.7.1 and differentially corrected using land-based GPS stations. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed systematically using NovAtel's Waypoint GrafNav version 7.6, SANDS version 3.7, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page.

  4. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produce a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  5. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  6. 78 FR 28834 - Elmore Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  7. 78 FR 49507 - OriGen Energy LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... securities and assumptions of liability. Any person desiring to intervene or to protest should file with the... with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log...

  8. 78 FR 49507 - ORNI 47 LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  9. 77 FR 64981 - BITHENERGY, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  10. 78 FR 40473 - eBay Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  11. 78 FR 28832 - CalEnergy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  12. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and at www.fdsys.gov. ...

  13. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and on GPO Access. ...

  14. Paleomagnetic dating: Methods, MATLAB software, example

    NASA Astrophysics Data System (ADS)

    Hnatyshin, Danny; Kravchinsky, Vadim A.

    2014-09-01

    A MATLAB software tool has been developed to provide an easy-to-use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user-defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an Excel file. These data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding, and metamorphism, and is in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed for the timing of large scale thermal or chemical events to be determined. Paleomagnetic analysis from gold mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The obtained paleomagnetic direction from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major axis and semi-minor axis of the 95% confidence ellipse being 16.6° and 15.9°, respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 (+32.0/-31.0) Ma, which is the youngest well-defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and likely had a role in determining the present-day state of mineralization seen at the deposit.

  15. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
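
    To make the filtering idea concrete, here is a short Python analogue (Python rather than Perl, purely for illustration). It assumes the usual semicolon-delimited PBS accounting record layout (datetime;record_type;id;key=value ...); treat the layout, file name, and attribute names as assumptions and adjust for your server's logs.

        # Sketch: filter PBS accounting records by record type and date.
        from datetime import datetime

        def parse_pbs_log(path, record_types=("E",), since=None):
            """Yield (timestamp, record_type, job_id, attrs) for matching records."""
            with open(path) as f:
                for line in f:
                    try:
                        stamp, rtype, job_id, rest = line.rstrip("\n").split(";", 3)
                    except ValueError:
                        continue                       # skip malformed lines
                    if rtype not in record_types:
                        continue
                    ts = datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S")
                    if since and ts < since:
                        continue
                    attrs = dict(kv.split("=", 1) for kv in rest.split() if "=" in kv)
                    yield ts, rtype, job_id, attrs

        for ts, rtype, job, attrs in parse_pbs_log("20070404", record_types=("E",)):
            print(ts, job, attrs.get("resources_used.walltime"))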

  16. Expansion of the roadway reference log : KYSPR-99-201.

    DOT National Transportation Integrated Search

    2000-05-01

    The objectives of this study were to: 1) expand the current route log to include milepoints for all intersections on state maintained roads and 2) recommend a procedure for establishing milepoints and maintaining the file with up-to-date information....

  17. 78 FR 52524 - Sunoco Pipeline LP; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... described in their petition. Any person desiring to intervene or to protest in this proceedings must file in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 78 FR 62349 - Sunoco Pipeline L.P.; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-18

    ... to log on and submit the intervention or protests. Persons unable to file electronically should... petition. Any person desiring to intervene or to protest in this proceeding must file in accordance with..., persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor...

  19. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  20. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslides hazard assessment. The web processing service was build with Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation was build and published as WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from DEM, the effective precipitations, runoff, lithology and land use. All these parameters can be served by the client from other WFS services or by uploading and processing the data on the server. The user can select the option of creating the first and second derivatives from the DEM automatically on the server or to upload the data already calculated. One of the main dynamic factors from the landslide analysis model is leaf area index. The LAI offers the advantage of modelling not just the changes from different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using a NetCDF file format. The model is run in a monthly time step and for each time step all the parameters values, a-priory, conditional and posterior probability are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the records of the probabilities and parameters values for those times steps with the values of the active time step. Each time a landslide has been positive identified new a-priory probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis and a NETCDF file is created and it can be downloaded from the server with the log file
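
    Since the LAI time series is supplied as a NetCDF file, a small hedged sketch of reading such an input may be useful. The variable and dimension names (lai, time) and the file name are illustrative assumptions, not the schema used by the service.

        # Hedged sketch: read a monthly LAI stack from NetCDF and summarize each step.
        import numpy as np
        from netCDF4 import Dataset

        with Dataset("lai_monthly.nc") as ds:
            lai = ds.variables["lai"][:]      # assumed shape: (time, y, x)
            time = ds.variables["time"][:]

        for step, grid in zip(time, lai):
            print(f"t={step}: mean LAI {np.nanmean(grid):.2f}, max {np.nanmax(grid):.2f}")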

  1. 78 FR 77155 - Grant Program To Assess, Evaluate, and Promote Development of Tribal Energy and Mineral Resources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...

  2. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes the data and exports it to Elasticsearch. ES is responsible for centralized data storage. Data accumulated in ES can be viewed with Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks and the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
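
    For orientation, the sketch below shows the bare essence of what the Filebeat/Logstash pipeline automates: indexing one parsed log record into Elasticsearch through its REST API. It assumes a locally running Elasticsearch instance on port 9200; the index name and field names are illustrative, not the project's actual mapping.

        # Sketch: index one parsed log record into Elasticsearch over its REST API.
        import json
        import urllib.request

        record = {
            "@timestamp": "2018-05-01T12:00:00Z",
            "component": "panda-server",            # illustrative field names
            "level": "INFO",
            "message": "job assigned to site EXAMPLE_SITE",
        }

        req = urllib.request.Request(
            "http://localhost:9200/panda-logs-2018.05.01/_doc",   # assumed index name
            data=json.dumps(record).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:   # returns the indexing result
            print(json.load(resp)["result"])        # e.g. "created"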

  3. 20 CFR 658.414 - Referral of non-JS-related complaints.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... applicable, were referred on the complaint log specified in § 658.410(c)(1). The JS official shall also prepare and keep the file specified in § 658.410(c)(3) for the complaints filed pursuant to paragraph (a...

  4. 78 FR 49506 - E.ON Global Commodities North America LLC; Supplemental Notice That Initial Market-Based Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  5. 78 FR 63977 - Enable Bakken Crude Services, LLC; Notice of Request For Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... person desiring to intervene or to protest in this proceedings must file in accordance with Rules 211 and... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  6. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  7. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  8. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real- Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, a fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from the TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. For this patient, log files of the three-dimensional coordinates of the fiducial marker used as the internal surrogate were acquired with the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. Data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated from the log files (E_log). The fiducial marker in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm. The differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating using the RTRT showed a mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining the TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.
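
    The marker extraction step relies on template matching, which can be illustrated with a generic OpenCV sketch. This is not the authors' in-house software; the image file names are placeholders, and normalized cross-correlation is used here simply as a common choice when image contrast is low.

        # Generic sketch of marker detection by template matching on an EPID frame.
        import cv2

        frame = cv2.imread("epid_frame.png", cv2.IMREAD_GRAYSCALE)        # placeholder files
        template = cv2.imread("marker_template.png", cv2.IMREAD_GRAYSCALE)

        # Normalized cross-correlation score map over all template positions.
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)

        h, w = template.shape
        center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
        print("marker at", center, "score %.2f" % max_val)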

  9. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from log files. Results showed that the dose rate varied among six available dose rates, and that gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from the log files. A quality assurance procedure should be carried out for the VMAT-related parameters.
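
    A hedged sketch of such a log-file check is given below: derive gantry and leaf speeds from successive control-point samples and flag values outside chosen tolerances. The CSV layout, the single leaf column, and the limits are illustrative assumptions, not the Elekta log format or the machine's specifications.

        # Hedged sketch: derive gantry/leaf speeds from logged samples and flag outliers.
        import numpy as np

        log = np.loadtxt("vmat_arc_log.csv", delimiter=",", skiprows=1)
        t, gantry, leaf = log[:, 0], log[:, 1], log[:, 2]   # s, degrees, mm (one leaf)

        dt = np.diff(t)
        gantry_speed = np.abs(np.diff(gantry)) / dt          # deg/s
        leaf_speed = np.abs(np.diff(leaf)) / dt               # mm/s

        MAX_GANTRY = 6.0    # deg/s, assumed tolerance
        MAX_LEAF = 25.0     # mm/s, assumed tolerance

        print("max gantry speed %.2f deg/s" % gantry_speed.max())
        print("max leaf speed   %.2f mm/s" % leaf_speed.max())
        print("violations:", int((gantry_speed > MAX_GANTRY).sum() + (leaf_speed > MAX_LEAF).sum()))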

  10. Archive of digital boomer seismic reflection data collected during USGS field activities 95LCA03 and 96LCA02 in the Peace River of West-Central Florida, 1995 and 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.

    2006-01-01

    In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  11. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas to new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and then the researcher can interactively modify or add control points to it. The delivered beam can be verified by reading generated images and analyzing trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes. The analyzer gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages versus standalone software are i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools like Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee at Varian Medical Systems, Palo Alto.

  12. The Feasibility of Using Cluster Analysis to Examine Log Data from Educational Video Games. CRESST Report 790

    ERIC Educational Resources Information Center

    Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.

    2011-01-01

    Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…

  13. A Kinect-based system for automatic recording of some pigeon behaviors.

    PubMed

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M

    2015-12-01

    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.
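
    As a worked illustration of the sensitivity and precision figures quoted above, the sketch below matches detected events to reference (contact-switch) events within a time tolerance and then computes sensitivity = TP/(TP+FN) and precision = TP/(TP+FP). The timestamps and the 0.2 s window are made-up values for demonstration, not data from the study.

        # Sketch: match detected events to reference events and compute
        # sensitivity and precision.
        def match_events(reference, detected, tol=0.2):
            ref, det = sorted(reference), sorted(detected)
            tp, i, j = 0, 0, 0
            while i < len(ref) and j < len(det):
                if abs(ref[i] - det[j]) <= tol:
                    tp += 1; i += 1; j += 1
                elif det[j] < ref[i] - tol:
                    j += 1            # detection with no nearby reference event
                else:
                    i += 1            # reference event that was missed
            fn = len(ref) - tp
            fp = len(det) - tp
            return tp / (tp + fn), tp / (tp + fp)

        sens, prec = match_events([1.0, 2.5, 4.0, 6.2], [1.05, 2.45, 3.3, 6.18])
        print(f"sensitivity {sens:.2f}, precision {prec:.2f}")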

  14. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  15. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  16. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  17. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  18. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  19. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  20. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  1. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  2. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  3. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  4. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).
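
    The core of the reconstruction idea, accumulating dose-weighted open time from logged leaf positions, can be sketched in a few lines. This is emphatically not the DynaLog file format or the authors' MATLAB code: the snapshot array below is a made-up stand-in for (cumulative meterset fraction, bank A edge, bank B edge) samples of a single leaf pair, and a real implementation would repeat this per leaf pair and feed the fluence into the treatment planning system.

        # Heavily hedged sketch: build a 1D fluence profile for one leaf pair from
        # per-snapshot aperture edges weighted by the meterset delivered per interval.
        import numpy as np

        # columns: cumulative meterset fraction, bank A edge (mm), bank B edge (mm)
        snapshots = np.array([
            [0.00, -20.0, -10.0],
            [0.25, -15.0,   0.0],
            [0.50,  -5.0,  10.0],
            [0.75,   0.0,  20.0],
            [1.00,  10.0,  25.0],
        ])

        grid = np.arange(-40.0, 40.0, 1.0)     # 1 mm fluence grid along the leaf pair
        fluence = np.zeros_like(grid)

        for k in range(1, len(snapshots)):
            d_mu = snapshots[k, 0] - snapshots[k - 1, 0]    # meterset in this interval
            left, right = snapshots[k, 1], snapshots[k, 2]  # aperture at interval end
            fluence += d_mu * ((grid >= left) & (grid <= right))

        # Sum over 1 mm grid points = meterset-weighted open width of the leaf pair.
        print("MU-weighted open width: %.1f mm" % fluence.sum())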

  5. 40 CFR 60.288a - Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test to generate a submission package file, which documents performance test data. You must then submit the file generated by the ERT through the EPA's Compliance and Emissions Data Reporting Interface (CEDRI), which can be accessed by logging in to the EPA's Central Data Exchange (CDX) (https://cdx.epa...

  6. Examining the Return on Investment of a Security Information and Event Management Solution in a Notional Department of Defense Network Environment

    DTIC Science & Technology

    2013-06-01

    collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms

  7. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activities 93LCA01 and 94LCA01 in Kingsley, Orange, and Lowry Lakes, Northeast Florida, 1993 and 1994

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2004-01-01

    In August and September of 1993 and January of 1994, the U.S. Geological Survey, under a cooperative agreement with the St. Johns River Water Management District (SJRWMD), conducted geophysical surveys of Kingsley Lake, Orange Lake, and Lowry Lake in northeast Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, observer's logbook, Field Activity Collection System (FACS) logs, and formal FGDC metadata. A filtered and gained GIF image of each seismic profile is also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. The data archived here were collected under a cooperative agreement with the St. Johns River Water Management District as part of the USGS Lakes and Coastal Aquifers (LCA) Project. For further information about this study, refer to http://coastal.er.usgs.gov/stjohns, Kindinger and others (1994), and Kindinger and others (2000). The USGS Florida Integrated Science Center (FISC) - Coastal and Watershed Studies in St. Petersburg, Florida, assigns a unique identifier to each cruise or field activity. For example, 93LCA01 tells us the data were collected in 1993 for the Lakes and Coastal Aquifers (LCA) Project and the data were collected during the first field activity for that project in that calendar year. For a detailed description of the method used to assign the field activity ID, see http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html. The boomer is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled at the sea surface and when discharged emits a short acoustic pulse, or shot, that propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (e.g., 0.5 s) and recorded for specific intervals of time (e.g., 100 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Acquisition geometry for 94LCA01 is recorded in the operations logbook. No logbook exists for 93LCA01. Table 1 displays acquisition parameters for both field activities. For more information about the acquisition equipment used, refer to the FACS equipment logs. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for more information about these files. Processed profiles can be viewed as GIF images from the Profiles page. Refer to the Software page for details about the processing and examples of the processing scripts. Detailed information about the navigation systems used for each field activity can be found in Table 1 and the FACS equipment logs. 
To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. The original trace files were recorded in nonstandard ELICS format and later converted to standard SEG-Y format. The original trace files for 94LCA01 lines ORJ127_1, ORJ127_3, and ORJ131_1 were divided into two or more trace files (e.g., ORJ127_1 became ORJ127_1a and ORJ127_1b) because the original total number of traces exceeded the maximum allowed by the processing system. Digital data were not recoverable for 93LCA

  8. Archive of Side Scan Sonar and Swath Bathymetry Data collected during USGS Cruise 10CCT02 Offshore of Petit Bois Island Including Petit Bois Pass, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.

    2011-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  9. Developing a Complete and Effective ACT-R Architecture

    DTIC Science & Technology

    2008-01-01

    of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an

  10. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  11. Use patterns of health information exchange through a multidimensional lens: conceptual framework and empirical validation.

    PubMed

    Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior

    2014-12-01

    Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.
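
    As an illustration of the multidimensional-space idea described above, the sketch below builds one feature vector per session from the five attributes and clusters them with k-means to surface candidate archetypes. It is a minimal sketch, not the authors' method: the input file hie_sessions.csv, its column names, and the choice of five clusters are all hypothetical.

        # Sketch: position HIE sessions in the five-dimensional space described above
        # and look for archetypes of use with k-means (column names are hypothetical).
        import pandas as pd
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        cols = ["volume", "diversity", "granularity", "duration", "content"]
        sessions = pd.read_csv("hie_sessions.csv")            # one row per ED session (hypothetical file)

        X = StandardScaler().fit_transform(sessions[cols])    # put the five attributes on a common scale
        km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

        sessions["archetype"] = km.labels_
        print(sessions.groupby("archetype")[cols].mean())     # profile of each candidate use pattern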

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.

  13. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  14. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  15. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  16. 29 CFR 1960.28 - Employee reports of unsafe or unhealthful working conditions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... report of an existing or potential unsafe or unhealthful working condition should be recorded on a log maintained at the establishment. If an agency finds it inappropriate to maintain a log of written reports at... sequentially numbered case file, coded for identification, should be assigned for purposes of maintaining an...

  17. 20 CFR 658.422 - Handling of non-JS-related complaints by the Regional Administrator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... non-JS-related complaints alleging violations of employment related laws shall be logged. The... which the complainant (or complaint) was referred on a complaint log, similar to the one described in § 658.410(c)(1). The appropriate regional official shall also prepare and keep the file specified in...

  18. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  19. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
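
    The sketch below is only a toy illustration of the rule-based idea, not LogScope's actual specification language: a monitor scans parsed log events and reports any dispatched command that is never completed. The log format, event names, and the check_dispatch_completes rule are hypothetical.

        # Toy illustration of rule-based log checking in the spirit described above
        # (not LogScope's specification language): a monitor advances on matching
        # events and reports violations at the end of the log.
        import re

        LOG = [
            "COMMAND dispatch name=TAKE_IMAGE id=7",
            "COMMAND complete name=TAKE_IMAGE id=7",
            "COMMAND dispatch name=MOVE_ARM id=8",
        ]

        def check_dispatch_completes(lines):
            """Rule: every dispatched command id must eventually be completed."""
            pending = {}
            for line in lines:
                m = re.search(r"COMMAND (dispatch|complete) name=(\S+) id=(\d+)", line)
                if not m:
                    continue
                kind, name, cid = m.groups()
                if kind == "dispatch":
                    pending[cid] = name
                else:
                    pending.pop(cid, None)
            return [f"command {name} (id={cid}) never completed" for cid, name in pending.items()]

        print(check_dispatch_completes(LOG))   # -> ["command MOVE_ARM (id=8) never completed"]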

  20. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    DTIC Science & Technology

    2001-11-01

    that there were no target misses. The Hellfire missile does not have a depleted uranium head. 2.2.2.3 Tank movement. During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log

  1. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos normally runs only through character user interfaces (CUI) under operating systems such as OS X or Linux, which is not user friendly: users must be expert at image processing of electron micrographs and also have some knowledge of computer science. However, not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that it gives Eos a GUI but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the browser. Eos has more than 400 commands related to image processing for electron microscopy, and each command is used differently. Since the beginning of development, Eos has managed its user interface through an interface definition file, "OptionControlFile", written in CSV (comma-separated value) format; each command has an OptionControlFile that records the information needed to generate its interface and describe its usage. Because this mechanism is mature and convenient, the GUI system we developed, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the OptionControlFile and produces a web user interface automatically. The basic client-side actions were implemented and support auto-generation of web forms with functions for command execution, image preview, and file upload to a web server, so the system can execute Eos commands with the options specific to each command and carry out image analysis. Two problems remained: the image file format used for visualization and the workspace used for analysis. File format information is needed to check whether input and output files are correct, and a common workspace is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, connects to it through a web browser, and manipulates local files through the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in heterogeneous distributed environments. Users can place resources such as microscope images and text files into the server-side environment supported by PIONE, and experts can write PIONE rule definitions that define an image processing workflow; PIONE then runs each processing step on a suitable computer, following the defined rules. PIONE also supports interactive manipulation, so a user can try a command with various settings. In this setting, we contribute the auto-generation of a GUI for a PIONE workflow. As an advanced function, we developed a module that logs user actions. The logs include information such as the parameter values used in image processing and the sequence of commands executed. Used effectively, these logs offer many advantages: for example, when an expert discovers some know-how about image processing, other users can share logs that capture that know-how, and by analyzing the logs we may derive recommended image analysis workflows. We have also developed the system infrastructure needed to build a social platform of image processing for electron microscopists. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
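
    The sketch below illustrates the auto-generation idea in general terms: read a CSV interface definition and emit a simple web form. It is a hedged sketch only; the column layout, the sample options, and the command name mrcImageSmoothing are hypothetical and do not reproduce the actual Eos OptionControlFile schema or the Zephyr code.

        # Sketch of auto-generating a web form from a CSV interface definition.
        # The column layout below is hypothetical, not the real OptionControlFile schema.
        import csv
        import io

        OPTION_CONTROL = """option,type,default,description
        -i,infile,,input image file
        -o,outfile,,output image file
        -sigma,float,1.5,smoothing width
        """

        def form_from_options(csv_text, command):
            rows = csv.DictReader(io.StringIO(csv_text.replace("        ", "")))
            fields = []
            for r in rows:
                input_type = "file" if r["type"] in ("infile", "outfile") else "text"
                fields.append(
                    f'<label>{r["option"]} ({r["description"]}) '
                    f'<input type="{input_type}" name="{r["option"]}" value="{r["default"]}"></label>'
                )
            return f'<form action="/run/{command}" method="post">' + "".join(fields) + "<button>Run</button></form>"

        print(form_from_options(OPTION_CONTROL, "mrcImageSmoothing"))   # hypothetical command name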

  2. Analysis of the access patterns at GSFC distributed active archive center

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore; Bedet, Jean-Jacques

    1996-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion and distribution. In addition, several log files which record transactions on Unitree are maintained and periodically examined. This study identifies some of the users' requests and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) Unitree. The results show that most of the data volume ordered was for two data products. The volume was also mostly made up of level 3 and 4 data and most of the volume was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America although there was significant worldwide use. There was a wide range of request sizes in terms of volume and number of files ordered. On average, 78.6 files were ordered per request. Using the data managed by Unitree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2 bin was found to be the best for this workload, but the STbin algorithm also worked well.
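
    As a minimal illustration of how a caching policy can be replayed against a request trace to measure its hit rate, the sketch below implements a plain LRU cache; it does not reproduce the LRU/2 bin or STbin algorithms evaluated in the study, and the trace and capacity are hypothetical.

        # Replay a file-request trace against a fixed-size LRU cache and report the
        # hit rate -- a plain LRU baseline, not the LRU/2 bin or STbin variants.
        from collections import OrderedDict

        def lru_hit_rate(requests, capacity):
            cache = OrderedDict()          # file id -> None, ordered by recency
            hits = 0
            for f in requests:
                if f in cache:
                    hits += 1
                    cache.move_to_end(f)   # mark as most recently used
                else:
                    cache[f] = None
                    if len(cache) > capacity:
                        cache.popitem(last=False)   # evict least recently used
            return hits / len(requests)

        trace = ["a", "b", "a", "c", "a", "b", "d", "a"]   # hypothetical request trace
        print(f"hit rate: {lru_hit_rate(trace, capacity=2):.2f}")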

  3. 25 CFR 215.23 - Cooperation between superintendent and district mining supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... notices, reports, drill logs, maps, and records, and all other information relating to mining operations required by said regulations to be submitted by lessees, and shall maintain a file thereof for the superintendent. (b) The files of the Geological Survey supervisor relating to lead and zinc leases of Quapaw...

  4. Agentless Cloud-Wide Monitoring of Virtual Disk State

    DTIC Science & Technology

    2015-10-01

    packages include Apache, MySQL, PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL, PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many

  5. Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).

    DTIC Science & Technology

    1985-01-01

    QUEUEASE. LAST-KEY (QUEENAME) . LASTREI.TIONI(QUEUE-NAME). FILE-NODE. PORN . ATTRIBUTTES. ACCESSCONTROL. LEVEL); CLOSE (QUEUE BASE); CLOSE(FILE NODE...PROPOSED XIIT-STD-C.4 31 J NNUAfY logs procedure zTERT (ITERATOR: out NODE ITERATON; MAMIE: NAME STRING.KIND: NODE KID : KEY : RELATIONSHIP KEY PA1TTE1 :R

  6. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  7. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  8. 49 CFR Appendix A to Part 225 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... $1,000 $2,000 225.11Reports of accidents/ incidents 2,500 5,000 225.12(a): Failure to file Railroad... noncompliance: (1) a missing or incomplete log entry for a particular employee's injury or illness; or (2) a missing or incomplete log record for a particular rail equipment accident or incident. Each day a...

  9. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  10. Consistency of Students' Pace in Online Learning

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2009-01-01

    The purpose of this study is to investigate the consistency of students' behavior regarding their pace of actions over sessions within an online course. Pace in a session is defined as the number of logged actions divided by session length (in minutes). Log files of 6,112 students were collected, and datasets were constructed for examining pace…
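
    A minimal sketch of the pace measure as defined above (logged actions divided by session length in minutes), with consistency summarized as the spread of a student's pace across sessions; the session records and field layout are hypothetical.

        # Pace = logged actions / session length in minutes; consistency summarized
        # here as the standard deviation of pace across a student's sessions.
        import statistics

        sessions = [  # (student_id, actions, session_minutes) -- hypothetical log summary
            ("s1", 30, 15), ("s1", 42, 20), ("s1", 28, 14),
            ("s2", 10, 20), ("s2", 55, 25),
        ]

        pace_by_student = {}
        for student, actions, minutes in sessions:
            pace_by_student.setdefault(student, []).append(actions / minutes)

        for student, paces in pace_by_student.items():
            spread = statistics.stdev(paces) if len(paces) > 1 else 0.0
            print(f"{student}: mean pace {statistics.mean(paces):.2f}/min, spread {spread:.2f}")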

  11. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the

  12. Sawmill: A Logging File System for a High-Performance RAID Disk Array

    DTIC Science & Technology

    1995-01-01

    from limiting disk performance, new controller architectures connect the disks directly to the network so that data movement bypasses the file server...These developments raise two questions for file systems: how to get the best performance from a RAID, and how to use such a controller architecture...the RAID-II storage system; this architecture provides a fast data path that moves data rapidly among the disks, high-speed controller memory, and the

  13. 32 CFR 776.80 - Initial screening and Rules Counsel.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Director, JA Division, HQMC, to JAR. (b) JAG(13) and JAR shall log all complaints received and will ensure... within 30 days of the date of its return, the Rules Counsel may close the file without further action... action to close the file. (2) Complaints that comply with the requirements shall be further reviewed by...

  14. Online Courses Assessment through Measuring and Archetyping of Usage Data

    ERIC Educational Resources Information Center

    Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros

    2016-01-01

    Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data which usually remain unexploited. A new methodology is proposed in order to analyse these data on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…

  15. SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Shimizu, E; Matsunaga, K

    2014-06-01

    Purpose: Successful VMAT delivery requires precise modulation of dose rate, gantry rotation, and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze because they vary with the MU delivered and the leaf number. In this study, we calculated an integrated fluence error image (IFEI) from log files and evaluated plan quality over the area scanned by all MLC leaves and by individual leaves. Methods: The log file reported the expected and actual positions of the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on an Elekta Synergy. These data were imported into in-house software developed to calculate expected and actual fluence images from the difference of the opposing leaf trajectories and the dose fraction at each time point. The IFEI was obtained by summing the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) errors in the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5, and 3 MU were 98.5, 86.7, and 68.1%, and 95% of the area had an error of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms errors were 2.1 - 3.0 and 3.1 - 4.0 MU, the areas with errors below 10, 5, and 3 MU were 97.6 - 99.5, 81.7 - 89.5, and 51.2 - 72.8%, and 95% of the area had an error of less than 6.6 - 8.2 MU. Conclusion: Analysis of the IFEI reconstituted from log files provided detailed information about the delivery over the area scanned by all MLC leaves and by individual leaves.
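
    The sketch below illustrates the fluence-accumulation idea in simplified form: at each log sample an aperture row opens between the opposing leaf positions, weighted by that sample's dose fraction, and the IFEI accumulates the absolute per-sample differences. It is an assumption-laden sketch, not the authors' in-house software; the bixel grid, leaf count, and synthetic errors are hypothetical.

        # Simplified IFEI sketch: per-sample fluence rows open between opposing leaf
        # positions, weighted by dose fraction; accumulate |expected - actual|.
        import numpy as np

        def ifei(exp_a, exp_b, act_a, act_b, dose_fraction, width_bixels=80):
            """Accumulate absolute fluence differences over all log samples."""
            samples, pairs = exp_a.shape
            err = np.zeros((pairs, width_bixels))
            x = np.arange(width_bixels)
            for t in range(samples):
                for p in range(pairs):
                    f_exp = dose_fraction[t] * ((x >= exp_a[t, p]) & (x < exp_b[t, p]))
                    f_act = dose_fraction[t] * ((x >= act_a[t, p]) & (x < act_b[t, p]))
                    err[p] += np.abs(f_exp - f_act)
            return err

        rng = np.random.default_rng(0)
        exp_a = rng.uniform(10, 30, size=(100, 20))            # expected A-bank leaf positions (bixels)
        exp_b = exp_a + rng.uniform(5, 20, size=(100, 20))     # expected B-bank positions
        act_a = exp_a + rng.normal(0, 0.3, size=(100, 20))     # actual positions with small errors
        act_b = exp_b + rng.normal(0, 0.3, size=(100, 20))
        dose = np.full(100, 1.0)                               # MU delivered per 0.25 s sample

        image = ifei(exp_a, exp_b, act_a, act_b, dose)
        print(f"mean error {image.mean():.2f} MU; area with error <= 3 MU: {100 * (image <= 3).mean():.1f}%")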

  16. Comparing image search behaviour in the ARRS GoldMiner search engine and a clinical PACS/RIS.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Do, Bao; Rubin, Daniel; Kahn, Charles E; Müller, Henning

    2015-08-01

    Information search has changed the way we manage knowledge and the ubiquity of information access has made search a frequent activity, whether via Internet search engines or increasingly via mobile devices. Medical information search is in this respect no different and much research has been devoted to analyzing the way in which physicians aim to access information. Medical image search is a much smaller domain but has gained much attention as it has different characteristics than search for text documents. While web search log files have been analysed many times to better understand user behaviour, the log files of hospital internal systems for search in a PACS/RIS (Picture Archival and Communication System, Radiology Information System) have rarely been analysed. Such a comparison between a hospital PACS/RIS search and a web system for searching images of the biomedical literature is the goal of this paper. The objectives are to identify similarities and differences in search behaviour of the two systems, which could then be used to optimize existing systems and build new search engines. Log files of the ARRS GoldMiner medical image search engine (freely accessible on the Internet) containing 222,005 queries, and log files of Stanford's internal PACS/RIS search called radTF containing 18,068 queries were analysed. Each query was preprocessed and all query terms were mapped to the RadLex (Radiology Lexicon) terminology, a comprehensive lexicon of radiology terms created and maintained by the Radiological Society of North America, so the semantic content in the queries and the links between terms could be analysed, and synonyms for the same concept could be detected. RadLex was mainly created for use in radiology reports, to aid structured reporting and the preparation of educational material (Langlotz, 2006) [1]. In standard medical vocabularies such as MeSH (Medical Subject Headings) and UMLS (Unified Medical Language System) specific terms of radiology are often underrepresented; therefore, RadLex was considered the best option for this task. The results show a surprising similarity between the usage behaviour in the two systems, but several subtle differences can also be noted. The average number of terms per query is 2.21 for GoldMiner and 2.07 for radTF, the RadLex axes used (anatomy, pathology, findings, …) have almost the same distribution with clinical findings being the most frequent and the anatomical entity the second; also, combinations of RadLex axes are extremely similar between the two systems. Differences include longer sessions in radTF than in GoldMiner (3.4 and 1.9 queries per session on average). Several frequent search terms overlap but some strong differences exist in the details. In radTF the term "normal" is frequent, whereas in GoldMiner it is not. This makes intuitive sense, as in the literature normal cases are rarely described whereas in clinical work the comparison with normal cases is often a first step. The general similarity in many points is likely due to the fact that users of the two systems are influenced by their daily behaviour in using standard web search engines and follow this behaviour in their professional search. This means that many results and insights gained from standard web search can likely be transferred to more specialized search systems. Still, specialized log files can be used to find out more about reformulations and detailed strategies of users to find the right content. Copyright © 2015 Elsevier Inc. All rights reserved.
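
    A minimal sketch of the kind of log-file statistics reported above (terms per query and queries per session), with a tiny synonym map standing in for the RadLex mapping; the log excerpt and dictionary are hypothetical.

        # Terms per query and queries per session from a generic query log, with a
        # toy synonym map standing in for the RadLex term mapping.
        from collections import defaultdict

        RADLEX_LIKE = {"ca": "carcinoma", "mets": "metastasis", "cxr": "chest radiograph"}

        queries = [  # (session_id, query text) -- hypothetical log excerpt
            ("s1", "lung ca"), ("s1", "lung mets"), ("s2", "normal cxr"),
        ]

        term_counts, sessions = [], defaultdict(int)
        for sid, text in queries:
            terms = [RADLEX_LIKE.get(t, t) for t in text.lower().split()]
            term_counts.append(len(terms))
            sessions[sid] += 1

        print(f"avg terms/query: {sum(term_counts) / len(term_counts):.2f}")
        print(f"avg queries/session: {sum(sessions.values()) / len(sessions):.2f}")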

  17. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  18. 9 CFR 381.204 - Marking of poultry products offered for entry; official import inspection marks and devices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Import Inspection Division, is on file at the import inspection facility where the inspection is to be... stamping log containing the following information for each lot of product: the date of inspection, the... container marks, and the MP-410 number covering the product to be inspected. The daily stamping log must be...

  19. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  20. Parallel file system with metadata distributed across partitioned key-value store c

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).

  1. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activity 08LCA04 in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, Central Florida, September 2008

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2009-01-01

    From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  2. Archive of digital chirp subbottom profile data collected during USGS Cruise 13GFP01, Brownlee Dam and Hells Canyon Reservoir, Idaho and Oregon, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.

    2014-01-01

    From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.

  3. Archive of digital chirp subbottom profile data collected during USGS cruise 11BIM01 Offshore of the Chandeleur Islands, Louisiana, June 2011

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Miselis, Jennifer L.; Flocks, James G.; Wiese, Dana S.

    2013-01-01

    From June 3 to 13, 2011, the U.S. Geological Survey conducted a geophysical survey to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, LA. This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided.

  4. Ground-water data for the Hanna and Carbon basins, south-central Wyoming, through 1980

    USGS Publications Warehouse

    Daddow, P.B.

    1986-01-01

    Groundwater resources in the Hanna and Carbon Basins of Wyoming were assessed in a study from 1974 through 1980 because of the development of coal mining in the area. Data collected from 105 wells during that study, including well-completion records, lithologic logs, and water levels, are presented. The data are from stock wells and from coal-test holes completed as observation wells by the U.S. Geological Survey. The data are mostly from mined coal-bearing formations: the Tertiary Hanna Formation and the Tertiary and Cretaceous Ferris Formation. Well-completion data and lithologic logs were collected on-site during drilling of the wells or from U.S. Geological Survey files, company records, Wyoming State Engineer well-permit files, and published reports. (USGS)

  5. VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)

    NASA Astrophysics Data System (ADS)

    Association of Universities For Research in Astronomy

    2018-01-01

    This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).

  6. No3CoGP: non-conserved and conserved coexpressed gene pairs.

    PubMed

    Mal, Chittabrata; Aftabuddin, Md; Kundu, Sudip

    2014-12-08

    By analyzing microarray data from different conditions, one can identify conserved and condition-specific genes and gene modules, and thus infer the underlying cellular activities. The available tools based on Bioconductor and R packages differ in how they extract differential coexpression and at what level they operate. There is a need for a user-friendly, flexible tool that can start the analysis from raw or preprocessed microarray data and can report different levels of useful information. We present GUI software, No3CoGP (Non-Conserved and Conserved Coexpressed Gene Pairs), which takes Affymetrix microarray data (.CEL files or log2-normalized .txt files) along with an annotation file (.csv), a Chip Definition File (CDF), and a probe file as inputs, and uses a network density cut-off and Fisher's z-test to extract biologically relevant information. It can identify four possible types of gene pairs based on their coexpression relationships: (i) a gene pair coexpressed in one condition but not in the other, (ii) a gene pair positively coexpressed in one condition but negatively coexpressed in the other, and gene pairs that are (iii) positively or (iv) negatively coexpressed in both conditions. Further, it can generate modules of coexpressed genes. The easy-to-use GUI enables researchers without knowledge of the R language to use No3CoGP, and use of one or more CPU cores, depending on availability, speeds up the program. The output files, stored in the respective directories under a user-defined project, allow researchers to unravel condition-specific functionality of genes, gene sets, or modules.
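
    The Fisher z-test for comparing a gene pair's correlation between two conditions is a standard statistical procedure; the sketch below shows that test in isolation and is not the No3CoGP implementation (the simulated expression values are hypothetical).

        # Fisher z-test for whether a gene pair's coexpression differs between two
        # conditions (illustrative only; not the No3CoGP code).
        import numpy as np
        from scipy import stats

        def fisher_z_diff(r1, n1, r2, n2):
            """Two-sided p-value for H0: the two Pearson correlations are equal."""
            z1, z2 = np.arctanh(r1), np.arctanh(r2)
            se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
            z = (z1 - z2) / se
            return 2 * stats.norm.sf(abs(z))

        rng = np.random.default_rng(1)
        g1_a = rng.normal(size=30); g2_a = g1_a + rng.normal(scale=0.3, size=30)   # correlated in condition A
        g1_b = rng.normal(size=30); g2_b = rng.normal(size=30)                     # uncorrelated in condition B

        r_a, _ = stats.pearsonr(g1_a, g2_a)
        r_b, _ = stats.pearsonr(g1_b, g2_b)
        print(f"r(A)={r_a:.2f}, r(B)={r_b:.2f}, p={fisher_z_diff(r_a, 30, r_b, 30):.3g}")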

  7. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for the actions they performed and must provide evidence, when required by legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the data needed to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the Healthcare Enterprise-Audit Trail and Node Authentication) as the log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
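
    The sketch below shows the general pattern of querying an RDF representation of audit events with SPARQL via rdflib; the namespace, properties, and the audit_events.ttl file are hypothetical placeholders rather than the IHE-ATNA ontology model built in the paper.

        # Querying an RDF graph of audit-trail events with SPARQL (rdflib).
        # The namespace and properties below are hypothetical placeholders.
        from rdflib import Graph

        g = Graph()
        g.parse("audit_events.ttl", format="turtle")   # hypothetical export of the audit trail

        QUERY = """
        PREFIX atna: <http://example.org/atna#>
        SELECT ?user ?action ?patient ?time WHERE {
            ?event atna:user ?user ;
                   atna:action ?action ;
                   atna:patient ?patient ;
                   atna:timestamp ?time .
            FILTER (?action = "read")
        }
        ORDER BY ?time
        """

        for row in g.query(QUERY):
            print(row.user, row.action, row.patient, row.time)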

  8. A Prototype Implementation of a Time Interval File Protection System in Linux

    DTIC Science & Technology

    2006-09-01

    when a user logs in, the /etc/passwd file is read by the system to get the user’s home directory. The user’s login shell then changes the directory...and don. • Users can be added with the command: # useradd -m <username> • Set the password by: # passwd <username> • Make a copy of the

  9. Archive of side scan sonar and swath bathymetry data collected during USGS cruise 10CCT01 offshore of Cat Island, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Wiese, Dana S.

    2010-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys east of Cat Island, Mississippi (fig. 1). The efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geological stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and provide protection for the historical Fort Massachusetts. For more information refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, surface images, and x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 10CCT01 tells us the data were collected in 2010 for the Coastal Change and Transport (CCT) study and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay Catamaran. Side scan sonar and interferometric swath bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the port side just slightly behind the vessel, close to the seafloor. The interferometric swath transducer was sled-mounted on a rail attached between the catamaran hulls. During the survey the sled was secured in position. Navigation was acquired with a CodaOctopus Octopus F190 Precision Attitude and Positioning System and differentially corrected with OmniSTAR. See the digital FACS equipment log for details about the acquisition equipment used. Both raw datasets were stored digitally and processed using CARIS HIPS and SIPS software at the USGS St. Petersburg Coastal and Marine Science Center. For more information on processing refer to the Equipment and Processing page. Post-processing of the swath dataset revealed a motion artifact that is attributed to movement, relative to the boat, of the pole to which the swath transducers were attached. The survey took place in the winter months, in which strong winds and rough waves contributed to a reduction in data quality. The rough seas contributed to both the movement of the pole and the very high noise base seen in the raw amplitude data of the side scan sonar. Chirp data were also collected during this survey and are archived separately.

  10. Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  11. The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching

    ERIC Educational Resources Information Center

    Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix

    2007-01-01

    The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…

  12. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  13. Family Child Care Inventory-Keeper: The Complete Log for Depreciating and Insuring Your Property. Redleaf Business Series.

    ERIC Educational Resources Information Center

    Copeland, Tom

    Figuring depreciation can be the most difficult aspect of filing tax returns for a family child care program. This inventory log for family child care programs is designed to assist in keeping track of the furniture, appliances, and other property used in the child care business; once these items have been identified, they can be deducted as…

  14. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
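
    A minimal sketch of the container idea: pack the model files and a manifest that lists them into a single ZIP. The manifest below is abbreviated and its format attributes are placeholders, not the full OMEX manifest schema; the file names and contents are hypothetical.

        # Build a minimal ZIP container holding model files plus a manifest listing
        # them (abbreviated manifest; not the complete OMEX schema).
        import zipfile

        files = {"model.xml": "<sbml>...</sbml>", "simulation.sedml": "<sedML>...</sedML>"}

        manifest = ['<?xml version="1.0" encoding="UTF-8"?>',
                    '<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">']
        for name in files:
            manifest.append(f'  <content location="./{name}" format="application/xml"/>')  # placeholder format
        manifest.append('</omexManifest>')

        with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as zf:
            zf.writestr("manifest.xml", "\n".join(manifest))
            for name, content in files.items():
                zf.writestr(name, content)

        print(zipfile.ZipFile("experiment.omex").namelist())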

  15. Wister, CA Downhole and Seismic Data

    DOE Data Explorer

    Akerley, John

    2010-12-18

    This submission contains Downhole geophysical logs associated with Wister, CA Wells 12-27 and 85-20. The logs include Spontaneous Potential (SP), HILT Caliper (HCAL), Gamma Ray (GR), Array Induction (AIT), and Neutron Porosity (NPOR) data. Also included are a well log, Injection Test, Pressure Temperature Spinner log, shut in temperature survey, a final well schematic, and files about the well's location and drilling history. This submission also contains data from a three-dimensional (3D) multi-component (3C) seismic reflection survey on the Wister Geothermal prospect area in the northern portion of the Imperial Valley, California. The Wister seismic survey area was 13.2 square miles. (Resistivity image logs (Schlumberger FMI) in 85-20 indicate that maximum horizontal stress (Shmax) is oriented NNE but that open fractures are oriented suboptimally).

  16. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs are clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify through a simple log description language how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
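
    As a rough illustration of two of the features listed above (peak-period histograms and per-cluster file statistics), the sketch below tallies activity by hour and summarizes file sizes in two fixed size bins; the real tool uses clustering algorithms and tightness measures rather than fixed bins, and the log records here are hypothetical.

        # Hourly activity histogram plus per-bin file statistics from a parsed
        # transfer log (fixed size bins stand in for the tool's clustering step).
        from collections import Counter
        import statistics

        transfers = [  # (hour of day, file size in MB, transfer seconds) -- hypothetical log
            (9, 12.0, 3.1), (10, 250.0, 40.2), (10, 8.5, 2.0), (14, 1024.0, 160.0), (14, 15.0, 4.4),
        ]

        print("activity by hour:", Counter(h for h, _, _ in transfers))

        clusters = {"small (<100 MB)": [], "large (>=100 MB)": []}
        for _, size, secs in transfers:
            key = "small (<100 MB)" if size < 100 else "large (>=100 MB)"
            clusters[key].append((size, secs))

        for name, rows in clusters.items():
            sizes = [s for s, _ in rows]
            times = [t for _, t in rows]
            print(f"{name}: n={len(rows)}, mean size {statistics.mean(sizes):.1f} MB, "
                  f"mean transfer {statistics.mean(times):.1f} s")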

  17. Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  18. Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  19. Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  20. Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe Moore

    This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  1. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
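
    The sketch below gives a minimal flavor of the two kinds of results described above: feature usage counts, and simple traversal patterns taken as transitions between consecutively accessed features within a session. The log layout and feature names are hypothetical, not the WebCIS log format.

        # Feature usage frequency and common feature-to-feature transitions
        # (traversal patterns) from a per-session access log.
        from collections import Counter, defaultdict

        log = [  # (session_id, feature accessed), in time order -- hypothetical
            ("s1", "labs"), ("s1", "radiology"), ("s1", "labs"),
            ("s2", "notes"), ("s2", "labs"), ("s2", "radiology"),
        ]

        feature_counts = Counter(feature for _, feature in log)

        by_session = defaultdict(list)
        for sid, feature in log:
            by_session[sid].append(feature)

        transitions = Counter()
        for features in by_session.values():
            transitions.update(zip(features, features[1:]))

        print("feature frequency:", feature_counts.most_common())
        print("common transitions:", transitions.most_common(3))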

  2. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This ensures that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine whether any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  3. Geohydrologic and water-quality characterization of a fractured-bedrock test hole in an area of Marcellus shale gas development, Bradford County, Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.

    2013-01-01

    Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.

  4. Archive of digital chirp subbottom profile data collected during USGS Cruise 13CCT04 offshore of Petit Bois Island, Mississippi, August 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Kindinger, Jack G.; Bernier, Julie C.; Kelso, Kyle W.; Wiese, Dana S.

    2015-01-01

    From August 13-23, 2013, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE), conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport offshore of Petit Bois Island, Mississippi. This investigation is part of a broader USGS study on Coastal Change and Transport (CCT). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP) with partial funding provided by the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided.

  5. Archive of Digital Chirp Sub-bottom Profile Data Collected During USGS Cruises 08CCT02 and 08CCT03, Mississippi Gulf Islands, July and September 2008

    USGS Publications Warehouse

    Barry, K.M.; Cavers, D.A.; Kneale, C.W.

    2011-01-01

    In July and September of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, MS, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. This project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the sub-bottom profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  6. Zebra: A striped network file system

    NASA Technical Reports Server (NTRS)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
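
    A minimal sketch of the parity idea described above: the parity fragment is the bitwise XOR of the data fragments in a stripe, so any single lost fragment can be rebuilt from the survivors. The fragment contents are hypothetical and the sketch ignores Zebra's log-structured layout.

        # Parity maintenance and reconstruction by XOR across stripe fragments.
        def xor_fragments(fragments):
            out = bytearray(len(fragments[0]))
            for frag in fragments:
                for i, b in enumerate(frag):
                    out[i] ^= b
            return bytes(out)

        # A client's write stream split into equal-sized stripe fragments (hypothetical).
        fragments = [b"file A blk", b"file B blk", b"log tail.."]
        parity = xor_fragments(fragments)

        # Server holding fragment 1 fails: rebuild it from the survivors plus parity.
        rebuilt = xor_fragments([fragments[0], fragments[2], parity])
        assert rebuilt == fragments[1]
        print("reconstructed:", rebuilt)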

  7. Archive of Digital boomer subbottom data collected during USGS cruises 99FGS01 and 99FGS02 offshore southeast and southwest Florida, July and November, 1999

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2013-01-01

    In July (19 - 26) and November (17 - 18) of 1999, the USGS, in cooperation with the Florida Geological Survey (FGS), conducted two geophysical surveys in: (1) the Atlantic Ocean offshore of Florida's east coast from Orchid to Jupiter, FL, and (2) the Gulf of Mexico offshore of Venice, FL. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the subbottom profiles are also provided. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, identifiers 99FGS01 and 99FGS02 refer to field data collected in 1999 for cooperative work with the FGS. The numbers 01 and 02 indicate the data were collected during the first and second field activities for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID).

  8. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Park, S

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient’s mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
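
    A minimal sketch of the kind of calculation this abstract describes: build per-fraction motion PDFs from logged marker positions and score their reproducibility with the Kullback-Leibler divergence. The 0.5 mm bin width, the synthetic position samples, and the function names are assumptions for illustration; this is not the study's in-house software.

      # Sketch: KL divergence between two tumor-motion PDFs built from logged marker
      # positions (bin width, epsilon and the random stand-in data are assumptions).
      import numpy as np

      def motion_pdf(positions_mm, bin_edges):
          """Histogram of logged positions, normalized to a discrete PDF."""
          counts, _ = np.histogram(positions_mm, bins=bin_edges)
          return counts / counts.sum()

      def kl_divergence(p, q, eps=1e-12):
          """KL(P || Q) for discrete PDFs on the same bins; eps avoids log(0)."""
          p = np.asarray(p) + eps
          q = np.asarray(q) + eps
          return float(np.sum(p * np.log(p / q)))

      bin_edges = np.arange(-20.0, 20.5, 0.5)                              # 0.5 mm bins over +/-20 mm
      pdf_fraction1 = motion_pdf(np.random.normal(0, 3, 5000), bin_edges)  # stand-in for PDF1
      pdf_fractionN = motion_pdf(np.random.normal(1, 4, 5000), bin_edges)  # stand-in for PDFn
      Rn = kl_divergence(pdf_fraction1, pdf_fractionN)
      print(f"Fractional reproducibility Rn = {Rn:.3f}")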

  9. Cyber Fundamental Exercises

    DTIC Science & Technology

    2013-03-01

    the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ... 2.6.2 /etc/passwd and /etc/shadow The /etc/shadow file didn't exist on early Linux distributions. Originally only root could access the /etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under

  10. 47 CFR 22.359 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... + 10 log (P) dB. (b) Measurement procedure. Compliance with these rules is based on the use of... contract in their station files and disclose it to prospective assignees or transferees and, upon request...

  11. 7 CFR 274.5 - Record retention and forms security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... control logs, or similar controls from the point of initial receipt through the issuance and.... (2) For notices of change which initiate, update or terminate the master issuance file, the State...

  12. Network Basics.

    ERIC Educational Resources Information Center

    Tennant, Roy

    1992-01-01

    Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)

  13. Archive of digital Chirp sub-bottom profile data collected during USGS Cruise 07SCC01 offshore of the Chandeleur Islands, Louisiana, June 2007

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2010-01-01

    In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 tells us the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study and the data were collected during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high resolution, shallow penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov.
The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
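
    As a quick worked example of the recording windows quoted above, the two-way travel time of a Chirp return can be converted to an approximate depth below the towfish; the nominal 1500 m/s sound speed used here is an assumption (velocities in sediment differ).

      # Sketch: converting a Chirp two-way travel time window to an approximate depth,
      # assuming a nominal sound speed of 1500 m/s in water.
      SOUND_SPEED_M_PER_S = 1500.0

      def twt_to_depth_m(two_way_time_s: float, velocity=SOUND_SPEED_M_PER_S) -> float:
          """Depth below the towfish for a given two-way travel time."""
          return velocity * two_way_time_s / 2.0

      # A 50 ms record length corresponds to roughly 37.5 m below the towfish at 1500 m/s.
      print(twt_to_depth_m(0.050))  # -> 37.5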

  14. VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.

    2014-02-01

    The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y]=[log({epsilon}(X))-log({epsilon}(Y))]star - [log({epsilon}(X))-log({epsilon}(Y))]⊙ with log{epsilon}(X)=12+log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H]=log(N(Li))star-log(N(Li))⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li for which we adopt our derived values: log({epsilon}(Li))⊙=1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).

  15. SU-F-T-230: A Simple Method to Assess Accuracy of Dynamic Wave Arc Irradiation Using An Electronic Portal Imaging Device and Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirashima, H; Miyabe, Y; Yokota, K

    2016-06-15

    Purpose: The Dynamic Wave Arc (DWA) technique, where the multi-leaf collimator (MLC) and gantry/ring move simultaneously in a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position by EPID images, the MLC position with intentional errors was assessed. Tests were designed considering three factors: (1) accuracy of the MLC position, (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s), and ring speed (0.5–2.5°/s), and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of the machine accuracy, the MLC position and gantry/ring angle position were assessed using log files. Results: A 0.1 mm intentional error can be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs with different conditions of dose rate, gantry/ring speed, and MLC speed showed good agreement, with a root mean square (RMS) error of 0.76%. The RMS errors between the detected and recorded data were 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs in variable conditions during DWA irradiation can be easily detected using EPID measurements and log file analysis. The proposed method is useful for routine verification. This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.
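
    A minimal sketch of the RMS comparison described above, between positions recorded in the irradiation log and positions detected on EPID images. The numeric values and variable names are illustrative assumptions, not measured data.

      # Sketch: root-mean-square (RMS) error between log-recorded and EPID-detected
      # positions (the arrays below are illustrative, not study measurements).
      import numpy as np

      def rms_error(recorded, detected):
          recorded = np.asarray(recorded, dtype=float)
          detected = np.asarray(detected, dtype=float)
          return float(np.sqrt(np.mean((recorded - detected) ** 2)))

      logged_mlc_mm   = [12.0, 12.5, 13.1, 13.6, 14.2]   # hypothetical leaf positions from the log
      detected_mlc_mm = [12.1, 12.4, 13.2, 13.5, 14.3]   # hypothetical positions from EPID images
      print(f"MLC RMS error: {rms_error(logged_mlc_mm, detected_mlc_mm):.2f} mm")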

  16. VizieR Online Data Catalog: NGC 2264, NGC 2547 and NGC 2516 stellar radii (Jackson+, 2016)

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Randich, S.; Bragaglia, A.; Carraro, G.; Costado, M. T.; Flaccomio, E.; Lanzafame; Lardo, C.; Monaco, L.; Morbidelli, L.; Smiljanic, R.; Zaggia, S.

    2015-11-01

    File Table1.dat contains photometric and spectroscopic data of GES Survey targets in the clusters NGC 2547, NGC 2516, and NGC 2264, downloaded from the Edinburgh GES archive (http://ges/roe.ac.uk/). Photometric data comprise the (Cousins) I magnitude and 2MASS J, H and K magnitudes. Spectroscopic data comprise the signal to noise ratio, S/N, of the target spectrum, the radial velocity, RV (in km/s), the projected equatorial velocity, vsini (in km/s), the number of separate observations co-added to produce the target spectrum, and the log of effective temperature (logTeff) of the template spectrum fitted to measure RV and vsini. The absolute precision in RV, pRV (in km/s), and the relative precision in vsini (pvsini) were estimated, as a function of logTeff, vsini and S/N, using the prescription described in Jackson et al. (2015A&A...580A..75J, Cat. J/A+A/580/A75). File Table3.dat contains measured and calculated properties of cluster targets with resolved vsini and a reported rotation period. The cluster name, right ascension, RA (deg), and declination, Dec (deg), are given for targets with measured periods given in the literature. Dynamic properties comprise: the radial velocity, RV (in km/s), the absolute precision in RV, pRV (km/s), the projected equatorial velocity, vsini (in km/s), the relative precision in vsini (pvsini), and the rotational period (in days). Also shown are values of the absolute K magnitude, MK, the log of luminosity, log L (in solar units), and the probability of cluster membership estimated using cluster data given in the text. Period shows the reported value taken from the literature. Estimated values of the projected radius, Rsini (in Rsolar), and the uncertainty in projected radius, e_Rsini (in Rsolar), are given for targets where vsini>5km/s and pvsini>0.2. The final column shows a flag which is set to 1 for targets in cluster NGC 2264 where a (H-K) versus (J-H) colour-colour plot indicates possible infra-red excess. (2 data files).

  17. Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III

    2017-12-01

    Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on one single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may likely result in a less than optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. With users interacting with search engines, sufficient information is already hidden in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process and create training data from web logs in a real-time manner; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests of geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
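
    For reference, the two evaluation metrics named at the end of this abstract, precision at K and normalized discounted cumulative gain (NDCG) at K, can be computed as in the sketch below; the graded relevance values are illustrative assumptions.

      # Sketch: precision at K and NDCG at K for a ranked result list
      # (relevance grades below are illustrative).
      import numpy as np

      def precision_at_k(relevances, k):
          """Fraction of the top-k results that are relevant (grade > 0)."""
          top_k = np.asarray(relevances[:k])
          return float(np.mean(top_k > 0))

      def dcg_at_k(relevances, k):
          rel = np.asarray(relevances[:k], dtype=float)
          discounts = np.log2(np.arange(2, rel.size + 2))   # log2(rank + 1)
          return float(np.sum(rel / discounts))

      def ndcg_at_k(relevances, k):
          ideal_dcg = dcg_at_k(sorted(relevances, reverse=True), k)
          return dcg_at_k(relevances, k) / ideal_dcg if ideal_dcg > 0 else 0.0

      ranked_relevances = [3, 2, 0, 1, 0, 2]   # hypothetical graded relevance of returned datasets
      print(precision_at_k(ranked_relevances, 5), ndcg_at_k(ranked_relevances, 5))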

  18. 43 CFR 2743.3 - Leased disposal sites.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... review of all records and inspection reports on file with the Bureau of Land Management, State, and local... landfill concerning site management and a review of all reports and logs pertaining to the type and amount...

  19. 25 CFR 214.13 - Diligence; annual expenditures; mining records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... within 90 days after an ore body of sufficient quantity is discovered, and shown by the logs or records.... Lessee shall, before commencing operations, file with the superintendent a plat and preliminary statement...

  20. 47 CFR 22.861 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... below the transmitting power (P) by a factor of at least 43 + 10 log (P) dB. (b) Measurement procedure... maintain a copy of the contract in their station files and disclose it to prospective assignees or...

  1. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil and gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, well log curve display, etc.) but can also run in different operating systems and on different devices. In the article, a subject field analysis and task formulation are performed and the software design stage is considered. At the end of the work, the resulting software product interface is described.
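
    The abstract does not say which libraries the authors used, but a common way to load .las well-log data in Python is the third-party lasio package; the sketch below is a hedged illustration of that route, and the file name "example.las" and the "GR" curve are assumptions.

      # Sketch: loading curves from a .las file with the third-party lasio package
      # and drawing one log track (file name and curve mnemonic are assumptions).
      import lasio
      import matplotlib.pyplot as plt

      las = lasio.read("example.las")          # parse the LAS file (header + curve data)
      print(las.curves)                        # list available curves and their units

      depth = las.index                        # depth (or time) reference for the curves
      gamma = las["GR"]                        # gamma-ray curve, if present in the file

      plt.plot(gamma, depth)
      plt.gca().invert_yaxis()                 # well logs are conventionally drawn depth-down
      plt.xlabel("GR (API)")
      plt.ylabel("Depth")
      plt.show()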

  2. Archive of digital Boomer seismic reflection data collected during USGS Cruises 94CCT01 and 95CCT01, eastern Texas and western Louisiana, 1994 and 1995

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.

    2004-01-01

    In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  3. Archive of digital Boomer and Chirp seismic reflection data collected during USGS Cruises 01RCE05 and 02RCE01 in the Lower Atchafalaya River, Mississippi River Delta, and offshore southeastern Louisiana, October 23-30, 2001, and August 18-19, 2002

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.

    2004-01-01

    In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  4. Sediment data collected in 2010 from Cat Island, Mississippi

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Miselis, Jennifer L.; Kindinger, Jack G.

    2014-01-01

    Scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center, in collaboration with the U.S. Army Corps of Engineers, conducted geophysical and sedimentological surveys in 2010 around Cat Island, Mississippi, which is the westernmost island in the Mississippi-Alabama barrier island chain. The objective of the study was to understand the geologic evolution of Cat Island relative to other barrier islands in the northern Gulf of Mexico by identifying relationships between the geologic history, present day morphology, and sediment distribution. This data series serves as an archive of terrestrial and marine sediment vibracores collected August 4-6 and October 20-22, 2010, respectively. Geographic information system data products include marine and terrestrial core locations and 2007 shoreline data. Additional files include marine and terrestrial core description logs, core photos, results of sediment grain-size analyses, optically stimulated luminescence dating and carbon-14 dating locations and results, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  5. Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies

    PubMed Central

    Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton

    2016-01-01

    Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store application software for instant messaging, namely Facebook and Skype on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files. PMID:26982207

  6. Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies.

    PubMed

    Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton

    2016-01-01

    Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store application software for instant messaging, namely Facebook and Skype on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files.

  7. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load in multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  8. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn (ALICE Environment, http://alien.cern.ch) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets and a number of collaborating Web services which implement the authentication, job execution, file transport, performance monitor and event logging. In the paper we will present the architecture and components of the system.

  9. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be de-compressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
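
    A minimal sketch of the client-side pattern this abstract describes: compress each data chunk before handing it to the storage node, and decompress it on read. zlib stands in for whatever compressor a real burst-buffer client would use, and the in-memory dictionary is only a stand-in for the shared data object on a storage node.

      # Sketch: compress-on-write / decompress-on-read for chunks of a shared object.
      import zlib

      storage_node = {}   # stand-in for the shared data object on a storage node

      def write_chunk(shared_object: str, offset: int, chunk: bytes) -> None:
          """Client side: compress the chunk, then hand the compressed bytes to storage."""
          storage_node[(shared_object, offset)] = zlib.compress(chunk, level=6)

      def read_chunk(shared_object: str, offset: int) -> bytes:
          """Client side: fetch the compressed bytes and decompress them on read."""
          return zlib.decompress(storage_node[(shared_object, offset)])

      write_chunk("checkpoint.000", 0, b"x" * 1_000_000)
      assert read_chunk("checkpoint.000", 0) == b"x" * 1_000_000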

  10. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    For logging of board angle values during balance training, it is necessary to develop a measurement system. This study will provide data for a balance study using the smartcard. Data acquisition is performed automatically. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard read programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved as a log file for exact documentation. This system makes study development easy and time-saving.

  11. VizieR Online Data Catalog: GOALS sample PACS and SPIRE fluxes (Chu+, 2017)

    NASA Astrophysics Data System (ADS)

    Chu, J. K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Diaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.

    2017-06-01

    The IRAS RBGS contains 179 LIRGs (log(LIR/L⊙)=11.00-11.99) and 22 ultra-luminous infrared galaxies (ULIRGs: log(LIR/L⊙)>=12.0); these 201 total objects comprise the GOALS sample (Armus et al. 2009), a statistically complete flux-limited sample of infrared-luminous galaxies in the local universe. This paper presents imaging and photometry for all 201 LIRGs and LIRG systems in the IRAS RBGS that were observed during our GOALS Herschel OT1 program. (4 data files).

  12. The key image and case log application: new radiology software for teaching file creation and case logging that incorporates elements of a social network.

    PubMed

    Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David

    2014-07-01

    To create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communications systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface have been inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). One hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm with efforts made to maximize its ease of use and inclusion of characteristics inspired by social networking Web sites that allow the system additional functionality such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  13. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
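
    A minimal sketch of the first step described above, building an access correlation matrix from historical access log information. The log format, session grouping, and file names are assumptions for illustration; the paper's heuristic placement algorithm is not reproduced here.

      # Sketch: access-correlation matrix counting how often two image files are
      # requested within the same session (log format and names are hypothetical).
      from collections import defaultdict
      from itertools import combinations

      # hypothetical log entries: (session_id, requested_file)
      access_log = [
          ("s1", "tile_a.img"), ("s1", "tile_b.img"),
          ("s2", "tile_a.img"), ("s2", "tile_b.img"), ("s2", "tile_c.img"),
          ("s3", "tile_c.img"),
      ]

      sessions = defaultdict(set)
      for session_id, filename in access_log:
          sessions[session_id].add(filename)

      # correlation[f1][f2] = number of sessions in which f1 and f2 were accessed together
      correlation = defaultdict(lambda: defaultdict(int))
      for files in sessions.values():
          for f1, f2 in combinations(sorted(files), 2):
              correlation[f1][f2] += 1
              correlation[f2][f1] += 1

      # Strongly correlated files would then be placed on different storage nodes
      # so they can be fetched in parallel.
      print(dict(correlation["tile_a.img"]))   # {'tile_b.img': 2, 'tile_c.img': 1}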

  14. SU-E-T-144: Effective Analysis of VMAT QA Generated Trajectory Log Files for Medical Accelerator Predictive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axis positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on TG-142 and published analysis of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2 mm for the collimator jaw and MLC carriage exceeded the chart limits. Gantry speed and each MLC speed are analyzed at two different points in the delivery. Simulated gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. A gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.
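
    For readers unfamiliar with I/MR charts, the sketch below computes the standard 3-sigma Individual and Moving Range limits for one trajectory-log parameter; it does not include the hybrid specification-based limits mentioned above, and the sample values are illustrative assumptions.

      # Sketch: Individual / Moving Range (I/MR) control limits for one logged parameter
      # (e.g. a jaw position sampled once per delivery; the samples are illustrative).
      import numpy as np

      def imr_limits(samples):
          """Return (I-chart limits, MR-chart upper limit) using the usual I/MR constants."""
          x = np.asarray(samples, dtype=float)
          mr = np.abs(np.diff(x))                  # moving ranges between consecutive deliveries
          mr_bar = mr.mean()
          sigma_est = mr_bar / 1.128               # d2 constant for subgroup size 2
          i_limits = (x.mean() - 3 * sigma_est, x.mean() + 3 * sigma_est)
          mr_upper = 3.267 * mr_bar                # D4 constant for subgroup size 2
          return i_limits, mr_upper

      jaw_position_cm = [5.02, 5.01, 5.03, 5.00, 5.02, 5.01, 5.04, 5.02]
      (i_lcl, i_ucl), mr_ucl = imr_limits(jaw_position_cm)
      print(f"I chart: {i_lcl:.3f} to {i_ucl:.3f} cm, MR chart UCL: {mr_ucl:.3f} cm")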

  15. 47 CFR 22.917 - Emission limitations for cellular equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... frequency ranges must be attenuated below the transmitting power (P) by a factor of at least 43 + 10 log(P... such contract shall maintain a copy of the contract in their station files and disclose it to...

  16. Archive of digital Chirp subbottom profile data collected during USGS cruise 08CCT01, Mississippi Gulf Islands, July 2008

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Worley, Charles R.

    2011-01-01

    In July of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, Mississippi, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. Funding was provided through the Geologic Framework and Holocene Coastal Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php); this project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  17. 75 FR 76426 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ..., access control lists, file system permissions, intrusion detection and prevention systems and log..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN...

  18. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    DOE PAGES

    Chan, Anthony; Gropp, William; Lusk, Ewing

    2008-01-01

    A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
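
    The core idea, that the states overlapping an arbitrary time window can be located without scanning the whole file, can be illustrated with a flat, binary-search version; the hierarchical index described in the paper is what extends this to annotations and bounded blocks. The state list below is an illustrative assumption, not the paper's format.

      # Sketch: locate trace states overlapping a time window via binary search over
      # start times (a real hierarchical format indexes bounding intervals per block).
      import bisect

      # hypothetical (start_time, end_time, label) states, sorted by start time
      states = [(0.0, 1.0, "init"), (1.0, 4.0, "compute"), (4.0, 4.5, "mpi_wait"),
                (4.5, 9.0, "compute"), (9.0, 9.2, "io")]
      start_times = [s[0] for s in states]

      def states_in_window(t0, t1):
          """All states that overlap [t0, t1]."""
          lo = bisect.bisect_right(start_times, t0) - 1   # last state starting at or before t0
          lo = max(lo, 0)
          hits = []
          for state in states[lo:]:
              if state[0] > t1:
                  break
              if state[1] >= t0:                          # overlaps the window
                  hits.append(state)
          return hits

      print(states_in_window(3.5, 4.6))   # the two compute states and mpi_wait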

  19. Comparative evaluation of calcium hypochlorite and sodium hypochlorite associated with passive ultrasonic irrigation on antimicrobial activity of a root canal system infected with Enterococcus faecalis: an in vitro study.

    PubMed

    de Almeida, Ana Paula; Souza, Matheus Albino; Miyagaki, Daniela Cristina; Dal Bello, Yuri; Cecchin, Doglas; Farina, Ana Paula

    2014-12-01

    The purpose of this study was to compare in vitro the effectiveness of calcium hypochlorite (Ca[OCl]2) and sodium hypochlorite (NaOCl) associated with passive ultrasonic irrigation in root canals of bovine teeth infected with Enterococcus faecalis. The root canals of 60 single-rooted extracted bovine teeth were enlarged up to a size 45 file, autoclaved, inoculated with Enterococcus faecalis, and incubated for 30 days. The samples were divided into 6 groups (n = 10) according to the protocol for decontamination: G1: no treatment; G2: distilled water; G3: 2.5% NaOCl; G4: 2.5% Ca(OCl)2; G5: 2.5% NaOCl with ultrasonic activation; and G6: 2.5% Ca(OCl)2 with ultrasonic activation (US). Microbiological testing (colony-forming unit [CFU] counting) was performed to evaluate the effectiveness of the proposed treatments. Data were subjected to 1-way analysis of variance followed by the post hoc Tukey test (α = 0.05). Groups 1 and 2 showed the highest mean contamination (3.26 log10 CFU/mL and 2.69 log10 CFU/mL, respectively), which was statistically different from all other groups (P < .05). Group 6 (Ca[OCl]2 + US) showed the lowest mean contamination (1.00 log10 CFU/mL), with no statistically significant difference found in groups 3 (NaOCl), 4 (Ca[OCl]2), and 5 (NaOCl + US) (P < .05). Ca(OCl)2 as well as passive ultrasonic irrigation can aid in chemomechanical preparation, contributing in a significant way to the reduction of microbial content during root canal treatment. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  20. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results is explained. There are two approaches to capture provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a lot of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information, then attempt to find relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services or it can be logged to a file.

  1. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file based evidence typically produced by distributed applications. To achieve this, file based evidence is extracted and transformed into an intermediate data format inspired in part by W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy’s Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as other scientific applications that log provenance related information.

  2. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  3. Archive of digital boomer subbottom profile data collected in the Atlantic Ocean offshore northeast Florida during USGS cruises 03FGS01 and 03FGS02 in September and October of 2003

    USGS Publications Warehouse

    Calderon, Karynna; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2012-01-01

    In September and October of 2003, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey, conducted geophysical surveys of the Atlantic Ocean offshore northeast Florida from St. Augustine, Florida, to the Florida-Georgia border. This report serves as an archive of unprocessed digital boomer subbottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03FGS01 tells us the data were collected in 2003 as part of cooperative work with the Florida Geological Survey (FGS) and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by hydrophone receivers, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 seconds) and recorded for specific intervals of time (for example, 100 milliseconds). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Refer to the handwritten FACS operation log (PDF, 442 KB) for diagrams and descriptions of acquisition geometry, which varied throughout the cruises. Table 1 displays a summary of acquisition parameters. See the digital FACS equipment logs (PDF, 9-13 KB each) for details about the acquisition equipment used. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y (Barry and others, 1975) format (rev. 0), except for the first 3,200 bytes of the card image header, which are stored in ASCII format instead of the standard EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are Graphics Interchange Format (GIF) images that were filtered and gained using SU software. 
Refer to the Software page for details about the processing and links to example SU processing scripts and USGS software for viewing the SEG Y files (Zihlman, 1992).

  4. Gamma-index method sensitivity for gauging plan delivery accuracy of volumetric modulated arc therapy.

    PubMed

    Park, Jong In; Park, Jong Min; Kim, Jung-In; Park, So-Yeon; Ye, Sung-Joon

    2015-12-01

    The aim of this study was to investigate the sensitivity of the gamma-index method according to various gamma criteria for volumetric modulated arc therapy (VMAT). Twenty head and neck (HN) and twenty prostate VMAT plans were retrospectively selected for this study. Both global and local 2D gamma evaluations were performed with criteria of 3%/3 mm, 2%/2 mm, 1%/2 mm and 2%/1 mm. In this study, the global and local gamma-index calculated the differences in doses relative to the maximum dose and the dose at the current measurement point, respectively. Using log files acquired during delivery, the differences in parameters at every control point between the VMAT plans and the log files were acquired. The differences in dose-volumetric parameters between reconstructed VMAT plans using the log files and the original VMAT plans were calculated. The Spearman's rank correlation coefficients (rs) were calculated between the passing rates and those differences. Considerable correlations with statistical significances were observed between global 1%/2 mm, local 1%/2 mm and local 2%/1 mm and the MLC position differences (rs = -0.712, -0.628 and -0.581). The numbers of rs values with statistical significance between the passing rates and the changes in dose-volumetric parameters were largest in global 2%/2 mm (n = 16), global 2%/1 mm (n = 15) and local 2%/1 mm (n = 13) criteria. Local gamma-index method with 2%/1 mm generally showed higher sensitivity to detect deviations between a VMAT plan and the delivery of the VMAT plan. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
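
    A minimal sketch of the correlation analysis described above, using scipy's Spearman rank correlation between per-plan passing rates and log-file-derived MLC deviations; the numbers are illustrative assumptions, not the study's data.

      # Sketch: Spearman rank correlation between gamma passing rates and mean MLC
      # position deviations reconstructed from log files (illustrative values only).
      from scipy.stats import spearmanr

      passing_rate_percent = [98.5, 97.2, 95.1, 93.8, 91.0, 96.4]   # hypothetical local 2%/1 mm rates
      mlc_deviation_mm     = [0.15, 0.22, 0.35, 0.41, 0.55, 0.27]   # hypothetical mean |planned - logged|

      rs, p_value = spearmanr(passing_rate_percent, mlc_deviation_mm)
      print(f"rs = {rs:.3f}, p = {p_value:.4f}")   # a negative rs mirrors the trend reported above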

  5. Neuropsychological constraints to human data production on a global scale

    NASA Astrophysics Data System (ADS)

    Gros, C.; Kaczor, G.; Marković, D.

    2012-01-01

    Which are the factors underlying human information production on a global level? In order to gain an insight into this question we study a corpus of 252-633 million publicly available data files on the Internet corresponding to an overall storage volume of 284-675 Terabytes. Analyzing the file size distribution for several distinct data types we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining qualia.
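
    A minimal sketch of how the two candidate distributions named above might be compared on a set of file sizes: fit a log-normal by taking the mean and standard deviation of the log sizes, and inspect the empirical survival function for a power-law-like straight tail on log-log axes. The data here are synthetic, not the study corpus.

      # Sketch: log-normal fit versus power-law tail check for file sizes (synthetic data).
      import numpy as np

      rng = np.random.default_rng(0)
      file_sizes = rng.lognormal(mean=13.0, sigma=2.0, size=100_000)   # synthetic "multimedia" sizes in bytes

      log_sizes = np.log(file_sizes)
      mu, sigma = log_sizes.mean(), log_sizes.std()
      print(f"log-normal fit: mu = {mu:.2f}, sigma = {sigma:.2f}")

      # For a power-law candidate, the survival function is roughly linear on log-log axes:
      sizes_sorted = np.sort(file_sizes)
      survival = 1.0 - np.arange(1, sizes_sorted.size + 1) / sizes_sorted.size
      # Plotting log(sizes_sorted) against log(survival) would show curvature for
      # log-normal data and an approximately straight tail for power-law data.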

  6. Logging Student Learning via a Puerto Rico-based Geologic Mapping Game on the Google Earth Virtual Globe

    NASA Astrophysics Data System (ADS)

    Gobert, J.; Toto, E.; Wild, S. C.; Dordevic, M. M.; De Paor, D. G.

    2013-12-01

    A hindrance to migrating undergraduate geoscience courses online is the challenge of giving students a quasi-authentic field experience. As part of an NSF TUES Type 2 project (# NSF-DUE 1022755), we addressed this challenge by designing a Google Earth (GE) mapping game centered on Puerto Rico, a place we chose in order to connect with underrepresented minorities but also because its simple geologic divisions minimized map complexity. The game invites student groups to explore the island and draw a geological map with these divisions: Rugged Volcanic Terrain, Limestone Karst Topography, and Surficial Sands & Gravels. Students, represented as avatars via COLLADA models and the GE browser plugin, can move about, text fellow students, and click a 'drill here' button that tells them what lies underground. They need to learn to read the topography because the number of holes they can drill is limited to 30. Then, using the GE Polygon tool, they create a map, aided by a custom 'snapping' algorithm that stitches adjacent contacts, preventing gaps and overlaps, and they submit this map for evaluation by their instructor, an evaluation we purposefully did not automate. Initially we assigned students to groups of 4 and gave each group a field vehicle avatar with a designated driver; however, students hated the experience unless they were the designated driver, so we revised the game to allow all students to roam independently, though we retained the mutual texting feature amongst students in groups. We implemented the activity with undergraduates from a university in the southeastern USA. All student movements and actions on the GE terrain were logged. We wrote algorithms to evaluate student learning processes via log files, including, but not limited to, the number of places drilled and their locations. Pre-post gains were examined, as well as correlations between data from log files and pre-post data. There was a small but statistically significant post-pre gain, including a positive correlation between diagram-based post-test questions and: 1) the total number of drills; 2) the number of correct within-polygon identifications (evidently those who did more drilling inside polygons and drew boundaries accordingly learned more; drills mistakenly plotted outside formation polygons were negatively correlated with extra post-test questions, but this was not statistically significant, likely due to low statistical power because there were few students who did this); and 3) the average distance between drills (students whose drill holes were further apart learned more; this makes sense since more information can be gleaned this way, and it may also be indicative of a skilled learning strategy because there is little point in doing close or overlapping drills when the permitted number is small and the region is large). No significant correlation between pre-test score and diagram-based post-test questions was found; this suggests that prior knowledge is not accounting for the above correlations. Data will be discussed with respect to GE's utility to convey geoscience principles to geology undergraduates, as well as the affordances for analyzing students' log files in order to better understand their learning processes.

  7. SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousuf, A; Hussain, A

    2014-06-01

    Purpose: A robust quality assurance (QA) program for computer-controlled enhanced dynamic wedge (EDW) has been designed and tested. Calculations for this QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record and verify system generates dynamic log (dynalog) files during dynamic dose delivery. The system-generated dynalog files contain information such as date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected (calculated) segmented treatment tables (STT) and the actual delivered STT for the treatment delivery as a verification record. These files can be used to assess the integrity and precision of the treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment; the STT values can also be used to calculate the EDW factors. Independent verification of fractional MUs per segment was performed against those generated from dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against practical measurements. The measurements agreed with the calculated EDW data to within approximately 1%. The variation between the MUs per segment obtained from dynalog files and those calculated manually was found to be less than 2%. Conclusion: An efficient and easy tool to perform routine EDW QA is suggested. The method can be easily implemented in any institution without the need for expensive QA equipment. Errors of the order of 2% or greater can be easily detected.
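
    A minimal sketch of the per-segment monitor-unit calculation implied above, treating an STT as a table of cumulative dose fractions so that each segment's fractional MU is the difference of consecutive cumulative values times the total MU; the STT numbers below are hypothetical placeholders, not Varian's published tables:

      # Sketch: fractional MU per EDW segment from a segmented treatment table (STT).
      # The STT is assumed to list cumulative dose fractions at successive jaw
      # positions; the values below are hypothetical, not vendor data.
      stt_cumulative = [0.00, 0.12, 0.27, 0.45, 0.66, 0.85, 1.00]  # hypothetical
      total_mu = 200.0  # prescribed monitor units for the field

      segment_mu = [(b - a) * total_mu
                    for a, b in zip(stt_cumulative, stt_cumulative[1:])]

      for i, mu in enumerate(segment_mu, start=1):
          print(f"segment {i}: {mu:.1f} MU")

      # These per-segment MUs can then be compared against values reconstructed
      # from the delivered STT recorded in the dynalog file; differences above a
      # tolerance (e.g., 2%) would flag the delivery for review.
      assert abs(sum(segment_mu) - total_mu) < 1e-6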

  8. 75 FR 27051 - Privacy Act of 1974: System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-13

    ... address and appears below: DOT/FMCSA 004 SYSTEM NAME: National Consumer Complaint Database (NCCDB.... A system, database, and procedures for filing and logging consumer complaints relating to household... are stored in an automated system operated and maintained at the Volpe National Transportation Systems...

  9. 20 CFR 655.201 - Temporary labor certification applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...

  10. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the process of probing huge amounts of information in an attempt to uncover hidden patterns. Through big data analytics, public- and private-sector organizations have made a strategic decision to turn big data into competitive advantage. Extracting value from big data relies on a process that pulls information from multiple different sources, known as extract, transform, and load (ETL). The approach presented here extracts information from log files and research papers, reducing the effort required for pattern discovery and for summarizing documents drawn from several sources. The work also helps the reader better understand basic Hadoop concepts and improves the user experience for research. In this paper, we propose a Hadoop-based approach for analyzing log files to find concise, useful information in a time-saving way. The proposed approach is applied to research papers in a specific domain to produce summarized content for further refinement.
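
    As a minimal single-node illustration of the extract-and-summarize step described above, the sketch below tokenizes log lines and reports the most frequent terms; in a Hadoop setting the same map/reduce logic would be distributed across nodes. The input file name and log format here are hypothetical:

      # Sketch: a single-node stand-in for the map/reduce term-count step used to
      # summarize log files. The input path and log format are hypothetical.
      import re
      from collections import Counter

      def map_terms(line):
          """Map step: emit lowercase alphanumeric tokens from one log line."""
          return re.findall(r"[a-z0-9]+", line.lower())

      counts = Counter()
      with open("sample.log", encoding="utf-8", errors="replace") as fh:  # hypothetical file
          for line in fh:
              counts.update(map_terms(line))   # reduce step: sum counts per term

      for term, n in counts.most_common(10):
          print(f"{term}\t{n}")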

  11. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine, M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05– 075, 2013. [21] C. Bonine, “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS

  12. Development of a Methodology for Customizing Insider Threat Auditing on a Linux Operating System

    DTIC Science & Technology

    2010-03-01

    information /etc/group, passwd ,gshadow,shadow,/security/opasswd 16 User A attempts to access User B directory 17 User A attempts to access User B file w/o...configuration Handled by audit rules for root actions Audit user write attempts to system files -w /etc/group –p wxa -w /etc/ passwd –p wxa -w /etc/gshadow –p...information (/etc/group, /etc/ passwd , /etc/gshadow, /etc/shadow, /etc/sudoers, /etc/security/opasswd) Procedure: 1. User2 logs into the system

  13. Archive of digital chirp subbottom profile data collected during USGS cruise 12BIM03 offshore of the Chandeleur Islands, Louisiana, July 2012

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.

    2014-01-01

    From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. 
The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.
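
    A minimal sketch of reading the 3,200-byte card-image header noted above; because this archive stores it as ASCII rather than the usual EBCDIC, a plain byte read and decode suffices (the file name is a placeholder):

      # Sketch: read the 3,200-byte textual header of a SEG Y file. This archive
      # stores the header as ASCII (rather than EBCDIC), so a direct decode works.
      # The file name is a placeholder, not one of the archived trace files.
      SEGY_TEXT_HEADER_BYTES = 3200
      CARD_LENGTH = 80  # the header is 40 "card images" of 80 characters each

      with open("line01.sgy", "rb") as fh:            # hypothetical trace file
          raw = fh.read(SEGY_TEXT_HEADER_BYTES)

      text = raw.decode("ascii", errors="replace")
      cards = [text[i:i + CARD_LENGTH].rstrip() for i in range(0, len(text), CARD_LENGTH)]
      for card in cards:
          if card:
              print(card)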

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    HUBER, J.H.

    An Enraf Densitometer is installed on tank 241-AY-102. The Densitometer will frequently be tasked to obtain and log density profiles. This activity can be accomplished in a number of ways. Enraf Incorporated provides a software package called "Logger18" to its customers for the purpose of in-shop testing of their gauges. Logger18 is capable of accepting an input file which can direct the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS-based program which will require trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted the development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. This plan applies to the development and implementation of a one-time-use software program, which will be called "Enraf Control Panel." The software will be primarily used for remote operation of Enraf Densitometers for the purpose of obtaining and logging tank product density profiles.

  15. 75 FR 69644 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ..., organization, phone, fax, mobile, pager, Defense Switched Network (DSN) phone, other fax, other mobile, other.../Transport Layer Security (SSL/ TLS) connections, access control lists, file system permissions, intrusion detection and prevention systems and log monitoring. Complete access to all records is restricted to and...

  16. The RIACS Intelligent Auditing and Categorizing System

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1988-01-01

    The organization of the RIACS auditing package is described, along with installation instructions and guidance on interpreting the output. Instructions for setting up both local and remote file system auditing are given. Logging is done on a time-driven basis, and auditing is performed in a passive mode.

  17. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  18. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  19. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  20. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  1. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  2. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    Natural fracture data from wells 33-7, 33A-7,52A-7, 52B-7 and 83-11 at West Flank. Fracture orientations were determined from image logs of these wells (see accompanying submissions). Data files contain depth, apparent (in wellbore reference frame) and true (in geographic reference frame) azimuth and dip, respectively.

  4. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS) which we previously developed to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure, and the data in the log file is input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.

  5. VizieR Online Data Catalog: New atmospheric parameters of MILES cool stars (Sharma+, 2016)

    NASA Astrophysics Data System (ADS)

    Sharma, K.; Prugniel, P.; Singh, H. P.

    2015-11-01

    MILES V2 spectral interpolator: The FITS file is an improved version of the MILES interpolator previously presented in PVK. It contains the coefficients of the interpolator, which allow one to compute an interpolated spectrum given an effective temperature, surface gravity, and metallicity (Teff, log g, and [Fe/H]). The file consists of three extensions covering the three temperature regimes described in the paper: extension 0 (warm, 4000-9000 K), extension 1 (hot, >7000 K), and extension 2 (cold, <4550 K). The three functions are linearly interpolated in the overlapping Teff regions. Each extension contains a 2D image-type array whose first axis is the wavelength, described by a WCS (air wavelength, starting at 3536 Å, step = 0.9 Å). This FITS file can be used with ULySS v1.3 or higher. (5 data files).
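
    A minimal sketch of how such a file might be read, assuming a standard FITS layout in which the linear wavelength axis is encoded by the usual CRVAL1/CDELT1 keywords; the file name is a placeholder and the specific keyword names are an assumption about this product:

      # Sketch: read the three interpolator extensions and rebuild the wavelength
      # axis from the WCS keywords. File name and keyword usage are assumptions.
      import numpy as np
      from astropy.io import fits

      with fits.open("miles_v2_interpolator.fits") as hdul:   # hypothetical file name
          for name, idx in (("warm", 0), ("hot", 1), ("cold", 2)):
              hdu = hdul[idx]
              coeffs = hdu.data                               # 2D coefficient array
              hdr = hdu.header
              # linear wavelength WCS: lambda_i = CRVAL1 + i * CDELT1 (air wavelengths)
              wave = hdr["CRVAL1"] + np.arange(coeffs.shape[-1]) * hdr["CDELT1"]
              print(name, coeffs.shape, wave[0], wave[-1])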

  6. VizieR Online Data Catalog: Distances to RRab stars from WISE and Gaia (Sesar+, 2017)

    NASA Astrophysics Data System (ADS)

    Sesar, B.; Fouesneau, M.; Price-Whelan, A. M.; Bailer-Jones, C. A. L.; Gould, A.; Rix, H.-W.

    2017-10-01

    To constrain the period-luminosity-metallicity (PLZ) relations for RR Lyrae stars in WISE W1 and W2 bands, we use TGAS trigonometric parallaxes (barω), spectroscopic metallicities ([Fe/H]; Fernley+ 1998, J/A+A/330/515), log-periods (logP, base 10), and apparent magnitudes (m; Klein+ 2014, J/MNRAS/440/L96) for 102 RRab stars within ~2.5kpc from the Sun. The E(B-V) reddening at a star's position is obtained from the Schlegel+ (1998ApJ...500..525S) dust map. (1 data file).
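
    For context, a PLZ fit of this kind is conventionally written in the generic form below; the coefficients a, b, and zero point M_ref are left symbolic because the catalogue description does not quote them, while the second relation is the standard link between absolute magnitude, apparent magnitude, parallax, and band extinction:

      % Generic period-luminosity-metallicity relation in a WISE band
      M = a \log_{10} P + b\,[\mathrm{Fe/H}] + M_{\mathrm{ref}}

      % Distance modulus for a parallax \varpi in milliarcseconds, with band extinction A
      m - M = 5\log_{10}\!\left(\frac{1000}{\varpi}\right) - 5 + A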

  7. Wave-Ice Interaction and the Marginal Ice Zone

    DTIC Science & Technology

    2013-09-30

    concept, using a high-quality attitude and heading reference system ( AHRS ) together with an accurate twin-antennae GPS compass. The instruments logged...the AHRS parameters at 50Hz, together with GPS-derived fixes, heading (accurate to better than 1o) and velocities at 10Hz. The 30MB hourly files

  8. Log on to the Future: One School's Success Story.

    ERIC Educational Resources Information Center

    Hovenic, Ginger

    This paper describes Clear View Elementary School's (California) successful experience with integrating technology into the curriculum. Since its inception seven years ago, the school has acquired 250 computers, networked them all on two central file servers, and computerized the library and trained all staff members to be proficient facilitators…

  9. 40 CFR 146.14 - Information to be considered by the Director.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., logging procedures, deviation checks, and a drilling, testing, and coring program; and (16) A certificate... information listed below which are current and accurate in the file. For a newly drilled Class I well, the..., construction, date drilled, location, depth, record of plugging and/or completion, and any additional...

  10. All Aboard the Internet.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)

  11. A Query Analysis of Consumer Health Information Retrieval

    PubMed Central

    Hong, Yi; de la Cruz, Norberto; Barnas, Gary; Early, Eileen; Gillis, Rick

    2002-01-01

    The log files of the MCW HealthLink Web site were analyzed to study users' needs for consumer health information and to better understand the health topics users search for, the paths they usually take to find consumer health information, and ways to improve search effectiveness.

  12. The Internet and Technical Services: A Point Break Approach.

    ERIC Educational Resources Information Center

    McCombs, Gillian M.

    1994-01-01

    Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…

  13. 77 FR 35956 - Appalachian Power Company; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ...) screened intake structures; (3) a concrete powerhouse containing three turbine-generator units with a total... structures; (3) a concrete powerhouse containing three turbine-generator units with a total installed... by a log boom; (2) screened intake structures; (3) a concrete powerhouse containing three turbine...

  14. Library Web Proxy Use Survey Results.

    ERIC Educational Resources Information Center

    Murray, Peter E.

    2001-01-01

    Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)

  15. Archive of digital chirp subbottom profile data collected during USGS cruises 13BIM02 and 13BIM07 offshore of the Chandeleur Islands, Louisiana, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Flocks, James G.; Bernier, Julie C.; Wiese, Dana S.

    2014-01-01

    On July 5–19 (cruise 13BIM02) and August 22–September 1 (cruise 13BIM07), 2013, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on barrier island evolution and medium-term and interannual sediment transport along the oil spill mitigation sand berm constructed at the north end and offshore of the Chandeleur Islands, Louisiana. This investigation is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided. Refer to the Abbreviations page for explanations of acronyms and abbreviations used in this report.

  16. Archive of digital Chirp subbottom profile data collected during USGS cruises 09CCT03 and 09CCT04, Mississippi and Alabama Gulf Islands, June and July 2009

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2011-01-01

    In June and July of 2009, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Cat Island, Mississippi, to Dauphin Island, Alabama, as part of a broader USGS study on Coastal Change and Transport (CCT). The surveys were funded through the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php). This report serves as an archive of unprocessed digital Chirp seismic profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Single-beam and Swath bathymetry data were also collected during these cruises and will be published as a separate archive. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  17. Coastal single-beam bathymetry data collected in 2015 from Raccoon Point to Point Au Fer Island, Louisiana

    USGS Publications Warehouse

    Stalk, Chelsea A.; DeWitt, Nancy T.; Kindinger, Jack L.; Flocks, James G.; Reynolds, Billy J.; Kelso, Kyle W.; Fredericks, Joseph J.; Tuten, Thomas M.

    2017-03-10

    As part of the Barrier Island Comprehensive Monitoring Program (BICM), scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore single-beam bathymetry survey along the south-central coast of Louisiana, from Raccoon Point to Point Au Fer Island, in July 2015. The goal of the BICM program is to provide long-term data on Louisiana’s coastline and use this data to plan, design, evaluate, and maintain current and future barrier island restoration projects. The data described in this report will provide baseline bathymetric information for future research investigating island evolution, sediment transport, and recent and long-term geomorphic change, and will support modeling of future changes in response to restoration and storm impacts. The survey area encompasses more than 300 square kilometers of nearshore environment from Raccoon Point to Point Au Fer Island. This data series serves as an archive of processed single-beam bathymetry data, collected from July 22–29, 2015, under USGS Field Activity Number 2015-320-FA. Geographic information system data products include a 200-meter-cell-size interpolated bathymetry grid, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  18. Chirp subbottom profile data collected in 2015 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Forde, Arnell S.; DeWitt, Nancy T.; Fredericks, Jake J.; Miselis, Jennifer L.

    2018-01-30

    As part of the Barrier Island Evolution Research project, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore geophysical survey around the northern Chandeleur Islands, Louisiana, in September 2015. The objective of the project is to improve the understanding of barrier island geomorphic evolution, particularly storm-related depositional and erosional processes that shape the islands over annual to interannual time scales (1–5 years). Collecting geophysical data can help researchers identify relations between the geologic history of the islands and their present day morphology and sediment distribution. High-resolution geophysical data collected along this rapidly changing barrier island system can provide a unique time-series dataset to further the analyses and geomorphological interpretations of this and other coastal systems, improving our understanding of coastal response and evolution over medium-term time scales (months to years). Subbottom profile data were collected in September 2015 offshore of the northern Chandeleur Islands, during USGS Field Activity Number 2015-331-FA. Data products, including raw digital chirp subbottom data, processed subbottom profile images, survey trackline map, navigation files, geographic information system data files and formal Federal Geographic Data Committee metadata, and Field Activity Collection System and operation logs are available for download.

  19. The Added Value of Log File Analyses of the Use of a Personal Health Record for Patients With Type 2 Diabetes Mellitus

    PubMed Central

    Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.

    2014-01-01

    The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided only little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), to increase the efficiency of the system and improve the long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users and the long-term adherence has the potential to increase. PMID:24876574

  20. PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.

    PubMed

    Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza

    2014-12-01

    The PDB file format is a text format describing the three-dimensional structures of macromolecules held in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions, such as nucleic acids, water, ions, and drug molecules, which can likewise be described in the PDB format and deposited in the PDB database. A PDB file is machine generated and not easily human readable, so a computational tool is needed to interpret it. The objective of the present study is to develop free online software for retrieval, visualization, and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, that is, to convert the information in the PDB file into readable sentences. The tool displays all available information from a PDB file, including its 3D structure. Programming and scripting languages including Perl, CSS, JavaScript, Ajax, and HTML were used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, and so on. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no log-in required.
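
    A minimal sketch of the kind of parsing such a tool performs, using the fixed-column layout of PDB ATOM/HETATM records to pull out a few standard fields; the input file name is a placeholder:

      # Sketch: extract atom name, residue, chain, and coordinates from ATOM/HETATM
      # records of a PDB file using the format's fixed column positions.
      def parse_pdb_atoms(path):
          atoms = []
          with open(path) as fh:
              for line in fh:
                  if line.startswith(("ATOM", "HETATM")):
                      atoms.append({
                          "name": line[12:16].strip(),
                          "resName": line[17:20].strip(),
                          "chain": line[21].strip(),
                          "resSeq": int(line[22:26]),
                          "x": float(line[30:38]),
                          "y": float(line[38:46]),
                          "z": float(line[46:54]),
                      })
          return atoms

      atoms = parse_pdb_atoms("1abc.pdb")   # hypothetical input file
      print(f"{len(atoms)} atoms; first: {atoms[0] if atoms else None}")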

  1. Fort Bliss Geothermal Area Data: Temperature profile, logs, schematic model and cross section

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This dataset contains a variety of data about the Fort Bliss geothermal area, part of the southern Tularosa Basin, New Mexico. It includes schematic models of the McGregor Geothermal System and a shallow (2-meter depth) temperature survey of the Fort Bliss geothermal area with 63 data points. The dataset also contains Century OH logs, a full temperature profile, and complete logs from well RMI 56-5, including resistivity and porosity data; drill logs with drill rate, depth, lithology, mineralogy, fractures, temperature, pit total, gases, and descriptions, among other measurements; and CDL, CNL, DIL, GR Caliper, and Temperature files. Two cross sections through the Fort Bliss area show well position and depth, and a surface map shows faults and the spatial distribution of wells, along with inferred and observed fault distributions from gravity surveys around the Fort Bliss geothermal area.

  2. Stratigraphic framework of Cambrian and Ordovician rocks in the central Appalachian basin from Medina County, Ohio, through southwestern and south-central Pennsylvania to Hampshire County, West Virginia: Chapter E.2.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.

  3. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow from 1.0 to 7.0 and for log Kow greater than 7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method fits the existing data significantly better than other methods.
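
    A minimal sketch of the classify-then-regress logic described above; the regression coefficients, log Kow break points, and correction values below are hypothetical placeholders, not the fitted values from the paper:

      # Sketch: estimate log BCF from log Kow following the class-based scheme
      # described in the abstract. All coefficients and thresholds are hypothetical
      # placeholders, not the published regression values.
      IONIC_CLASSES = {"carboxylic_acid", "sulfonic_acid_or_salt", "quaternary_N"}

      def estimate_log_bcf(log_kow, chem_class="nonionic", correction=0.0):
          if chem_class in IONIC_CLASSES:
              # ionics: assign a log BCF in the 0.5-1.75 range by log Kow band
              if log_kow < 5.0:
                  return 0.5
              elif log_kow < 7.0:
                  return 1.0
              return 1.75
          # nonionics: separate linear fits below and above log Kow = 7.0
          if log_kow <= 7.0:
              return 0.8 * log_kow - 0.5 + correction   # hypothetical slope/intercept
          return -1.0 * log_kow + 12.0 + correction     # hypothetical high-Kow branch

      print(estimate_log_bcf(4.5))                      # nonionic example
      print(estimate_log_bcf(6.2, "carboxylic_acid"))   # ionic example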

  4. Production, prices, employment, and trade in Northwest forest industries, third quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries: international trade in logs, lumber, and plywood: volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  5. Production, prices, employment, and trade in Northwest forest industries, all quarters 2000.

    Treesearch

    Debra D. Warren

    2002-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  6. Production, prices, employment, and trade in Northwest forest industries, all quarters 2002.

    Treesearch

    Debra D. Warren

    2004-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  7. Production, prices, employment, and trade in Northwest forest industries, all quarters 2005.

    Treesearch

    Debra D. Warren

    2007-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  8. Production, prices, employment, and trade in Northwest forest industries, all quarters 2006.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  9. Production, prices, employment, and trade in Northwest forest industries, all quarters 2004.

    Treesearch

    Debra D. Warren

    2006-01-01

    Provides current information on lumber and plywood production and prices; employment in forest industries; international trade in logs, lumber, and plywood; volumes and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  10. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  11. Using Learning Styles and Viewing Styles in Streaming Video

    ERIC Educational Resources Information Center

    de Boer, Jelle; Kommers, Piet A. M.; de Brock, Bert

    2011-01-01

    Improving the effectiveness of learning when students observe video lectures becomes urgent with the rising advent of (web-based) video materials. Vital questions are how students differ in their learning preferences and what patterns in viewing video can be detected in log files. Our experiments inventory students' viewing patterns while watching…

  12. Recommendations for Benchmarking Web Site Usage among Academic Libraries.

    ERIC Educational Resources Information Center

    Hightower, Christy; Sih, Julie; Tilghman, Adam

    1998-01-01

    To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…

  13. Motivational Aspects of Learning Genetics with Interactive Multimedia

    ERIC Educational Resources Information Center

    Tsui, Chi-Yan; Treagust, David F.

    2004-01-01

    A BioLogica trial in six U.S. schools, using an interpretive approach, was conducted by the Concord Consortium to examine student motivation in learning genetics. Multiple data sources, such as online tests, computer log files, and classroom observation, were used, and results are reported in terms of interviewees' perceptions, class-wide online…

  14. 16. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-5/8 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, NORTHEAST CORNER, INTERPRETIVE LOG TO LEFT. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  15. 20 CFR 658.410 - Establishment of State agency JS complaint system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... system. At the local office level, the local office manager shall be responsible for the management of... related), the local office manager shall transmit a copy of that portion of the log containing the... established for the handling of complaints and files relating to the handling of complaints. The Manager or...

  16. 76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...

  17. Production, prices, employment, and trade in Northwest forest industries, all quarters 1998.

    Treesearch

    Debra D. Warren

    2000-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  18. Production, prices, employment, and trade in Northwest forest industries, fourth quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  19. Production, prices, employment, and trade in Northwest forest industries, all quarters of 2007.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  20. Production, prices, employment, and trade in Northwest forest industries, all quarters 2003.

    Treesearch

    Debra D. Warren

    2005-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  1. Production, prices, employment, and trade in Northwest forest industries, all quarters 2008

    Treesearch

    Debra Warren

    2009-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  2. Data Retention Policy | High-Performance Computing | NREL

    Science.gov Websites

    HPC Data Retention Policy. File storage areas on Peregrine and Gyrfalcon are either user-centric to reclaim storage. We can make special arrangements for permanent storage, if needed. User-Centric > is 3 months after the last project ends. During this retention period, the user may log in to

  3. Elementary School Students' Strategic Learning: Does Task-Type Matter?

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.

    2014-01-01

    This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it was investigated which and when learning patterns actually emerge with respect to students' task solutions. The present study uses computer log file traces to investigate how…

  4. Patterns in Elementary School Students' Strategic Actions in Varying Learning Situations

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvenoja, Hanna; Järvelä, Sanna

    2013-01-01

    This study uses log file traces to examine differences between high-and low-achieving students' strategic actions in varying learning situations. In addition, this study illustrates, in detail, what strategic and self-regulated learning constitutes in practice. The study investigates the learning patterns that emerge in learning situations…

  5. 78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...

  6. TraceContract

    NASA Technical Reports Server (NTRS)

    Kavelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events and can, for example, be generated by a running program instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is particularly convenient for defining such internal DSLs due to a number of language characteristics. These include Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
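
    A conceptual sketch of the data-parameterized trace-checking idea (this is not TraceContract's Scala API): each event carries data, and the monitor tracks per-datum state to flag violations of a simple "every open is eventually closed" property. The event names and trace are hypothetical:

      # Sketch: a tiny trace monitor in the spirit of data-parameterized state
      # machines (conceptual only; not TraceContract's actual Scala DSL).
      # Property: every ("open", resource) must eventually be followed by a
      # ("close", resource) for the same resource, and closes must match opens.
      def check_open_close(trace):
          open_resources = set()
          violations = []
          for i, (kind, res) in enumerate(trace):
              if kind == "open":
                  open_resources.add(res)
              elif kind == "close":
                  if res not in open_resources:
                      violations.append(f"event {i}: close of {res!r} without open")
                  open_resources.discard(res)
          violations.extend(f"end of trace: {res!r} never closed"
                            for res in sorted(open_resources))
          return violations

      trace = [("open", "f1"), ("open", "f2"), ("close", "f1")]   # hypothetical log-derived trace
      for v in check_open_close(trace):
          print("VIOLATION:", v)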

  7. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Both the binary SCMF result and the RML input file can then be retrieved simply by specifying the hash to a RESTful web interface. This interface enables command line tools as well as large, sophisticated programs to download the SCMF and RMLs on demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline, where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
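
    A minimal sketch of the hash-indexed storage idea: each compiled sequence is stored keyed by a content hash, which can later be recovered from an event record and used to fetch the original files. The schema, hash choice, and file contents below are illustrative assumptions, not the MSL ground system's actual design:

      # Sketch: store compiled sequences keyed by a content hash and retrieve them
      # later from the hash echoed in an event record. Schema and names are
      # illustrative assumptions only.
      import hashlib
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE sequences (hash TEXT PRIMARY KEY, rml TEXT, scmf BLOB)")

      def store(rml_text, scmf_bytes):
          h = hashlib.sha256(scmf_bytes).hexdigest()[:16]   # hash echoed in event records
          db.execute("INSERT OR REPLACE INTO sequences VALUES (?, ?, ?)",
                     (h, rml_text, scmf_bytes))
          return h

      def fetch(h):
          return db.execute("SELECT rml, scmf FROM sequences WHERE hash = ?",
                            (h,)).fetchone()   # (rml_text, scmf_bytes) or None

      h = store("<sequence name='drive'>...</sequence>", b"\x00\x01binary-scmf")
      rml, scmf = fetch(h)
      print(h, len(scmf), "bytes;", rml[:30])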

  8. VizieR Online Data Catalog: Bessel (1825) calculation for geodesic measurements (Karney+, 2010)

    NASA Astrophysics Data System (ADS)

    Karney, C. F. F.; Deakin, R. E.

    2010-06-01

    The solution of the geodesic problem for an oblate ellipsoid is developed in terms of series. Tables are provided to simplify the computation. Included here are the tables that accompanied Bessel's paper (with corrections). The tables were crafted by Bessel to minimize the labor of hand calculation. To this end, he adjusted the intervals in the tables, the number of terms included in the series, and the number of significant digits given so that the final results are accurate to about 8 places. For that reason, the most useful form of the tables is the PDF file, which presents the tables in a layout close to the original. Also provided is the LaTeX source file for the PDF file. Finally, the data have been put into a format that can be read easily by computer programs. All the logarithms are in base 10 (common logarithms). The characteristic and the mantissa should be read separately (indicated as x.c and x.m in the file description). Thus the first entry in the table, -4.4, should be parsed as "-4" (the characteristic) and ".4" (the mantissa); the anti-log for this entry is 10^(-4+0.4) = 2.5e-4. The "Delta" columns give the first difference of the preceding column, i.e., the difference between the preceding column in the next row and the preceding column in the current row. In the printed tables these are expressed as "units in the last place" and the differences are of the rounded representations in the preceding columns (to minimize interpolation errors). In table1.dat these are given scaled to match the format used for the preceding column, as indicated by the units given for these columns. The unit log(") (in the description within square brackets [arcsec]) means the logarithm of a quantity expressed in arcseconds. (3 data files).
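
    A minimal sketch of recombining the separately stored characteristic and mantissa when reading the machine-readable table, using the worked example from the description above:

      # Sketch: recombine a base-10 logarithm stored as separate characteristic and
      # mantissa fields and recover the tabulated quantity. The values (-4 and .4)
      # are the worked example from the description above.
      def antilog10(characteristic: int, mantissa: float) -> float:
          """Tabulated value = 10**(characteristic + mantissa)."""
          return 10.0 ** (characteristic + mantissa)

      value = antilog10(-4, 0.4)
      print(f"{value:.3e}")   # about 2.5e-04, matching the example in the text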

  9. Grid-wide neuroimaging data federation in the context of the NeuroLOG project

    PubMed Central

    Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem

    2010-01-01

    Grid technologies are appealing for dealing with the challenges raised by computational neurosciences and for supporting multi-centric brain studies. However, core grid middleware hardly copes with the complex neuroimaging data representations and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot simply be superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata, and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them, and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The hyperterminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the hyperterminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square (RMS) errors and cumulative MLC travel distance per motor. An in-house MATLAB code was used to analyze all dynalog files, record RMS errors, and calculate the distance each MLC leaf traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 involved an MLC leaf motor change, 39 involved T-nut replacements, and 84 were MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 recorded 73% of all interventions, while the same leaves in the B bank accounted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500 m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: An MLC dynalog file analysis software tool was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC leaf travels, the RMS, and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and, with proper and timely intervention, reduce the downtime of the linac during clinical hours.
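
    A minimal sketch of the two dynalog-derived quantities mentioned above, per-leaf RMS position error and cumulative travel distance, computed from hypothetical planned/actual leaf position arrays (the real dynalog file format is not reproduced here):

      # Sketch: per-leaf RMS error and cumulative travel from dynalog-like samples.
      # The arrays below are hypothetical; a real dynalog parser would populate them
      # from the planned and actual leaf positions recorded at each control sample.
      import numpy as np

      # shape (n_samples, n_leaves): planned and actual positions in cm
      planned = np.array([[1.0, 2.0], [1.2, 2.1], [1.5, 2.4], [1.9, 2.8]])
      actual  = np.array([[1.0, 2.0], [1.1, 2.1], [1.6, 2.5], [1.9, 2.7]])

      rms_error = np.sqrt(np.mean((planned - actual) ** 2, axis=0))   # per leaf
      travel = np.sum(np.abs(np.diff(actual, axis=0)), axis=0)        # per leaf

      for leaf, (rms, dist) in enumerate(zip(rms_error, travel), start=1):
          print(f"leaf {leaf}: RMS error {rms:.3f} cm, travel {dist:.2f} cm")
      # Accumulating 'travel' per motor across treatments gives the distance-per-motor
      # statistic used above to anticipate motor replacement.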

  11. Comparison of fracture and deformation in the rotary endodontic instruments: Protaper versus K-3 system.

    PubMed

    Nagi, Sana Ehsen; Khan, Farhan Raza; Rahman, Munawar

    2016-03-01

    This experimental study was done on extracted human teeth to compare the fracture and deformation of two rotary endodontic file systems, namely K-3 and Protaper. It was conducted at the dental clinics of the Aga Khan University Hospital, Karachi. A log of file deformation or fracture during root canal preparation was kept. The location of fracture was noted along with the identity of the canal in which the fracture took place. Fracture in the two rotary systems was compared, and SPSS 20 was used for data analysis. Of the 172 (80.4%) teeth possessing more than 15 degrees of curvature, fracture occurred in 7 (4.1%) cases and deformation in 10 (5.8%). Of the 42 (19.6%) teeth possessing less than 15 degrees of curvature, fracture occurred in none, while deformation was seen in 1 (2.4%). There was no difference between K-3 and Protaper files with respect to file deformation and fracture. Most of the fractures occurred in mesiobuccal canals of maxillary molars, n=3 (21.4%). The likelihood of file fracture increased 5.65-fold when the same file was used more than 3 times. Irrespective of the rotary system, the apical third of the root canal space was the most common site for file fracture.

  12. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-04-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  14. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy specific and overall systems parameters. A total of 36 system parameters were monitored which include RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is Supported by Varian Medical Systems, Inc.
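
    A minimal sketch of the I/MR control limits used above, for a single logged parameter, assuming the standard n=2 SPC constants (2.66 and 3.267); the hybrid specification-based limits described in the abstract are not reproduced here.

```python
# Individual/Moving-Range (I/MR) chart limits for one logged accelerator
# parameter, using the standard SPC constants for moving ranges of size 2.
def imr_limits(values):
    mean = sum(values) / len(values)
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    i_limits = (mean - 2.66 * mr_bar, mean + 2.66 * mr_bar)  # Individuals chart
    mr_ucl = 3.267 * mr_bar                                  # Moving-range chart
    return i_limits, mr_ucl

def out_of_control(values):
    (lcl, ucl), mr_ucl = imr_limits(values)
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    i_flags = [i for i, v in enumerate(values) if not lcl <= v <= ucl]
    mr_flags = [i + 1 for i, mr in enumerate(mrs) if mr > mr_ucl]
    return i_flags, mr_flags
```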

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijesooriya, K; Seitter, K; Desai, V

    Purpose: To present our single institution experience on catching errors with trajectory log file analysis. The reported causes of failures, probability of occurrences (O), severity of effects (S), and the probability of the failures to be undetected (D) could be added to guidelines of FMEA analysis. Methods: From March 2013 to March 2014, 19569 patient treatment fields/arcs were analyzed. This work includes checking all 131 treatment delivery parameters for all patients, all treatment sites and all treatment delivery fractions. TrueBeam trajectory log files for all treatment field types as well as all imaging types were accessed, read every 20 ms, and every control point (a total of 37 million parameters) compared to the physician-approved plan in the planning system. Results: Couch angle outlier occurrence: N = 327, range = −1.7 −1.2 deg; gantry angle outlier occurrence: N = 59, range = 0.09 – 5.61 deg; collimator angle outlier occurrence: N = 13, range = −0.2 – 0.2 deg. VMAT cases have slightly larger variations in mechanical parameters. MLC: 3D single control point fields have a maximum deviation of 0.04 mm, 39 step and shoot IMRT cases have MLC −0.3 – 0.5 mm deviations, all (1286) VMAT cases have −0.9 – 0.7 mm deviations. Two possible serious errors were found: 1) A 4 cm isocenter shift for the PA beam of an AP-PA pair, under-dosing a portion of the PTV by 25%. 2) Delivery with MLC leaves abutted behind the jaws as opposed to the midline as planned, leading to an under-dosing of a small volume of the PTV by 25%, by just the boost plan. Due to their error origin, neither of these errors could have been detected by pre-treatment verification. Conclusion: Performing trajectory log file analysis could catch typically undetected errors and avoid potentially adverse incidents.
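
    A hedged sketch of the per-control-point comparison described above: each logged axis value is checked against the planned value with an axis-specific tolerance. The field names and tolerance values are illustrative assumptions, not the TrueBeam trajectory-log schema.

```python
# Compare logged delivery parameters to the planned values, control point by
# control point, and collect any deviations beyond per-axis tolerances.
TOLERANCES = {"gantry_deg": 1.0, "collimator_deg": 1.0,
              "couch_deg": 1.0, "mlc_mm": 0.5}

def compare_to_plan(logged, planned, tolerances=TOLERANCES):
    """logged/planned: lists of dicts keyed by axis name, one per control point."""
    outliers = []
    for cp, (log_cp, plan_cp) in enumerate(zip(logged, planned)):
        for axis, tol in tolerances.items():
            delta = log_cp[axis] - plan_cp[axis]
            if abs(delta) > tol:
                outliers.append((cp, axis, delta))
    return outliers
```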

  16. Activity Catalog Tool (ACT) user manual, version 2.0

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.; Andre, Anthony D.

    1994-01-01

    This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.
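
    The kind of frequency/duration summary ACT produces can be sketched as follows; the (code, start, end) tuple layout is an assumption for illustration, not the ACT file format.

```python
# Summarize an observed activity sequence: count and total/mean duration per
# activity code from (code, start_seconds, end_seconds) records.
from collections import defaultdict

def summarize(events):
    """events: iterable of (code, start_seconds, end_seconds)."""
    stats = defaultdict(lambda: {"count": 0, "total": 0.0})
    for code, start, end in events:
        stats[code]["count"] += 1
        stats[code]["total"] += end - start
    for s in stats.values():
        s["mean"] = s["total"] / s["count"]
    return dict(stats)
```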

  17. The medium is NOT the message or Indefinitely long-term file storage at Leeds University

    NASA Technical Reports Server (NTRS)

    Holdsworth, David

    1996-01-01

    Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.

  18. Efficacy of 3D conforming nickel titanium rotary instruments in eliminating canal wall bacteria from oval-shaped root canals.

    PubMed

    Bortoluzzi, Eduardo A; Carlon, Daniel; Meghil, Mohamed M; El-Awady, Ahmed R; Niu, Lina; Bergeron, Brian E; Susin, Lisiane; Cutler, Christopher W; Pashley, David H; Tay, Franklin R

    2015-05-01

    To evaluate the effectiveness of TRUShape® 3D Conforming Files, compared with Twisted Files, in reducing bacteria load from root canal walls, in the presence or absence of irrigant agitation. Extracted human premolars with single oval-shaped canals were infected with Enterococcus faecalis. Teeth in Group I (N=10; NaOCl and QMix® 2in1 as respective initial and final irrigants) were subdivided into 4 subgroups: (A) TRUShape® instrumentation without irrigant activation; (B) TRUShape® instrumentation with sonic irrigant agitation; (C) Twisted Files without irrigant agitation; (D) Twisted Files with sonic irrigant agitation. To remove confounding factor (antimicrobial irrigants), teeth in Group II (N=10) were irrigated with sterile saline, using the same subgroup designations. Specimens before and after chemomechanical débridement were cultured for quantification of colony-forming units (CFUs). Data from each group were analyzed separately using two-factor ANOVA and Holm-Sidak multiple comparison (α=0.05). Canal wall bacteria were qualitatively examined using scanning electron microscopy (SEM) and light microscopy of Taylor-modified Brown and Brenn-stained demineralised sections. CFUs from subgroups in Group I were not significantly different (P=0.935). For Group II, both file type (P<0.001) and irrigant agitation (P<0.001) significantly affected log-reduction in CFU concentrations. The interaction of these two factors was not significant (P=0.601). Although SEM showed reduced canal wall bacteria, bacteria were present within dentinal tubules after rotary instrumentation, as revealed by light microscopy of longitudinal root sections. TRUShape® files removed significantly more canal wall bacteria than Twisted Files when used without an antibacterial irrigant; the latter is required to decontaminate dentinal tubules. Root canal disinfection should not be focused only on a mechanistic approach. Rather, the rational choice of a rotary instrumentation system should be combined with the use of well-tested antimicrobial irrigants and delivery/agitation techniques to establish a clinically realistic chemomechanical débridement protocol. Published by Elsevier Ltd.

  19. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    The Product Archive System (PAS), a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of up-to-date methods and technologies, such as a suitable data transmittal mode, flexible configuration files and log information, to give the system several desirable characteristics, including ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (Network Communicator module, File Collector module, File Copy module, Task Collector module, Metadata Extractor module, Product data Archive module, Metadata catalogue import module) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  20. Self-Regulation during E-Learning: Using Behavioural Evidence from Navigation Log Files

    ERIC Educational Resources Information Center

    Jeske, D.; Backhaus, J.; Stamov Roßnagel, C.

    2014-01-01

    The current paper examined the relationship between perceived characteristics of the learning environment in an e-module in relation to test performance among a group of e-learners. Using structural equation modelling, the relationship between these variables is further explored in terms of the proposed double mediation as outlined by Ning and…

  1. Microanalytic Case studies of Individual Participation Patterns in an Asynchronous Online Discussion in an Undergraduate Blended Course

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Perera, Nishan; Hsiao, Ying-Ting; Speer, Jennifer; Marbouti, Farshid

    2012-01-01

    This study presents three case studies of students' participation patterns in an online discussion to address the gap in our current understanding of how "individuals" experience asynchronous learning environments. Cases were constructed via microanalysis of log-file data, post contents, and the evolving discussion structure. The first student was…

  2. Query Classification and Study of University Students' Search Trends

    ERIC Educational Resources Information Center

    Maabreh, Majdi A.; Al-Kabi, Mohammed N.; Alsmadi, Izzat M.

    2012-01-01

    Purpose: This study is an attempt to develop an automatic identification method for Arabic web queries and divide them into several query types using data mining. In addition, it seeks to evaluate the impact of the academic environment on using the internet. Design/methodology/approach: The web log files were collected from one of the higher…

  3. VizieR Online Data Catalog: GAMA. Stellar mass budget (Moffett+, 2016)

    NASA Astrophysics Data System (ADS)

    Moffett, A. J.; Lange, R.; Driver, S. P.; Robotham, A. S. G.; Kelvin, L. S.; Alpaslan, M.; Andrews, S. K.; Bland-Hawthorn, J.; Brough, S.; Cluver, M. E.; Colless, M.; Davies, L. J. M.; Holwerda, B. W.; Hopkins, A. M.; Kafle, P. R.; Liske, J.; Meyer, M.

    2018-04-01

    Using the recently expanded Galaxy and Mass Assembly (GAMA) survey phase II visual morphology sample and the large-scale bulge and disc decomposition analysis of Lange et al. (2016MNRAS.462.1470L), we derive new stellar mass function fits to galaxy spheroid and disc populations down to log(M*/M☉)=8. (1 data file).

  4. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  5. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  6. 76 FR 54835 - Child Labor Regulations, Orders and Statements of Interpretation; Child Labor Violations-Civil...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ....m. in your local time zone, or log onto the Wage and Hour Division's Web site for a nationwide... INFORMATION: I. Electronic Access and Filing Comments Public Participation: This notice of proposed rulemaking is available through the Federal Register and the http://www.regulations.gov Web site. You may also...

  7. Capabilities Report 2012, West Desert Test Center

    DTIC Science & Technology

    2012-03-12

    132 FT- IR Spectrometer...electronic system files, paper logs, production batch records, QA/QC data, and PCR data generated during a test. Data analysts also track and QC raw data...Advantage +SL bench-top freeze dryers achieve shelf temperatures as low as -57°C and condenser temperatures to -67°C. The bulk milling facility produces

  8. 15. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-1/2 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, LOOKING SOUTHWEST, SHOWING INTERPRETIVE LOG AND PROTECTION ASSISTANT'S HOUSE IN BACKGROUND. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  9. Negotiating the Context of Online In-Service Training: "Expert" and "Non-Expert" Footings

    ERIC Educational Resources Information Center

    Nilsen, Mona

    2010-01-01

    This paper focuses on how people working in the Swedish food production industry engage in in-service training by means of computer-mediated communication. The empirical material consists of archived chat log files from a course concerning quality assurance and food safety hazards control in the preparation and handling of foodstuff. Drawing on…

  10. Learner Characteristics Predict Performance and Confidence in E-Learning: An Analysis of User Behavior and Self-Evaluation

    ERIC Educational Resources Information Center

    Jeske, Debora; Roßnagell, Christian Stamov; Backhaus, Joy

    2014-01-01

    We examined the role of learner characteristics as predictors of four aspects of e-learning performance, including knowledge test performance, learning confidence, learning efficiency, and navigational effectiveness. We used both self reports and log file records to compute the relevant statistics. Regression analyses showed that both need for…

  11. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  12. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  13. Making Sense of Students' Actions in an Open-Ended Virtual Laboratory Environment

    ERIC Educational Resources Information Center

    Gal, Ya'akov; Uzan, Oriel; Belford, Robert; Karabinos, Michael; Yaron, David

    2015-01-01

    A process for analyzing log files collected from open-ended learning environments is developed and tested on a virtual lab problem involving reaction stoichiometry. The process utilizes a set of visualization tools that, by grouping student actions in a hierarchical manner, helps experts make sense of the linear list of student actions recorded in…

  14. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
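
    A hedged sketch of the data-notice handshake described above: for each observatory data file a notice with start/stop times and a checksum is generated, and the matching receipt decides whether a retransmission is needed. The JSON notice/receipt formats and field names here are invented for illustration; the actual MOC-PPS protocol is mission-specific.

```python
# Build a data notice for an observatory data file and inspect a receipt.
import hashlib
import json
from pathlib import Path

def make_data_notice(data_file, start_time, stop_time):
    # Checksum the file so the receiver can verify the transferred bytes.
    digest = hashlib.md5(Path(data_file).read_bytes()).hexdigest()
    return json.dumps({"file": Path(data_file).name,
                       "start": start_time, "stop": stop_time,
                       "checksum": digest})

def needs_retransmit(receipt_json):
    # Hypothetical receipt: anything other than "ingested" triggers a resend.
    receipt = json.loads(receipt_json)
    return receipt.get("status") != "ingested"
```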

  15. Logging utilization in Idaho: Current and past trends

    Treesearch

    Eric A. Simmons; Todd A. Morgan; Erik C. Berg; Stanley J. Zarnoch; Steven W. Hayes; Mike T. Thompson

    2014-01-01

    A study of commercial timber-harvesting activities in Idaho was conducted during 2008 and 2011 to characterize current tree utilization, logging operations, and changes from previous Idaho logging utilization studies. A two-stage simple random sampling design was used to select sites and felled trees for measurement within active logging sites. Thirty-three logging...

  16. VizieR Online Data Catalog: Stellar parameters of KIC planet-host stars (Bastien+, 2014)

    NASA Astrophysics Data System (ADS)

    Bastien, F. A.; Stassun, K. G.; Pepper, J.

    2017-07-01

    We draw our bright KOI sample from the NASA Exoplanet Archive (NEA; Akeson et al. 2013PASP..125..989A) accessed on 2014 January 7. We restrict the sample to stars with 6650 K>Teff>4500 K, the Teff range for which F8 is calibrated. We exclude 28 stars with overall range of photometric variability >10 ppt (parts per thousand), as phenomena in the light curves of such chromospherically active stars can boost the measured F8 and thus result in an erroneous F8-based log g. These excluded stars (10% of the sample) are cooler than average for the overall sample, as expected given their large variability. Our sample after applying these cuts contains 289 stars (407 KOIs). (1 data file).

  17. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

    This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
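
    A rough timing sketch contrasting two of the back-ends studied above: individual image files on the local file system versus an LMDB key-value store (using the third-party `lmdb` Python binding). Paths are placeholders, and the loop is far simpler than the Caffe data layers actually benchmarked.

```python
# Time raw byte reads from per-image files versus an LMDB environment.
import time
from pathlib import Path
import lmdb  # pip install lmdb

def time_file_reads(image_dir):
    start = time.perf_counter()
    n_bytes = sum(len(p.read_bytes()) for p in Path(image_dir).glob("*.png"))
    return n_bytes, time.perf_counter() - start

def time_lmdb_reads(lmdb_path):
    start = time.perf_counter()
    n_bytes = 0
    env = lmdb.open(lmdb_path, readonly=True, lock=False)
    try:
        with env.begin() as txn:
            for _, value in txn.cursor():   # iterate (key, value) records
                n_bytes += len(value)
    finally:
        env.close()
    return n_bytes, time.perf_counter() - start
```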

  18. Toward a Real-Time (Day) Dreamcatcher: Sensor-Free Detection of Mind Wandering during Online Reading

    ERIC Educational Resources Information Center

    Mills, Caitlin; D'Mello, Sidney

    2015-01-01

    This paper reports the results from a sensor-free detector of mind wandering during an online reading task. Features consisted of reading behaviors (e.g., reading time) and textual features (e.g., level of difficulty) extracted from self-paced reading log files. Supervised machine learning was applied to two datasets in order to predict if…

  19. Real-Time Population Health Detector

    DTIC Science & Technology

    2004-11-01

    military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA...via the sequence of one-step-ahead forecast errors from the Kalman recursions: e_t = y_t − H μ_{t|t−1}. The log-likelihood then follows by treating the... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files GDAIS received historic CHCS data from all
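
    As a generic illustration of the reconstructed expression e_t = y_t − H μ_{t|t−1} and of a log-likelihood built from such forecast errors (the prediction-error decomposition), the following scalar Kalman filter sketch uses assumed model parameters; it is not the detector described in the report.

```python
# Scalar Kalman filter: accumulate the Gaussian log-likelihood from the
# one-step-ahead forecast errors e_t = y_t - h * mu_{t|t-1}.
import math

def kalman_loglik(y, phi, q, h, r, mu0=0.0, p0=1.0):
    """State: mu_t = phi*mu_{t-1} + w (var q); observation: y_t = h*mu_t + v (var r)."""
    mu, p = mu0, p0
    loglik = 0.0
    for obs in y:
        mu_pred = phi * mu                 # mu_{t|t-1}
        p_pred = phi * phi * p + q
        e = obs - h * mu_pred              # one-step-ahead forecast error
        f = h * h * p_pred + r             # forecast error variance
        loglik += -0.5 * (math.log(2 * math.pi * f) + e * e / f)
        k = p_pred * h / f                 # Kalman gain
        mu = mu_pred + k * e
        p = (1 - k * h) * p_pred
    return loglik
```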

  20. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    ERIC Educational Resources Information Center

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  1. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  2. Sediment data collected in 2013 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Bernier, Julie C.; Flocks, James G.; Miselis, Jennifer L.; DeWitt, Nancy T.

    2014-01-01

    This data series serves as an archive of sediment data collected in July 2013 from the Chandeleur Islands sand berm and adjacent barrier-island environments. Data products include descriptive core logs, core photographs and x-radiographs, results of sediment grain-size analyses, sample location maps, and Geographic Information System data files with accompanying formal Federal Geographic Data Committee metadata.

  3. Well construction information, lithologic logs, water level data, and overview of research in Handcart Gulch, Colorado: an alpine watershed affected by metalliferous hydrothermal alteration

    USGS Publications Warehouse

    Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin

    2006-01-01

    Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.

  4. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). Version 3.5, Quick Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  6. Induction conductivity and natural gamma logs collected in 15 wells at Camp Stanley Storage Activity, Bexar County, Texas

    USGS Publications Warehouse

    Stanton, Gregory P.

    2005-01-01

    The U.S. Geological Survey, in cooperation with the Camp Stanley Storage Activity conducted electromagnetic induction conductivity and natural gamma logging of 15 selected wells on the Camp Stanley Storage Activity, located in northern Bexar County, Texas, during March 28–30, 2005. In late 2004, a helicopter electromagnetic survey was flown of the Camp Stanley Storage Activity as part of a U.S. Geological Survey project to better define subsurface geologic units, the structure, and the catchment area of the Trinity aquifer. The electromagnetic induction conductivity and natural gamma log data in this report were collected to constrain the calculation of resistivity depth sections and to provide subsurface controls for interpretation of the helicopter electromagnetic data collected for the Camp Stanley Storage Activity. Logs were recorded digitally while moving the probe in an upward direction to maintain proper depth control. Logging speed was no greater than 30 feet per minute. During logging, a repeat section of at least 100 feet was recorded to check repeatability of log responses. Several of the wells logged were completed with polyvinyl chloride casing that can be penetrated by electromagnetic induction fields and allows conductivity measurement. However, some wells were constructed with steel centralizers and stainless steel screen that caused spikes on both conductivity and resulting resistivity log curves. These responses are easily recognizable and appear at regular intervals on several logs.

  7. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
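
    A toy response-style rule checker in the spirit of the systems described above: each rule pairs a trigger event with an event that must eventually follow, and a parsed log (trace) is scanned for triggers whose expected follow-up never occurs. This is an illustrative reduction, not the RuleR or LogScope specification languages.

```python
# Check a trace of event names against simple "trigger -> eventual response" rules.
def check_trace(trace, rules):
    """trace: list of event names; rules: dict mapping trigger -> required later event."""
    violations = []
    for i, event in enumerate(trace):
        if event in rules and rules[event] not in trace[i + 1:]:
            violations.append((i, event, rules[event]))
    return violations

# Example: every 'open_file' must eventually be followed by 'close_file'.
# The second 'open_file' (index 2) is never closed, so it is reported.
print(check_trace(["open_file", "close_file", "open_file", "write"],
                  {"open_file": "close_file"}))
```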

  8. Sedimentologic characteristics of recent washover deposits from Assateague Island, Maryland

    USGS Publications Warehouse

    Bernier, Julie C.; Zaremba, Nicholas J.; Wheaton, Cathryn J.; Ellis, Alisha M.; Marot, Marci E.; Smith, Christopher G.

    2016-06-08

    This report describes sediment data collected using sand augers in active overwash zones on Assateague Island in Maryland. Samples were collected by the U.S. Geological Survey (USGS) during two surveys in March/April and October 2014 (USGS Field Activity Numbers [FAN] 2014-301-FA and 2014-322-FA, respectively). The physical characteristics (for example, sediment texture or bedding structure) of and spatial differences among these deposits will provide information about overwash processes and sediment transport from the sandy barrier-island reaches to the back-barrier environments. Metrics derived from these data, such as mean grain size or deposit thicknesses, can be used to ground-truth remote sensing and geophysical data and can also be incorporated into sediment transport models. Data products, including sample location tables, descriptive core logs, core photographs and x-radiographs, the results of sediment grain-size analyses, and Geographic Information System (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata can be downloaded from the Data Downloads page.

  9. Perceived Task-Difficulty Recognition from Log-File Information for the Use in Adaptive Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars

    2016-01-01

    Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…

  10. Scalable Trust of Next-Generation Management (STRONGMAN)

    DTIC Science & Technology

    2004-10-01

    remote logins might be policy controlled to allow only strongly encrypted IPSec tunnels to log in remotely, to access selected files, etc. The...and Angelos D. Keromytis. Drop-in Security for Distributed and Portable Computing Elements. Emerald Journal of Internet Research. Electronic...Security and Privacy, pp. 17-31, May 1999. [2] S. M. Bellovin. Distributed Firewalls. ; login : magazine, special issue on security, November 1999. [3] M

  11. 77 FR 66608 - New England Hydropower Company, LLC; Notice of Preliminary Permit Application Accepted for Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Spillway Dike with an 8-foot-long stop-log slot; (2) an existing 31-foot-long, 42-inch-diameter low level penstock; (3) an existing 0.13 acre impoundment with a normal maximum water surface elevation of 66.3 feet... transmission line connected to the NSTAR regional grid. The project would have an estimated average annual...

  12. 76 FR 7838 - Claverack Creek, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...

  13. Evaluation of an interactive case simulation system in dermatology and venereology for medical students

    PubMed Central

    Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona

    2006-01-01

    Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972

  14. 78 FR 44957 - Agency Information Collection Activities: BioWatch Filter Holder Log, Filter Holder Log DHS Form...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-25

    ... DEPARTMENT OF HOMELAND SECURITY Agency Information Collection Activities: BioWatch Filter Holder Log, Filter Holder Log DHS Form 9500 AGENCY: Office of Health Affairs, DHS. ACTION: 60-Day Notice and....: Daniel Yereb, [email protected] 703- 647-8052. SUPPLEMENTARY INFORMATION: Following collection, the filter...

  15. Disaster Radio for Communication of Vital Messages and Health-Related Information: Experiences From the Haiyan Typhoon, the Philippines.

    PubMed

    Hugelius, Karin; Gifford, Mervyn; Örtenwall, Per; Adolfsson, Annsofie

    2016-08-01

    Crisis communication is seen as an integrated and essential part of disaster management measures. After Typhoon Haiyan (Yolanda) in the Philippines 2013, radio was used to broadcast information to the affected community. The aim of this study was to describe how disaster radio was used to communicate vital messages and health-related information to the public in one affected region after Typhoon Haiyan. A mixed-methods analysis combining qualitative content analysis and descriptive statistics was used to analyze 2587 radio log files. Radio was used to give general information and to demonstrate the capability of officials to manage the situation, to encourage, to promote recovery and foster a sense of hope, and to give practical advice and encourage self-activity. The content and focus of the messages changed over time. Encouraging messages were the most frequently broadcast messages. Health-related messages were a minor part of all information broadcast and gaps in the broadcast over time were found. Disaster radio can serve as a transmitter of vital messages including health-related information and psychological support in disaster areas. The present study indicated the potential for increased use. The perception, impact, and use of disaster radio need to be further evaluated. (Disaster Med Public Health Preparedness. 2016;10:591-597).

  16. Adiposity and Age Explain Most of the Association between Physical Activity and Fitness in Physically Active Men

    PubMed Central

    Serrano-Sánchez, José A.; Delgado-Guerra, Safira; Olmedillas, Hugo; Guadalupe-Grau, Amelia; Arteaga-Ortiz, Rafael; Sanchis-Moysi, Joaquín; Dorado, Cecilia; Calbet, José A. L.

    2010-01-01

    Background To determine if there is an association between physical activity assessed by the short version of the International Physical Activity Questionnaire (IPAQ) and cardiorespiratory and muscular fitness. Methodology/Principal Findings One hundred and eighty-two young males (age range: 20–55 years) completed the short form of the IPAQ to assess physical activity. Body composition (dual-energy X-Ray absorptiometry), muscular fitness (static and dynamic muscle force and power, vertical jump height, running speed [30 m sprint], anaerobic capacity [300 m running test]) and cardiorespiratory fitness (estimated VO2max: 20 m shuttle run test) were also determined in all subjects. Activity-related energy expenditure of moderate and vigorous intensity (EEPAmoderate and EEPAvigorous, respectively) was inversely associated with indices of adiposity (r = −0.21 to −0.37, P<0.05). Cardiorespiratory fitness (VO2max) was positively associated with LogEEPAmoderate (r = 0.26, P<0.05) and LogEEPAvigorous (r = 0.27). However, no association between VO2max with LogEEPAmoderate, LogEPPAvigorous and LogEEPAtotal was observed after adjusting for the percentage of body fat. Multiple stepwise regression analysis to predict VO2max from LogEEPAwalking, LogEEPAmoderate, LogEEPAvigorous, LogEEPAtotal, age and percentage of body fat (%fat) showed that the %fat alone explained 62% of the variance in VO2max and that the age added another 10%, while the other variables did not add predictive value to the model [VO2max  = 129.6−(25.1× Log %fat) − (34.0× Log age); SEE: 4.3 ml.kg−1. min−1; R2 = 0.72 (P<0.05)]. No positive association between muscular fitness-related variables and physical activity was observed, even after adjusting for body fat or body fat and age. Conclusions/Significance Adiposity and age are the strongest predictors of VO2max in healthy men. The energy expended in moderate and vigorous physical activities is inversely associated with adiposity. Muscular fitness does not appear to be associated with physical activity as assessed by the IPAQ. PMID:20976154
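
    As a worked example of the regression printed above (VO2max = 129.6 − 25.1·log(%fat) − 34.0·log(age), base-10 logarithms assumed), the following snippet evaluates it for a hypothetical subject; the input values are illustrative only.

```python
# Evaluate the reported VO2max regression for a hypothetical 30-year-old
# with 20% body fat (illustrative inputs, not study data).
import math

def predict_vo2max(percent_fat, age):
    return 129.6 - 25.1 * math.log10(percent_fat) - 34.0 * math.log10(age)

print(round(predict_vo2max(20.0, 30.0), 1))  # about 46.7 ml/kg/min
```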

  17. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  18. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  19. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  20. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  1. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  2. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  3. 105-KE Isolation Barrier Leak Rate Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCracken, K.J.

    1995-06-14

    This Acceptance Test Report (ATR) contains the completed and signed Acceptance Test Procedure (ATP) for the 105-KE Isolation Barrier Leak Rate Test. The Test Engineer's log, the completed sections of the ATP in the Appendix for Repeat Testing (Appendix K), the approved WHC J-7s (Appendix H), the data logger files (Appendices T and U), and the post-test calibration checks (Appendix V) are included.

  4. Gigabit Network Communications Research

    DTIC Science & Technology

    1992-12-31

    additional BPF channels, raw bytesync support for video codecs, and others. All source file modifications were logged with RCS. Source and object trees were...34 (RFCs). 20 RFCs were published this quarter: RFC 1366: Gerich, E., " Guidelines for Management of IP Address Space", Merit, October 1992. RFC 1367...Topolcic, C., "Schedule for IP Address Space Management Guidelines ", CNRI, October 1992. RFC 1368: McMaster, D. (Synoptics Communications, Inc.), K

  5. VizieR Online Data Catalog: Reference Catalogue of Bright Galaxies (RC1; de Vaucouleurs+ 1964)

    NASA Astrophysics Data System (ADS)

    de Vaucouleurs, G.; de Vaucouleurs, A.

    1995-11-01

    The Reference Catalogue of Bright Galaxies lists for each entry the following information: NGC number, IC number, or A number; A, B, or C designation; B1950.0 positions, position at 100 year precession; galactic and supergalactic positions; revised morphological type and source; type and color class in Yerkes list 1 and 2; Hubble-Sandage type; revised Hubble type according to Holmberg; logarithm of mean major diameter (log D) and ratio of major to minor diameter (log R) and their weights; logarithm of major diameter; sources of the diameters; David Dunlap Observatory type and luminosity class; Harvard photographic apparent magnitude; weight of V, B-V(0), U-B(0); integrated magnitude B(0) and its weight in the B system; mean surface brightness in magnitude per square minute of arc and sources for the B magnitude; mean B surface brightness derived from corrected Harvard magnitude; the integrated color index in the standard B-V system; "intrinsic" color index; sources of B-V and/or U-B; integrated color in the standard U-B system; observed radial velocity in km/sec; radial velocity corrected for solar motion in km/sec; sources of radial velocities; solar motion correction; and direct photographic source. The catalog was created by concatenating four files side by side. (1 data file).

  6. A compiler and validator for flight operations on NASA space missions

    NASA Astrophysics Data System (ADS)

    Fonte, Sergio; Politi, Romolo; Capria, Maria Teresa; Giardino, Marco; De Sanctis, Maria Cristina

    2016-07-01

    In NASA missions, the management and programming of the flight systems is performed with a specific scripting language, the SASF (Spacecraft Activity Sequence File). To check its syntax and grammar, a compiler is needed that highlights any errors found in the sequence file produced for an instrument on board the flight system. From our experience on the Dawn mission, we developed VIRV (VIR Validator), a tool that checks the syntax and grammar of SASF files, runs simulations of VIR acquisitions and flags any violations of the flight rules in the sequences produced. The SASF compiler project (SSC - Spacecraft Sequence Compiler) is now ready for a new implementation: generalization to different NASA missions. In fact, VIRV is a compiler for a dialect of SASF; it includes VIR commands as part of the SASF language. Our goal is to produce a general compiler for SASF, in which every instrument contributes a library that is introduced into the compiler. The SSC can analyze an SASF file, produce a log of events, simulate the instrument acquisition and check the flight rules for the selected instrument. The output of the program can be produced in GRASS GIS format and may help the operator to analyze the geometry of the acquisition.
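
    A minimal sketch of the per-instrument validation idea outlined above: each sequence command is looked up in an instrument "library" of known commands and flagged if unknown or if its parameter count does not match. The command table and line format are invented for illustration; real SASF syntax is considerably richer.

```python
# Validate a simplified command sequence against a per-instrument command table.
INSTRUMENT_LIBRARY = {
    "VIR_ACQUIRE": 3,   # hypothetical command name -> expected parameter count
    "VIR_STANDBY": 0,
}

def validate_sequence(lines, library=INSTRUMENT_LIBRARY):
    log = []
    for lineno, line in enumerate(lines, start=1):
        tokens = line.split()
        if not tokens:
            continue
        name, params = tokens[0], tokens[1:]
        if name not in library:
            log.append((lineno, f"unknown command {name}"))
        elif len(params) != library[name]:
            log.append((lineno, f"{name} expects {library[name]} parameters"))
    return log
```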

  7. Historical files from Federal government mineral exploration-assistance programs, 1950 to 1974

    USGS Publications Warehouse

    Frank, David G.

    2010-01-01

    Congress enacted the Defense Production Act in 1950 to provide funding and support for the exploration and development of critical mineral resources. From 1950 to 1974, three Department of the Interior agencies carried out this mission. Contracts with mine owners provided financial assistance for mineral exploration on a joint-participation basis. These contracts are documented in more than 5,000 'dockets' now archived online by the U.S. Geological Survey. This archive provides access to unique and difficult to recreate information, such as drill logs, assay results, and underground geologic maps, that is invaluable to land and resource management organizations and the minerals industry. An effort to preserve the data began in 2009, and the entire collection of dockets was electronically scanned. The scanning process used optical character recognition (OCR) when possible, and files were converted into Portable Document Format (.pdf) files, which require Adobe Reader or similar software for viewing. In 2010, the scans were placed online (http://minerals.usgs.gov/dockets/) and are available to download free of charge.

  8. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  9. Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao

    NASA Astrophysics Data System (ADS)

    Arredondo, I.; del Campo, M.; Echevarria, P.; Jugo, J.; Etxebarria, V.

    2013-10-01

    This work presents a multipurpose configurable control system which can be integrated in an EPICS control network, with this functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of managing the control hardware, setting up and communicating with the EPICS network, and storing the data. The reconfigurable nature of the controller is based on a single XML file, allowing any final user to easily modify and adjust the control system to specific requirements. The selected Java development environment ensures multiplatform operation and large versatility, even with regard to the hardware to be controlled. Specifically, this paper, focused on fast control based on a high-performance FPGA, also describes an application of the approach to the ESS Bilbao Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, as well as a general description of the Multipurpose Controller itself.
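
    The single-file configuration idea can be sketched as follows; the element and attribute names (device, register, epics_pv, logging) are invented for illustration and do not reflect the actual schema used by the Multipurpose Controller.

      # Sketch: configure a controller from one XML file (hypothetical schema).
      import xml.etree.ElementTree as ET

      CONFIG = """
      <controller name="bpm">
        <device register="0x10" epics_pv="BPM:POS:X"/>
        <device register="0x14" epics_pv="BPM:POS:Y"/>
        <logging path="bpm_data.h5" period_s="1.0"/>
      </controller>
      """

      root = ET.fromstring(CONFIG)
      devices = [(d.get("register"), d.get("epics_pv")) for d in root.findall("device")]
      log_cfg = root.find("logging").attrib
      print(root.get("name"), devices, log_cfg)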

  10. VizieR Online Data Catalog: X-ray sources in Hickson Compact Groups (Tzanavaris+, 2014)

    NASA Astrophysics Data System (ADS)

    Tzanavaris, P.; Gallagher, S. C.; Hornschemeier, A. E.; Fedotov, K.; Eracleous, M.; Brandt, W. N.; Desjardins, T. D.; Charlton, J. C.; Gronwall, C.

    2014-06-01

    By virtue of their selection criteria, Hickson Compact Groups (HCGs) constitute a distinct class among small galaxy agglomerations. The Hickson catalog (Hickson et al. 1992, Cat. VII/213) comprises 92 spectroscopically confirmed nearby compact groups with three or more members with accordant redshifts (i.e., within 1000km/s of the group mean). In this paper we present nine of these groups, for which both archival Chandra X-ray and Swift UVOT ultraviolet data are available. An observation log for the Chandra data is presented in Table 1. An observation log for the Swift UVOT data is presented in Tzanavaris et al. (2010ApJ...716..556T). In addition, note that in the present work we have included UVOT data for HCGs 90 and 92. (3 data files).

  11. Poster — Thur Eve — 30: 4D VMAT dose calculation methodology to investigate the interplay effect: experimental validation using TrueBeam Developer Mode and Gafchromic film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teke, T; Milette, MP; Huang, V

    2014-08-15

    The interplay effect between tumor motion and radiation beam modulation during a VMAT treatment delivery alters the delivered dose distribution from the planned one. This work presents and validates a method to accurately calculate the dose distribution in 4D, taking into account the tumor motion, the field modulation and the treatment starting phase. A QUASAR™ respiratory motion phantom was 4D scanned with a motion amplitude of 3 cm and a 3 second period. A static scan was also acquired with the lung insert and the tumor contained in it centered. A VMAT plan with a 6XFFF beam was created on the averaged CT and delivered on a Varian TrueBeam, and the trajectory log file was saved. From the trajectory log file, 10 VMAT plans (one for each breathing phase) and a Developer Mode XML file were created. For the 10 VMAT plans, the tumor motion was modeled by moving the isocentre on the static scan; the plans were re-calculated and summed in the treatment planning system. In Developer Mode, the tumor motion was simulated by moving the couch dynamically during the treatment. Gafchromic films were placed in the static QUASAR phantom and irradiated using Developer Mode. Different treatment starting phases were investigated (no phase shift, maximum inhalation and maximum exhalation). Calculated and measured isodose lines and profiles are in very good agreement. For each starting phase, the dose distributions exhibit significant differences but are accurately calculated with the methodology presented in this work.
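
    The per-phase reconstruction amounts to sorting the delivery record by respiratory phase. The sketch below bins time-stamped samples from a delivery log into 10 phases of a 3 s breathing cycle for a chosen starting phase; the sampling interval, the synthetic meterset trace and the array layout are assumptions for illustration, not the TrueBeam trajectory-log format.

      # Sketch: bin time-stamped delivery samples into 10 respiratory phases.
      # A 3 s period and a 20 ms sampling interval are assumed for illustration.
      import numpy as np

      period_s, n_phases, dt_s = 3.0, 10, 0.020
      start_phase = 0.5                      # e.g. start at maximum exhalation
      t = np.arange(0, 30, dt_s)             # 30 s of delivery samples
      mu = np.random.rand(t.size).cumsum()   # stand-in for cumulative meterset

      phase = (((t / period_s + start_phase) % 1.0) * n_phases).astype(int)
      mu_per_phase = np.bincount(phase, weights=np.diff(mu, prepend=0.0),
                                 minlength=n_phases)
      print(mu_per_phase)                    # meterset delivered in each phase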

  12. Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-05

    Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.
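
    A minimal sketch of the automatic-capture step is shown below, with SQLite standing in for the local or remote database server; the table layout is hypothetical.

      # Sketch: capture run metadata and store it, in the spirit of Co-PylotDB.
      # SQLite stands in for the real database server; the schema is hypothetical.
      import datetime
      import getpass
      import os
      import socket
      import sqlite3

      record = {
          "user": getpass.getuser(),
          "cwd": os.getcwd(),
          "host": socket.gethostname(),
          "sent_at": datetime.datetime.now().isoformat(timespec="seconds"),
          "comment": "example run",
      }

      con = sqlite3.connect("runs.db")
      con.execute("CREATE TABLE IF NOT EXISTS runs (user, cwd, host, sent_at, comment)")
      con.execute("INSERT INTO runs VALUES (:user, :cwd, :host, :sent_at, :comment)", record)
      con.commit()
      con.close()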

  13. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether it be in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers and modelers the data need to be stored in a format that is easily identifiable, accessible and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry standard relational database is comprised of spatial and temporal data tables, shape files and supporting metadata accessible over the network, through a menu driven web-based portal or spatially accessible through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  14. Measurement and visualization of file-to-wall contact during ultrasonically activated irrigation in simulated canals.

    PubMed

    Boutsioukis, C; Verhaagen, B; Walmsley, A D; Versluis, M; van der Sluis, L W M

    2013-11-01

    (i) To quantify in a simulated root canal model the file-to-wall contact during ultrasonic activation of an irrigant and to evaluate the effect of root canal size, file insertion depth, ultrasonic power, root canal level and previous training; (ii) to investigate the effect of file-to-wall contact on file oscillation. File-to-wall contact was measured during ultrasonic activation of the irrigant performed by 15 trained and 15 untrained participants in two metal root canal models. Results were analyzed by two 5-way mixed-design ANOVAs. The level of significance was set at P < 0.05. Additionally, high-speed visualizations, laser-vibrometer measurements and numerical simulations of the file oscillation were conducted. File-to-wall contact occurred in all cases during 20% of the activation time. Contact time was significantly shorter at high power (P < 0.001), when the file was positioned away from working length (P < 0.001), in the larger root canal (P < 0.001) and from coronal towards apical third of the root canal (P < 0.002), in most of the cases studied. Previous training did not show a consistent significant effect. File oscillation was affected by contact during 94% of the activation time. During wall contact, the file bounced back and forth against the wall at audible frequencies (ca. 5 kHz), but still performed the original 30 kHz oscillations. Travelling waves were identified on the file. The file oscillation was not dampened completely due to the contact, and hydrodynamic cavitation was detected. Considerable file-to-wall contact occurred during irrigant activation. Therefore, the term 'Passive Ultrasonic Irrigation' should be amended to 'Ultrasonically Activated Irrigation'. © 2013 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  15. Obesity History and Daily Patterns of Physical Activity at Age 60-64 Years: Findings From the MRC National Survey of Health and Development.

    PubMed

    Cooper, Rachel; Huang, Lei; Hardy, Rebecca; Crainiceanu, Adina; Harris, Tamara; Schrack, Jennifer A; Crainiceanu, Ciprian; Kuh, Diana

    2017-10-01

    The aim of this study was to investigate associations of current body mass index (BMI) and obesity history with daily patterns of physical activity. At age 60-64, participants from a British birth cohort study wore accelerometers for 5 days. Accelerometry counts were log-transformed and mean log-counts were used to derive a summary variable indicating total daily log-activity counts. Among those with complete data (n = 1,388) the associations of current BMI and age of first obesity were examined with: (a) total daily log-activity counts and (b) total log-activity counts in four segments of the day. Higher current BMI and younger age at obesity were strongly associated with lower levels of total daily activity at age 60-64 even after adjustment for sex, socioeconomic factors, and health status. The fully-adjusted mean difference in total daily log-activity counts was -581.7 (95% confidence interval: -757.2, -406.3) when comparing BMI ≥35 kg/m2 with <25 kg/m2, representing an 18.4% difference. Participants who had been obese since early adulthood had the lowest levels of activity (mean difference in total daily log-activity counts was -413.1 (-638.1, -188.2) when comparing those who were obese by age 26 or 36 with those who were never obese, representing a 13.1% difference). Obese older adults may require targeted interventions and additional support to improve their daily activity levels. As younger generations with greater lifetime exposure to obesity reach old age the proportion of adults achieving sufficient levels of activity to realize its associated health benefits is likely to decline. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America.

  16. Obesity History and Daily Patterns of Physical Activity at Age 60–64 Years: Findings From the MRC National Survey of Health and Development

    PubMed Central

    Cooper, Rachel; Huang, Lei; Hardy, Rebecca; Crainiceanu, Adina; Harris, Tamara; Schrack, Jennifer A; Crainiceanu, Ciprian; Kuh, Diana

    2017-01-01

    Abstract Background The aim of this study was to investigate associations of current body mass index (BMI) and obesity history with daily patterns of physical activity. Methods At age 60–64, participants from a British birth cohort study wore accelerometers for 5 days. Accelerometry counts were log-transformed and mean log-counts were used to derive a summary variable indicating total daily log-activity counts. Among those with complete data (n = 1,388) the associations of current BMI and age of first obesity were examined with: (a) total daily log-activity counts and (b) total log-activity counts in four segments of the day. Results Higher current BMI and younger age at obesity were strongly associated with lower levels of total daily activity at age 60–64 even after adjustment for sex, socioeconomic factors, and health status. The fully-adjusted mean difference in total daily log-activity counts was −581.7 (95% confidence interval: −757.2, −406.3) when comparing BMI ≥35 kg/m2 with <25 kg/m2, representing an 18.4% difference. Participants who had been obese since early adulthood had the lowest levels of activity (mean difference in total daily log-activity counts was −413.1 (−638.1, −188.2) when comparing those who were obese by age 26 or 36 with those who were never obese, representing a 13.1% difference). Conclusions Obese older adults may require targeted interventions and additional support to improve their daily activity levels. As younger generations with greater lifetime exposure to obesity reach old age the proportion of adults achieving sufficient levels of activity to realize its associated health benefits is likely to decline. PMID:28329086

  17. Experience with a Spanish-language laparoscopy website.

    PubMed

    Moreno-Sanz, Carlos; Seoane-González, Jose B

    2006-02-01

    Although there are no clearly defined electronic tools for continuing medical education (CME), new information technologies offer a basic platform for presenting training content on the internet. Due to the shortage of websites about minimally invasive surgery in the Spanish language, we set up a topical website in Spanish. This study considers the experience with the website between April 2001 and January 2005. To study the activity of the website, the registry information was analyzed descriptively using the log files of the server. To study the characteristics of the users, we searched the database of registered users. We found a total of 107,941 visits to our website and a total of 624,895 page downloads. Most visits to the site were made from Spanish-speaking countries. The most frequent professional profile of the registered users was that of general surgeon. The development, implementation, and evaluation of Spanish-language CME initiatives over the internet is promising but presents challenges.

  18. ILRS Station Reporting

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Pearlman, Michael Reisman; Torrence, Mark H.

    2013-01-01

    Network stations provided system configuration documentation upon joining the ILRS. This information, found in the various site and system log files available on the ILRS website, is essential to the ILRS analysis centers, combination centers, and general user community. Therefore, it is imperative that station personnel inform the ILRS community in a timely fashion when changes to the system occur. This poster provides some information about the various documentation that must be maintained. The ILRS network consists of over fifty global sites actively ranging to over sixty satellites as well as five lunar reflectors. Information about these stations is available on the ILRS website (http://ilrs.gsfc.nasa.gov/network/stations/index.html). The ILRS Analysis Centers must have current information about the stations and their system configuration in order to use their data in the generation of derived products. However, not all information available on the ILRS website is as up-to-date as necessary for correct analysis of their data.

  19. VizieR Online Data Catalog: PTPS stars. III. The evolved stars sample (Niedzielski+, 2016)

    NASA Astrophysics Data System (ADS)

    Niedzielski, A.; Deka-Szymankiewicz, B.; Adamczyk, M.; Adamow, M.; Nowak, G.; Wolszczan, A.

    2015-11-01

    We present basic atmospheric parameters (Teff, logg, vt and [Fe/H]), rotation velocities and absolute radial velocities as well as luminosities, masses, ages and radii for 402 stars (including 11 single-lined spectroscopic binaries), mostly subgiants and giants. For 272 of them we present parameters for the first time. For another 53 stars we present estimates of Teff and log g based on photometric calibrations. We also present basic properties of the complete list of 744 stars that form the PTPS evolved stars sample. We examined stellar masses for 1255 stars in five other planet searches and found some of them likely to be significantly overestimated. Applying our uniformly determined stellar masses, we confirm the apparent increase of companion masses for evolved stars, and we explain it, as well as the lack of close-in planets, by the limited effective radial-velocity precision for those stars due to activity. (5 data files).

  20. XRootD popularity on hadoop clusters

    NASA Astrophysics Data System (ADS)

    Meoni, Marco; Boccali, Tommaso; Magini, Nicolò; Menichetti, Luca; Giordano, Domenico; CMS Collaboration

    2017-10-01

    Performance data and metadata of the computing operations at the CMS experiment are collected through a distributed monitoring infrastructure, currently relying on a traditional Oracle database system. This paper shows how to harness Big Data architectures in order to improve the throughput and the efficiency of such monitoring. A large set of operational data - user activities, job submissions, resources, file transfers, site efficiencies, software releases, network traffic, machine logs - is being injected into a readily available Hadoop cluster via several data streamers. The collected metadata is further organized by running fast arbitrary queries; this offers the ability to test several MapReduce-based frameworks and measure the system speed-up compared to the original database infrastructure. By leveraging a quality Hadoop data store and enabling an analytics framework on top, it is possible to design a mining platform to predict dataset popularity and discover patterns and correlations.
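
    The popularity mining ultimately reduces to aggregating access records per dataset. The sketch below shows that map/reduce-style aggregation in plain Python on invented records; the field names are placeholders, not the actual CMS monitoring schema, and the real pipeline runs on a Hadoop cluster.

      # Sketch: dataset-popularity aggregation over access records.
      # Field names are illustrative; the real pipeline runs on Hadoop.
      from collections import Counter

      records = [
          {"dataset": "/A/RAW", "user": "u1", "bytes": 10},
          {"dataset": "/B/AOD", "user": "u2", "bytes": 5},
          {"dataset": "/A/RAW", "user": "u3", "bytes": 7},
      ]

      accesses = Counter(r["dataset"] for r in records)   # access count per dataset
      traffic = Counter()                                 # bytes read per dataset
      for r in records:
          traffic[r["dataset"]] += r["bytes"]

      print(accesses.most_common(), traffic.most_common())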

  1. A depth video sensor-based life-logging human activity recognition system for elderly care in smart indoor environments.

    PubMed

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-07-02

    Recent advancements in depth video sensors technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors which produce depth or distance information. In this paper, a depth-based life logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced which are further used for activity recognition and generating their life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital.
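
    One common way to realize the HMM stage is to train one model per activity on its feature sequences and classify a new sequence by maximum log-likelihood. The sketch below does this with the hmmlearn package on random stand-in features; the feature dimensions and model settings are assumptions, and the real system uses skeleton-joint features extracted from depth silhouettes.

      # Sketch: one HMM per activity, classification by maximum log-likelihood.
      # Random vectors stand in for the skeleton-joint features used in the paper.
      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      rng = np.random.default_rng(0)
      train = {"walk": rng.normal(0, 1, (200, 6)),
               "sit":  rng.normal(2, 1, (200, 6))}

      models = {}
      for activity, feats in train.items():
          m = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
          m.fit(feats)
          models[activity] = m

      test = rng.normal(2, 1, (50, 6))       # unseen feature sequence
      pred = max(models, key=lambda a: models[a].score(test))
      print("recognized activity:", pred)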

  2. A Depth Video Sensor-Based Life-Logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments

    PubMed Central

    Jalal, Ahmad; Kamal, Shaharyar; Kim, Daijin

    2014-01-01

    Recent advancements in depth video sensors technologies have made human activity recognition (HAR) realizable for elderly monitoring applications. Although conventional HAR utilizes RGB video sensors, HAR could be greatly improved with depth video sensors which produce depth or distance information. In this paper, a depth-based life logging HAR system is designed to recognize the daily activities of elderly people and turn these environments into an intelligent living space. Initially, a depth imaging sensor is used to capture depth silhouettes. Based on these silhouettes, human skeletons with joint information are produced which are further used for activity recognition and generating their life logs. The life-logging system is divided into two processes. Firstly, the training system includes data collection using a depth camera, feature extraction and training for each activity via Hidden Markov Models. Secondly, after training, the recognition engine starts to recognize the learned activities and produces life logs. The system was evaluated using life logging features against principal component and independent component features and achieved satisfactory recognition rates against the conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring system, such as monitoring healthcare problems for elderly people, or examining the indoor activities of people at home, office or hospital. PMID:24991942

  3. Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse

    NASA Technical Reports Server (NTRS)

    Gannod, Gerald C.

    2002-01-01

    This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.

  4. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administrating a large scale, multi protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (increasing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers onto a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. Having flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.
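
    The core operation of the data-mining layer, reassembling a transaction from log lines scattered across servers and files, can be sketched as a grouping by transaction identifier; the log layout below is invented for illustration and is not the CASTOR log format.

      # Sketch: group scattered log lines into per-transaction records.
      # The 'host ts txn_id message' layout is illustrative, not the CASTOR format.
      from collections import defaultdict

      lines = [
          "tps01 10:00:01 TXN42 mount requested",
          "tps02 10:00:05 TXN43 mount requested",
          "tps01 10:03:40 TXN42 file recalled",
          "tps01 10:03:41 TXN42 dismount",
      ]

      transactions = defaultdict(list)
      for line in lines:
          host, ts, txn, msg = line.split(" ", 3)
          transactions[txn].append((ts, host, msg))

      for txn, events in transactions.items():
          print(txn, sorted(events))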

  5. New method for calculating a mathematical expression for streamflow recession

    USGS Publications Warehouse

    Rutledge, Albert T.

    1991-01-01

    An empirical method has been devised to calculate the master recession curve, which is a mathematical expression for streamflow recession during times of negligible direct runoff. The method is based on the assumption that the storage-delay factor, which is the time per log cycle of streamflow recession, varies linearly with the logarithm of streamflow. The resulting master recession curve can be nonlinear. The method can be executed by a computer program that reads a data file of daily mean streamflow, then allows the user to select several near-linear segments of streamflow recession. The storage-delay factor for each segment is one of the coefficients of the equation that results from linear least-squares regression. Using results for each recession segment, a mathematical expression of the storage-delay factor as a function of the log of streamflow is determined by linear least-squares regression. The master recession curve, which is a second-order polynomial expression for time as a function of log of streamflow, is then derived using the coefficients of this function.
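
    A compact sketch of the computation is shown below, under the assumption that near-linear recession segments have already been selected from the daily-streamflow file; the synthetic data and variable names are illustrative.

      # Sketch: master recession curve from near-linear recession segments.
      # Each segment is (days, daily mean streamflow); the data here are synthetic.
      import numpy as np

      segments = [
          (np.arange(10), 100.0 * 10 ** (-np.arange(10) / 40.0)),   # K ~ 40 days per log cycle
          (np.arange(10),  10.0 * 10 ** (-np.arange(10) / 60.0)),   # K ~ 60 days per log cycle
      ]

      K, logQ_mid = [], []
      for t, q in segments:
          slope = np.polyfit(t, np.log10(q), 1)[0]   # d(logQ)/dt for the segment
          K.append(-1.0 / slope)                     # storage-delay factor, days per log cycle
          logQ_mid.append(np.log10(q).mean())

      b, a = np.polyfit(logQ_mid, K, 1)              # K = a + b*logQ (linear least squares)
      # Since dt/d(logQ) = -K, the master curve is a 2nd-order polynomial in logQ:
      #   t(logQ) = t0 - a*logQ - (b/2)*logQ**2   (t0 fixed by a reference discharge)
      print(f"K = {a:.1f} + {b:.1f} * logQ")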

  6. Preferred computer activities among individuals with dementia: a pilot study.

    PubMed

    Tak, Sunghee H; Zhang, Hongmei; Hong, Song Hee

    2015-03-01

    Computers offer new activities that are easily accessible, cognitively stimulating, and enjoyable for individuals with dementia. The current descriptive study examined preferred computer activities among nursing home residents with different severity levels of dementia. A secondary data analysis was conducted using activity observation logs from 15 study participants with dementia (severe = 115 logs, moderate = 234 logs, and mild = 124 logs) who participated in a computer activity program. Significant differences existed in preferred computer activities among groups with different severity levels of dementia. Participants with severe dementia spent significantly more time watching slide shows with music than those with both mild and moderate dementia (F [2,12] = 9.72, p = 0.003). Preference in playing games also differed significantly across the three groups. It is critical to consider individuals' interests and functional abilities when computer activities are provided for individuals with dementia. A practice guideline for tailoring computer activities is detailed. Copyright 2015, SLACK Incorporated.

  7. TH-A-9A-10: Prostate SBRT Delivery with Flattening-Filter-Free Mode: Benefit and Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, T; Yuan, L; Sheng, Y

    Purpose: Flattening-filter-free (FFF) beam mode offered on the TrueBeam™ linac enables delivering IMRT at a 2400 MU/min dose rate. This study investigates the benefit and delivery accuracy of using this high dose rate in the context of prostate SBRT. Methods: 8 prostate SBRT patients were retrospectively studied. In 5 cases treated with a 600-MU/min dose rate, continuous prostate motion data acquired during radiation-beam-on was used to analyze motion range. In addition, the initial 1/3 of the prostate motion trajectories during each radiation-beam-on was separated to simulate the motion range if 2400-MU/min were used. To analyze delivery accuracy in FFF mode, MLC trajectory log files from an additional 3 cases treated at 2400-MU/min were acquired. These log files record MLC expected and actual positions every 20 ms, and therefore can be used to reveal delivery accuracy. Results: (1) Benefit. On average, treatment at 600-MU/min takes 30 s per beam, whereas 2400-MU/min requires only 11 s. When shortening delivery time to ~1/3, the prostate motion range was significantly smaller (p<0.001). The largest motion reduction occurred in the Sup-Inf direction, from [−3.3mm, 2.1mm] to [−1.7mm, 1.7mm], followed by a reduction from [−2.1mm, 2.4mm] to [−1.0mm, 2.4mm] in the Ant-Pos direction. No change was observed in the LR direction [−0.8mm, 0.6mm]. The combined motion amplitude (vector norm) confirms that average motion and ranges are significantly smaller when beam-on was limited to the first 1/3 of the actual delivery time. (2) Accuracy. Trajectory log file analysis showed excellent delivery accuracy at 2400 MU/min. Most leaf deviations during beam-on were within 0.07 mm (99th percentile). Maximum leaf-opening deviations during each beam-on were all under 0.1 mm for all leaves. Dose rate was maintained at 2400-MU/min during beam-on without dipping. Conclusion: Delivering prostate SBRT at 2400 MU/min is both beneficial and accurate. The high dose rate significantly reduced both treatment time and intra-beam prostate motion range. Excellent delivery accuracy was confirmed with very small leaf motion deviations.
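
    The deviation analysis on the log samples can be sketched as below; the synthetic arrays stand in for the expected and actual leaf positions recorded every 20 ms, and the beam-hold mask and units are assumptions.

      # Sketch: leaf-position deviation statistics from a trajectory log.
      # Synthetic arrays stand in for the logged expected/actual MLC positions.
      import numpy as np

      rng = np.random.default_rng(1)
      expected = rng.uniform(-5, 5, size=(1500, 120))        # samples x leaves (cm)
      actual = expected + rng.normal(0, 0.002, expected.shape)
      beam_on = np.ones(1500, dtype=bool)                    # beam-hold mask (all on here)

      dev = np.abs(actual[beam_on] - expected[beam_on]) * 10  # cm -> mm
      print("99th percentile deviation: %.3f mm" % np.percentile(dev, 99))
      print("maximum deviation over all leaves: %.3f mm" % dev.max())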

  8. Coastal single-beam bathymetry data collected in 2015 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Stalk, Chelsea A.; DeWitt, Nancy T.; Bernier, Julie C.; Kindinger, Jack G.; Flocks, James G.; Miselis, Jennifer L.; Locker, Stanley D.; Kelso, Kyle W.; Tuten, Thomas M.

    2017-02-23

    As part of the Louisiana Coastal Protection and Restoration Authority (CPRA) Barrier Island Comprehensive Monitoring Program, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a single-beam bathymetry survey around the Chandeleur Islands, Louisiana, in June 2015. The goal of the program is to provide long-term data on Louisiana’s barrier islands and use this data to plan, design, evaluate, and maintain current and future barrier island restoration projects. The data described in this report, along with (1) USGS bathymetry data collected in 2013 as a part of the Barrier Island Evolution Research project covering the northern Chandeleur Islands, and (2) data collected in 2014 in collaboration with the Louisiana CPRA Barrier Island Comprehensive Monitoring Program around Breton Island, will be used to assess bathymetric change since 2006‒2007 as well as serve as a bathymetric control in supporting modeling of future changes in response to restoration and storm impacts. The survey area encompasses approximately 435 square kilometers of nearshore and back-barrier environments around Hewes Point, the Chandeleur Islands, and Curlew and Grand Gosier Shoals. This Data Series serves as an archive of processed single-beam bathymetry data, collected in the nearshore of the Chandeleur Islands, Louisiana, from June 17‒24, 2015, during USGS Field Activity Number 2015-317-FA. Geographic information system data products include a 200-meter-cell-size interpolated bathymetry grid, trackline maps, and xyz point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  9. Geologic Map of Prescott National Forest and the Headwaters of the Verde River, Yavapai and Coconino Counties, Arizona

    USGS Publications Warehouse

    DeWitt, Ed; Langenheim, V.E.; Force, Eric; Vance, R.K.; Lindberg, P.A.; Driscoll, R.L.

    2008-01-01

    This 1:100,000-scale digital geologic map details the complex Early Proterozoic metavolcanic and plutonic basement of north-central Arizona; shows the mildly deformed cover of Paleozoic rocks; reveals where Laramide to mid-Tertiary plutonic rocks associated with base- and precious-metal deposits are exposed; subdivides the Tertiary volcanic rocks according to chemically named units; and maps the Pliocene to Miocene fill of major basins. Associated digital files include more than 1,300 geochemical analyses of all rock units; 1,750 logs of water wells deeper than 300 feet; and interpreted logs of 300 wells that define the depth to basement in major basins. Geophysically interpreted buried features include normal faults defining previously unknown basins, mid-Tertiary intrusive rocks, and half-grabens within shallow basins.

  10. Preliminary report on geophysical well-logging activity on the Salton Sea Scientific Drilling Project, Imperial Valley, California

    USGS Publications Warehouse

    Paillet, Frederick L.; Morin, R.H.; Hodges, H.E.

    1986-01-01

    The Salton Sea Scientific Drilling Project has culminated in a 10,564-ft deep test well, State 2-14 well, in the Imperial Valley of southern California. A comprehensive scientific program of drilling, coring, and downhole measurements, which was conducted for about 5 months, has obtained much scientific information concerning the physical and chemical processes associated with an active hydrothermal system. This report primarily focuses on the geophysical logging activities at the State 2-14 well and provides early dissemination of geophysical data to other investigators working on complementary studies. Geophysical-log data were obtained by a commercial logging company and by the U.S. Geological Survey (USGS). Most of the commercial logs were obtained during three visits to the site; only one commercial log was obtained below a depth of 6,000 ft. The commercial logs obtained were dual induction, natural gamma, compensated neutron formation density, caliper and sonic. The USGS logging effort consisted of four primary periods, with many logs extending below a depth of 6,000 ft. The USGS logs obtained were temperature, caliper, natural gamma, gamma spectral, epithermal neutron, acoustic velocity, full-waveform, and acoustic televiewer. Various problems occurred throughout the drilling phase of the Salton Sea Scientific Drilling Project that made successful logging difficult: (1) borehole constrictions, possibly resulting from mud coagulation, (2) maximum temperatures of about 300 C, and (3) borehole conditions unfavorable for logging because of numerous zones of fluid loss, cement plugs, and damage caused by repeated trips in and out of the hole. These factors hampered and compromised logging quality at several open-hole intervals. The quality of the logs was dependent on the degree of probe sophistication and sensitivity to borehole-wall conditions. Digitized logs presented were processed on site and are presented in increments of 1,000 ft. A summary of the numerous factors that may be relevant to this interpretation also is presented. (Lantz-PTT)

  11. The AVO Website - a Comprehensive Tool for Information Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Snedigar, S.; Cameron, C.; Nye, C. J.

    2008-12-01

    The Alaska Volcano Observatory (AVO) website serves as a primary information management, browsing, and dissemination tool. It is database-driven, thus easy to maintain and update. There are two different, yet fully integrated parts of the website. An external site (www.avo.alaska.edu) allows the general public to track eruptive activity by viewing the latest photographs, webcam images, seismic data, and official information releases about the volcano, as well as maps, previous eruption information, and bibliographies. This website is also the single most comprehensive source of Alaska volcano information available. The database now contains 14,000 images, 3,300 of which are publicly viewable, and 4,300 bibliographic citations, many linked to full-text downloadable files. The internal portion of the website is essential to routine observatory operations, and hosts browse images of diverse geophysical and geological data in a format accessible by AVO staff regardless of location. An observation log allows users to enter information about anything from satellite passes to seismic activity to ash fall reports into a searchable database, and has become the permanent record of observatory function. The individual(s) on duty at home, at the watch office, or elsewhere use forms on the internal website to log information about volcano activity. These data are then automatically parsed into a number of primary activity notices, which are the formal communication to appropriate agencies and interested individuals. Geochemistry, geochronology, and geospatial data modules are currently being developed. The website receives over 100 million hits and serves 1,300 GB of data annually. It is dynamically generated from a MySQL database with over 300 tables and several thousand lines of PHP code which write the actual web display. The primary webserver is housed at (but not owned by) the University of Alaska Fairbanks, and currently holds 200 GB of data. Webcam images, webicorder graphs, earthquake location plots, and spectrograms are pulled and generated by other servers in Fairbanks and Anchorage.

  12. SU-F-T-465: Two Years of Radiotherapy Treatments Analyzed Through MLC Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Defoor, D; Kabat, C; Papanikolaou, N

    Purpose: To present treatment statistics of a Varian Novalis Tx using more than 90,000 Varian Dynalog files collected over the past 2 years. Methods: Varian Dynalog files are recorded for every patient treated on our Varian Novalis Tx. The files are collected and analyzed daily to check interfraction agreement of treatment deliveries. This is accomplished by creating fluence maps from the data contained in the Dynalog files. From the Dynalog files we have also compiled statistics for treatment delivery times, MLC errors, gantry errors and collimator errors. Results: The mean treatment time for VMAT patients was 153 ± 86 seconds, while the mean treatment time for step & shoot was 256 ± 149 seconds. Patients' treatment times showed a variation of 0.4% over their treatment course for VMAT and 0.5% for step & shoot. The average field sizes were 40 cm2 and 26 cm2 for VMAT and step & shoot, respectively. VMAT beams contained an average overall leaf travel of 34.17 meters, and step & shoot beams averaged less than half of that at 15.93 meters. When comparing planned and delivered fluence maps generated using the Dynalog files, VMAT plans showed an average gamma passing percentage of 99.85 ± 0.47. Step & shoot plans showed an average gamma passing percentage of 97.04 ± 0.04. 5.3% of beams contained an MLC error greater than 1 mm and 2.4% had an error greater than 2 mm. The mean gantry speed for VMAT plans was 1.01 degrees/s with a maximum of 6.5 degrees/s. Conclusion: Varian Dynalog files are useful for monitoring machine performance and treatment parameters. The Dynalog files have shown that the performance of the Novalis Tx is consistent over the course of a patient's treatment, with only slight variations in patient treatment times and a low rate of MLC errors.
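
    The fluence-map step amounts to accumulating, for every log sample, the meterset delivered through the aperture open between opposing leaves. The sketch below shows the idea on synthetic leaf positions; the grid, leaf geometry and weighting are simplifications, not the exact fluence model used in this work.

      # Sketch: accumulate a fluence map from per-sample MLC apertures.
      # Leaf positions and meterset are synthetic; 1 mm grid, one row per leaf pair.
      import numpy as np

      n_samples, n_pairs, width_mm = 500, 60, 200
      x = np.arange(-width_mm / 2, width_mm / 2)            # 1 mm fluence grid
      rng = np.random.default_rng(2)
      left = rng.uniform(-50, 0, (n_samples, n_pairs))      # bank A leaf tips (mm)
      right = rng.uniform(0, 50, (n_samples, n_pairs))      # bank B leaf tips (mm)
      dmu = np.full(n_samples, 1.0 / n_samples)             # meterset per sample

      fluence = np.zeros((n_pairs, x.size))
      for s in range(n_samples):
          open_mask = (x[None, :] > left[s, :, None]) & (x[None, :] < right[s, :, None])
          fluence += dmu[s] * open_mask
      print(fluence.shape, fluence.max())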

  13. The three-dimensional structure of "Lonely Guy" from Claviceps purpurea provides insights into the phosphoribohydrolase function of Rossmann fold-containing lysine decarboxylase-like proteins.

    PubMed

    Dzurová, Lenka; Forneris, Federico; Savino, Simone; Galuszka, Petr; Vrabka, Josef; Frébort, Ivo

    2015-08-01

    The recently discovered cytokinin (CK)-specific phosphoribohydrolase "Lonely Guy" (LOG) is a key enzyme of CK biosynthesis, converting inactive CK nucleotides into biologically active free bases. We have determined the crystal structures of LOG from Claviceps purpurea (cpLOG) and its complex with the enzymatic product phosphoribose. The structures reveal a dimeric arrangement of Rossmann folds, with the ligands bound to large pockets at the interface between cpLOG monomers. Structural comparisons highlight the homology of cpLOG to putative lysine decarboxylases. Extended sequence analysis enabled identification of a distinguishing LOG sequence signature. Taken together, our data suggest phosphoribohydrolase activity for several proteins of unknown function. © 2015 Wiley Periodicals, Inc.

  14. Influence of environmental factors on activity patterns of Incisitermes minor (Isoptera: Kalotermitidae) in naturally infested logs.

    PubMed

    Lewis, Vernard R; Leighton, Shawn; Tabuchi, Robin; Baldwin, James A; Haverty, Michael I

    2013-02-01

    Acoustic emission (AE) activity patterns were measured from seven loquat [Eriobotrya japonica (Thunb.) Lindl.] logs, five containing live western drywood termite [Incisitermes minor (Hagen)] infestations, and two without an active drywood termite infestation. AE activity, as well as temperature, was monitored every 3 min under unrestricted ambient conditions in a small wooden building, under unrestricted ambient conditions but in constant darkness, or in a temperature-controlled cabinet under constant darkness. Logs with active drywood termite infestations displayed similar diurnal cycles of AE activity that closely followed temperature, with a peak of AE activity late in the afternoon (1700-1800 hours). When light was excluded from the building, a circadian pattern continued and apparently was driven by temperature. When the seven logs were kept at a relatively constant temperature (approximately 23 +/- 0.9 degrees C) and constant darkness, the pattern of activity was closely correlated with temperature, even with minimal changes in temperature. Temperature is the primary driver of activity of these drywood termites, but the effects are different when temperature is increasing or decreasing. At constant temperature, AE activity was highly correlated with the number of termites in the logs. The possible implications of these findings for our understanding of drywood termite biology and how this information may affect inspections and posttreatment evaluations are discussed.

  15. VizieR Online Data Catalog: CCD {Delta}a-photometry of 5 open clusters (Paunzen+, 2003)

    NASA Astrophysics Data System (ADS)

    Paunzen, E.; Pintado, O. I.; Maitzen, H. M.

    2004-01-01

    Observations of the five open clusters were performed with the Bochum 61cm (ESO-La Silla), the Helen-Sawyer-Hogg 61cm telescope (UTSO-Las Campanas Observatory), the 2.15m telescope at the Complejo Astronomico el Leoncito (CASLEO) and the L. Figl Observatory (FOA) with the 150cm telescope on Mt. Schopfl (Austria) using the multimode instrument OEFOSC (see the observation log in Table 1). (5 data files).

  16. Ending the U.S. War in Iraq: The Final Transition, Operational Maneuver, and Disestablishment of United States Forces-Iraq

    DTIC Science & Technology

    2013-01-01

    management survey and ensure that all databases (military and contracted civilian), key leader engagement logs, assistance project files, and other...Princeton University Press, 2000; Michael I. Handel, War Termination—A Critical Survey , Jeru- salem: Hebrew University, 1978; Jane Holl Lute, From the...DoS did not plan to install permanent and more costly security measures.133 Security surveys undertaken collaboratively by USF-I and multiple

  17. Evaluation of electrical impedance ratio measurements in accuracy of electronic apex locators.

    PubMed

    Kim, Pil-Jong; Kim, Hong-Gee; Cho, Byeong-Hoon

    2015-05-01

    The aim of this paper was to evaluate, through a correlation analysis, the ratios of electrical impedance measurements reported in previous studies, in order to establish the ratio as a contributing factor to the accuracy of electronic apex locators (EALs). The literature regarding electrical property measurements of EALs was screened using Medline and Embase. All data acquired were plotted to identify correlations between impedance and log-scaled frequency. The accuracy of the impedance ratio method used to detect the apical constriction (APC) in most EALs was evaluated using linear ramp function fitting. Changes of impedance ratios for various frequencies were evaluated for a variety of file positions. Among the ten papers selected in the search process, the first-order relations between log-scaled frequency and impedance all had negative slopes. When the model for the ratios was assumed to be a linear ramp function, the ratio values decreased as the file went deeper, and the average ratio values of the left and right horizontal zones were significantly different in 8 out of 9 studies. The APC was located within the interval of linear relation between the left and right horizontal zones of the linear ramp model. Using the ratio method, the APC was located within a linear interval. Therefore, using the impedance ratio between electrical impedance measurements at different frequencies was a robust method for detection of the APC.
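
    The linear ramp model for ratio versus file depth can be fitted directly. The sketch below fits a four-parameter ramp (two plateaus joined by a linear interval) to synthetic ratio measurements with scipy; the parameter values and noise level are invented for illustration.

      # Sketch: fit a linear ramp (plateau - linear drop - plateau) to
      # impedance-ratio measurements versus file depth. Data are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def ramp(d, d1, d2, r_hi, r_lo):
          """Flat at r_hi before min(d1, d2), flat at r_lo after max(d1, d2)."""
          lo, hi = sorted([d1, d2])
          return np.interp(d, [lo, hi], [r_hi, r_lo])

      depth = np.linspace(-3, 3, 61)                 # file position relative to a reference (mm)
      truth = ramp(depth, -1.0, 1.0, 1.6, 1.0)
      ratio = truth + np.random.default_rng(4).normal(0, 0.02, depth.size)

      popt, _ = curve_fit(ramp, depth, ratio, p0=[-0.5, 0.5, 1.5, 1.1])
      print("fitted breakpoints and plateaus:", np.round(popt, 2))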

  18. Modulation indices for volumetric modulated arc therapy.

    PubMed

    Park, Jong Min; Park, So-Yeon; Kim, Hyoungnyoun; Kim, Jin Ho; Carlson, Joel; Ye, Sung-Joon

    2014-12-07

    The aim of this study is to present a modulation index (MI) for volumetric modulated arc therapy (VMAT) based on the speed and acceleration analysis of modulating-parameters such as multi-leaf collimator (MLC) movements, gantry rotation and dose-rate, comprehensively. The performance of the presented MI (MIt) was evaluated with correlation analyses to the pre-treatment quality assurance (QA) results, differences in modulating-parameters between VMAT plans versus dynamic log files, and differences in dose-volumetric parameters between VMAT plans versus reconstructed plans using dynamic log files. For comparison, the same correlation analyses were performed for the previously suggested modulation complexity score (MCS(v)), leaf travel modulation complexity score (LTMCS) and MI by Li and Xing (MI Li&Xing). In the two-tailed unpaired parameter condition, p values were acquired. The Spearman's rho (r(s)) values of MIt, MCSv, LTMCS and MI Li&Xing to the local gamma passing rate with 2%/2 mm criterion were -0.658 (p < 0.001), 0.186 (p = 0.251), 0.312 (p = 0.05) and -0.455 (p = 0.003), respectively. The values of rs to the modulating-parameter (MLC positions) differences were 0.917, -0.635, -0.857 and 0.795, respectively (p < 0.001). For dose-volumetric parameters, MIt showed higher statistically significant correlations than the conventional MIs. The MIt showed good performance for the evaluation of the modulation-degree of VMAT plans.
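
    The MIt is built from the speed and acceleration of the modulating parameters. The sketch below computes leaf speeds and accelerations from control-point positions and combines them into a simple variability score; it illustrates the ingredients only and is not the published MIt formula.

      # Sketch: speed/acceleration ingredients of a VMAT modulation index.
      # Synthetic leaf trajectories; this is not the published MIt definition.
      import numpy as np

      rng = np.random.default_rng(3)
      n_cp, n_leaves, dt_s = 178, 120, 1.0          # control points, leaves, time step
      pos_mm = rng.normal(0, 5, (n_cp, n_leaves)).cumsum(axis=0)

      speed = np.abs(np.diff(pos_mm, axis=0)) / dt_s            # mm/s per segment
      accel = np.abs(np.diff(speed, axis=0)) / dt_s             # mm/s^2 per segment

      # A simple variability score: fraction of segments exceeding thresholds.
      score = 0.5 * (speed > 5).mean() + 0.5 * (accel > 5).mean()
      print(f"mean speed {speed.mean():.1f} mm/s, illustrative index {score:.3f}")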

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sublet, J.-Ch.; Koning, A.J.; Forrest, R.A.

    The reasons for the conversion of the European Activation File (EAF) into ENDF-6 format are threefold. First, it significantly enhances the JEFF-3.0 release by the addition of an activation file. Second, it considerably increases its usage through a recognized, official file format, allowing existing plug-in processes to be effective. Third, it moves towards a universal nuclear data file, in contrast to the current separate general- and special-purpose files. The format chosen for the JEFF-3.0/A file uses reaction cross sections (MF-3), cross sections (MF-10), and multiplicities (MF-9). Having the data in ENDF-6 format allows the ENDF suite of utilities and checker codes to be used alongside many other utility, visualizing, and processing codes. It is based on the EAF activation file used for many applications from fission to fusion, including dosimetry, inventories, depletion-transmutation, and geophysics. JEFF-3.0/A takes advantage of four generations of EAF files. Extensive benchmarking activities on these files provide feedback and validation with integral measurements. These, in parallel with a detailed graphical analysis based on EXFOR, have been applied, stimulating new measurements and significantly increasing the quality of this activation file. The next step is to include the EAF uncertainty data for all channels into JEFF-3.0/A.

  20. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system.

    PubMed

    Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H

    2011-02-01

    Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source for verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the American College of Graduate Medical Education (ACGME). We conducted a systematic review of ACGME requirements and our AIMS record, and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories and generated AIMS-based case logs and compared these to assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
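
    The CPT4-mapping approach reduces to a lookup from billed procedure codes to ACGME case categories, with free-text parsing as a fallback. The code-to-category table and keywords below are placeholders for illustration, not a real mapping.

      # Sketch: map a case to an ACGME category from its CPT4 codes, falling
      # back to keyword parsing of the free-text procedure description.
      # Codes, categories and keywords here are placeholders, not a real mapping.
      CPT_TO_CATEGORY = {"12345": "Intrathoracic", "23456": "Intracranial"}
      KEYWORDS = {"craniotomy": "Intracranial", "thoracotomy": "Intrathoracic"}

      def categorize(cpt_codes, description):
          for code in cpt_codes:
              if code in CPT_TO_CATEGORY:
                  return CPT_TO_CATEGORY[code]
          text = description.lower()
          for word, category in KEYWORDS.items():
              if word in text:
                  return category
          return "uncategorized"

      print(categorize(["99999"], "Left frontal craniotomy for tumor"))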

  1. VizieR Online Data Catalog: Astron low resolution UV spectra (Boyarchuk+, 1994)

    NASA Astrophysics Data System (ADS)

    Boyarchuk, A. A.

    2017-05-01

    Astron was a Soviet spacecraft launched on 23 March 1983, and it was operational for eight years as the largest ultraviolet space telescope of its time. Astron's payload consisted of an 80 cm ultraviolet telescope, Spica, and an X-ray spectroscope. We present 159 low resolution spectra of stars obtained during the Astron space mission (Tables 4, 5; hereafter table numbers in Boyarchuk et al. 1994 are given). Table 4 (observational log, logs.dat) contains data on 142 sessions for 90 stars (sorted in ascending order of RA), where the SED was obtained by the scanning method, and then data on 17 sessions for 15 stars (also sorted in ascending order of RA), where multicolor photometry was done. Kilpio et al. (2016, Baltic Astronomy 25, 23) presented results of the comparison of Astron data to modern UV stellar data, discussed Astron precision and accuracy, and made some conclusions on potential application areas of these data. Also 34 sessions of observations of 27 stellar systems (galaxies and globular clusters) are presented; the observational log was published in Table 10 and the data in Table 11. In addition, 16 sessions of observations of 12 nebulae are presented (Table 12 for the observational log and Table 13 for the data). Background radiation intensity data (observational log in Table 14) are presented in Table 15. Finally, data on comets are presented in different forms. Note that the observational data for stars, stellar systems, nebulae and comets are expressed in log [erg/s/cm^2/A], while for the comet data units of 10E-13 erg/s/cm^2/A are used, the hydroxyl band photometric data for comets are expressed in log [erg/s/cm^2], and the background data are radiation intensities expressed in log [erg/s/cm^2/A/sr]. The scanned (PDF) version of the Boyarchuk et al. (1994) book is available at http://www.inasan.ru/~astron/astron.pdf (12 data files).

  2. Optimal File-Distribution in Heterogeneous and Asymmetric Storage Networks

    NASA Astrophysics Data System (ADS)

    Langner, Tobias; Schindelhauer, Christian; Souza, Alexander

    We consider an optimisation problem which is motivated from storage virtualisation in the Internet. While storage networks make use of dedicated hardware to provide homogeneous bandwidth between servers and clients, in the Internet, connections between storage servers and clients are heterogeneous and often asymmetric with respect to upload and download. Thus, for a large file, the question arises how it should be fragmented and distributed among the servers to grant "optimal" access to the contents. We concentrate on the transfer time of a file, which is the time needed for one upload and a sequence of n downloads, using a set of m servers with heterogeneous bandwidths. We assume that fragments of the file can be transferred in parallel to and from multiple servers. This model yields a distribution problem that examines the question of how these fragments should be distributed onto those servers in order to minimise the transfer time. We present an algorithm, called FlowScaling, that finds an optimal solution within running time {O}(m log m). We formulate the distribution problem as a maximum flow problem, which involves a function that states whether a solution with a given transfer time bound exists. This function is then used with a scaling argument to determine an optimal solution within the claimed time complexity.

  3. D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, D.; /Fermilab

    1998-06-09

    This Dzero Engineering note describes the method by which the 2 Tesla Superconducting Solenoid fast dump and slow dump data are accumulated, tracked and stored. The 2 Tesla Solenoid has eleven data points that need to be tracked and then stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the Solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated values) file which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.

  4. Pizza.py Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.

  5. Addressing fluorogenic real-time qPCR inhibition using the novel custom Excel file system 'FocusField2-6GallupqPCRSet-upTool-001' to attain consistently high fidelity qPCR reactions

    PubMed Central

    Ackermann, Mark R.

    2006-01-01

    The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically-sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time – calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699

  6. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the choice of normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotations of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  7. Reproducibility and validity of the Shanghai Men's Health Study physical activity questionnaire.

    PubMed

    Jurj, Adriana L; Wen, Wanqing; Xiang, Yong-Bing; Matthews, Charles E; Liu, Dake; Zheng, Wei; Shu, Xiao-Ou

    2007-05-15

    Reproducibility and validity of the physical activity questionnaire (PAQ) used in the Shanghai Men's Health Study (2003-2006, People's Republic of China) was evaluated in a random sample of 196 participants aged 40-74 years. Participants completed a PAQ at baseline and again 1 year later, 12 monthly 7-day physical activity recalls, and four quarterly 1-week physical activity logs. Reproducibility was evaluated by using the two PAQs and validity by comparing the PAQs with 1-year averages of the two criterion measures: 7-day physical activity recall and physical activity log. The PAQ had moderate to high reproducibility for measuring adult exercise participation (kappa = 0.60) and energy expenditure (r(s) = 0.68), nonexercise activities (correlation coefficients = 0.42-0.68), and total daily energy expenditure (r(s) = 0.68, kappa(quartiles) = 0.47). Correlations between the PAQ and criterion measures of adult exercise were 0.45 (7-day physical activity recall) and 0.51 (physical activity log) for the first PAQ and 0.62 (7-day physical activity recall) and 0.71 (physical activity log) for the second PAQ. Correlations between PAQ nonexercise activities and the physical activity log and 7-day physical activity recall were 0.31-0.86. Correlations for total energy expenditure were high (0.62-0.77). Results indicate that the Shanghai Men's Health Study PAQ has reasonable reproducibility and validity for classifying men by their level of exercise and nonexercise activities in this cohort.

  8. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Display occurs in a separate window for each port's input, with binary display being selectable. A number of other features, including binary log files, screen capture to files, and a full range of communication parameters, are provided.
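
    A minimal sketch of the same idea in Python using the third-party pyserial package (port names, baud rate, and log file names are placeholders; the original tool is an IBM PC program, not Python):

      import serial  # pip install pyserial

      ports = {"COM1": "port1.log", "COM2": "port2.log"}
      handles = {name: (serial.Serial(name, 9600, timeout=0.1), open(log, "ab"))
                 for name, log in ports.items()}
      try:
          while True:
              for name, (ser, log) in handles.items():
                  data = ser.read(256)      # returns b"" when the timeout expires
                  if data:
                      log.write(data)       # binary log file, as in the TSRV tool
      except KeyboardInterrupt:
          for ser, log in handles.values():
              ser.close()
              log.close()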

  9. Archive of digital and digitized analog boomer seismic reflection data collected during USGS cruise 96CCT02 in Copano, Corpus Christi, and Nueces Bays and Corpus Christi Bayou, Texas, July 1996

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.

    2007-01-01

    In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  10. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false Active charters file... Regulations System (Continued) GENERAL SERVICES ADMINISTRATION Regional Offices-General Services... charters file. The GSA Committee Management Officer retains each original signed charter in a file of...

  11. Collecting conditions usage metadata to optimize current and future ATLAS software and processing

    NASA Astrophysics Data System (ADS)

    Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration

    2017-10-01

    Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing its access requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
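
    A hedged sketch of the log-mining step: scan job log files for conditions-folder accesses and tally them for loading into a central repository (the log phrasing matched by the regular expression and the directory layout are assumptions, not the ATLAS format).

      import re
      from collections import Counter
      from pathlib import Path

      FOLDER_RE = re.compile(r"reading conditions folder (\S+)")  # assumed phrasing

      usage = Counter()
      for log_path in Path("logs").glob("*.log"):
          with open(log_path, errors="replace") as f:
              for line in f:
                  m = FOLDER_RE.search(line)
                  if m:
                      usage[m.group(1)] += 1

      for folder, count in usage.most_common(10):
          print(f"{count:8d}  {folder}")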

  12. Structure of the top of the Karnak Limestone Member (Ste. Genevieve) in Illinois

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bristol, H.M.; Howard, R.H.

    1976-01-01

    To facilitate petroleum exploration in Illinois, the Illinois State Geological Survey presents a structure map (for most of southern Illinois) of the Karnak Limestone Member--a relatively pure persistent limestone unit (generally 10 to 35 ft thick) in the Ste. Genevieve Limestone of Genevievian age. All available electric logs and selected studies of well cuttings were used in constructing the map. Oil and gas development maps containing Karnak-structure contours are on open file at the ISGS.

  13. An analysis of technology usage for streaming digital video in support of a preclinical curriculum.

    PubMed

    Dev, P; Rindfleisch, T C; Kush, S J; Stringer, J R

    2000-01-01

    Usage of streaming digital video of lectures in preclinical courses was measured by analysis of the data in the log file maintained on the web server. We observed that students use the video when it is available. They do not use it to replace classroom attendance but rather for review before examinations or when a class has been missed. Usage of video has not increased significantly for any course within the 18 month duration of this project.
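
    A minimal sketch of this kind of log analysis: count video accesses per day from a Common Log Format web server log (the log path and the ".mp4" URL filter are illustrative assumptions).

      import re
      from collections import Counter

      LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "GET (\S+)')

      per_day = Counter()
      with open("access.log", errors="replace") as f:
          for line in f:
              m = LINE_RE.search(line)
              if m and m.group(2).endswith(".mp4"):
                  per_day[m.group(1)] += 1      # key is dd/Mon/yyyy

      for day, hits in sorted(per_day.items()):
          print(day, hits)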

  14. Development and validation of a new self-report instrument for measuring sedentary behaviors and light-intensity physical activity in adults.

    PubMed

    Barwais, Faisal Awad; Cuddihy, Thomas F; Washington, Tracy; Tomson, L Michaud; Brymer, Eric

    2014-08-01

    Low levels of physical activity and high levels of sedentary behavior (SB) are major public health concerns. This study was designed to develop and validate the 7-day Sedentary (S) and Light Intensity Physical Activity (LIPA) Log (7-day SLIPA Log), a self-report measure of specific daily behaviors. To develop the log, 62 specific SB and LIPA behaviors were chosen from the Compendium of Physical Activities. Face-to-face interviews were conducted with 32 sedentary volunteers to identify domains and behaviors of SB and LIPA. To validate the log, a further 22 sedentary adults were recruited to wear the GT3x for 7 consecutive days and nights. Pearson correlations (r) between the 7-day SLIPA Log and GT3x were significant for SB (r = .86, P < .001) and for LIPA (r = .80, P < .001). Lying and sitting postures were positively correlated with GT3x output (r = .60 and r = .64, P < .001, respectively). No significant correlation was found for standing posture (r = .14, P = .53). The kappa values between the 7-day SLIPA Log and GT3x variables ranged from 0.09 to 0.61, indicating poor to good agreement. The 7-day SLIPA Log is a valid self-report measure of SB and LIPA in specific behavioral domains.

  15. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  16. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  17. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  18. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  19. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  20. VizieR Online Data Catalog: Solar analogs and twins rotation by Kepler (do Nascimento+, 2014)

    NASA Astrophysics Data System (ADS)

    Do Nascimento, J.-D. Jr; Garcia, R. A.; Mathur, S.; Anthony, F.; Barnes, S. A.; Meibom, S.; da Costa, J. S.; Castro, M.; Salabert, D.; Ceillier, T.

    2017-03-01

    Our sample of 75 stars consists of a seismic sample of 38 from Chaplin et al. (2014, J/ApJS/210/1), 35 additional stars selected from the Kepler Input Catalog (KIC), and 16 Cyg A and B. We selected 38 well-studied stars from the asteroseismic data with fundamental properties, including ages, estimated by Chaplin et al. (2014, J/ApJS/210/1), and with Teff and log g as close as possible to the Sun's value (5200 K < Teff < 6060 K and 3.63 < log g < 4.40). This seismic sample allows a direct comparison between gyro- and seismic-ages for a subset of eight stars. These seismic samples were observed in short cadence for one month each in survey mode. Stellar properties for these stars have been estimated using two global asteroseismic parameters and complementary photometric and spectroscopic observations as described by Chaplin et al. (2014, J/ApJS/210/1). The median final quoted uncertainties for the full Chaplin et al. (2014, J/ApJS/210/1) sample were approximately 0.020 dex in log g and 150 K in Teff. (1 data file).

  1. 78 FR 18330 - Combined Notice of Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Imbalances and Cash-out Activity. Filed Date: 3/18/13. Accession Number: 20130318-5086. Comments Due: 5 p.m...: Cameron Interstate Pipeline LLC Annual Report of Transportation Imbalances and Cash-out Activity. Filed...

  2. Archive of post-Hurricane Isabel coastal oblique aerial photographs collected during U.S. Geological Survey Field Activity 03CCH01 from Ocean City, Maryland, to Fort Caswell, North Carolina and Inland from Waynesboro to Redwood, Virginia, September 21 - 23, 2003

    USGS Publications Warehouse

    Subino, Janice A.; Morgan, Karen L.M.; Krohn, M. Dennis; Dadisman, Shawn V.

    2013-01-01

    On September 21 - 23, 2003, the United States Geological Survey (USGS) conducted an oblique aerial photographic survey along the Atlantic coast from Ocean City, Md., to Fort Caswell, N.C., and inland oblique aerial photographic survey from Waynesboro to Redwood, Va., aboard a Navajo Piper twin-engine airplane. The coastal survey was conducted at an altitude of 500 feet (ft) and approximately 1,000 ft offshore. For the inland photos, the aircraft tried to stay approximately 500 ft above the terrain. These coastal photos were used to document coastal changes like beach erosion and overwash caused by Hurricane Isabel, while the inland photos looked for potential landslides caused by heavy rains. The photos may also be used as baseline data for future coastal change analysis. The USGS and the National Aeronautics and Space Administration (NASA) surveyed the impact zone of Hurricane Isabel to better understand the changes in vulnerability of the Nation’s coasts to extreme storms (Morgan, 2009). This report serves as an archive of photographs collected during the September 21 - 23, 2003, post-Hurricane Isabel coastal and inland oblique aerial survey along with associated survey maps, KML files, navigation files, digital Field Activity Collection System (FACS) logs, and Federal Geographic Data Committee (FGDC) metadata. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03CCH01 tells us the data were collected in 2003 for the Coastal Change Hazards (CCH) study and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the ID number. The photographs provided here are Joint Photographic Experts Group (JPEG) scanned images of the analog 35 millimeter (mm) color positive slides. The photograph locations are estimates of the location of the plane (see the Navigation page). The metadata values for photo creation time, GPS latitude, GPS longitude, GPS position (latitude and longitude), keywords, credit, artist, caption, copyright, and contact were added to each photograph's EXIF header using EXIFtool (Subino and others, 2012). Photographs can be opened directly with any JPEG-compatible image viewer by clicking on a thumbnail on the contact sheet, or, when viewing the Google Earth KML file, by clicking on the marker and then clicking on either the thumbnail or the link below the thumbnail. Nathaniel Plant (USGS - St. Petersburg, Fla.), and Ann Marie Ascough (formerly contracted at the USGS - St. Petersburg, Fla.) helped with the creation of KML files. To view the photos and survey maps, proceed to the Photos and Maps page.
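
    A hedged sketch of the metadata-stamping step, shelling out to EXIFtool from Python; the file name and tag values are placeholders, and the tag names shown are common EXIFtool tags rather than a statement of exactly which tags the report used.

      import subprocess

      def tag_photo(path, artist, credit, caption):
          # exiftool must be installed and on the PATH
          subprocess.run(
              ["exiftool",
               f"-Artist={artist}",
               f"-Credit={credit}",
               f"-Caption-Abstract={caption}",
               "-overwrite_original",
               path],
              check=True,
          )

      tag_photo("example_photo.jpg", "USGS SPCMSC", "U.S. Geological Survey",
                "Post-Hurricane Isabel oblique aerial photograph")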

  3. Web-based pathology practice examination usage.

    PubMed

    Klatt, Edward C

    2014-01-01

    General and subject-specific practice examinations for students in health sciences studying pathology were placed onto a free public internet web site entitled web path and were accessed four clicks from the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each with accompanying images could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% score by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with completion rate 73% and average score 74%. Scoring at 100% was 20% overall, ≥90% by 37%, and ≥50% score by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. Scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.
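
    A minimal sketch of compiling completion rates and average scores from such a results log (the whitespace-separated column layout is an assumption; the actual Perl/CGI log format is not given in the abstract).

      completed, scores = 0, []
      with open("exam_results.log") as f:
          for line in f:
              fields = line.split()   # assumed: date exam_id user_field score answered total
              if len(fields) < 6:
                  continue
              score, answered, total = float(fields[3]), int(fields[4]), int(fields[5])
              if answered == total:
                  completed += 1
                  scores.append(score)

      if scores:
          print(f"completions: {completed}, average score: {sum(scores)/len(scores):.1f}%")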

  4. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Li, X; Li, H

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and, for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustable. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart review.
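
    A simplified sketch of the fluence-comparison idea: build a composite fluence map from per-segment apertures weighted by beam-on time, then report error statistics on the planned-versus-delivered difference map. The array shapes and the binary-aperture representation are simplifying assumptions, not the commercial system's data format.

      import numpy as np

      def composite_fluence(apertures, beam_on_times):
          """apertures: (n_segments, ny, nx) aperture masks; beam_on_times: (n_segments,) seconds."""
          return np.tensordot(beam_on_times, apertures, axes=1)

      def fluence_error_stats(planned, delivered):
          diff = delivered - planned
          denom = planned.max() if planned.max() > 0 else 1.0
          pct = 100.0 * diff / denom          # percent of the maximum planned fluence
          return {"max_abs_pct": float(np.abs(pct).max()),
                  "rms_pct": float(np.sqrt(np.mean(pct ** 2)))}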

  5. Nighttime activity of moving objects, their mapping and statistic making, on the example of applying thermal imaging and advanced image processing to the research of nocturnal mammals

    NASA Astrophysics Data System (ADS)

    Pregowski, Piotr; Owadowska, Edyta; Pietrzak, Jan; Zwolenik, Slawomir

    2005-09-01

    The paper presents a method of acquiring a new form of statistical information about changes in a scene overseen by a thermal imaging camera in a static configuration. This type of imager reaches uniquely high efficiency during nighttime surveillance and targeting. The technical issue we have solved resulted from the problem: how to verify the hypothesis that small nocturnal rodents, like bank voles, use common paths inside their range and that they form a common, rather stable system? Such research has been especially difficult because the mammals in question are secretive, move at various speeds, and, due to their low contrast against their natural surroundings - such as leaves or grass - are nearly impossible to observe by other means from a distance of a few meters. The main advantages of the elaborated method proved to be both adequately filtered long thermal movies for manual analysis and the automatic creation of synthetic images which present maps of otherwise invisible paths and the activity of their usage. An additional file with logs describing objects and their dislocations as ".txt" files allows various, more detailed studies of animal behavior. The obtained results proved that this original method delivers a new, non-invasive, powerful, and dynamic approach to solving various ecological problems. The creation of networks of uncooled thermal imagers - of significantly increased availability - with data transmission to digital centers allows the investigation of moving, particularly heat-generating, objects in complete darkness much more widely and efficiently than is possible today. Thus, although our system was elaborated for ecological studies, a similar one can be considered as a tool for chosen tasks in the optical security area.

  6. Inactivation of viruses by pasteurization at 60 °C for 10 h with and without 40% glucose as stabilizer during a new manufacturing process of α2-Macroglobulin from Cohn Fraction IV.

    PubMed

    Huangfu, Chaoji; Ma, Yuyuan; Jia, Junting; Lv, Maomin; Zhu, Fengxuan; Ma, Xiaowei; Zhao, Xiong; Zhang, Jingang

    2017-03-01

    Pasteurization is regularly used to inactivate viruses for the safety of plasma derivatives. The influence of pasteurization at 60 °C for 10 h on α2-Macroglobulin activity and virus inactivation was studied. With 40% sugar as stabilizer, more than 70% of α2-Macroglobulin activity was retained after pasteurization, compared with 20% in the control. Glucose provided better activity protection than sucrose and maltose. By pasteurization without stabilizer, the virus titers of pseudorabies virus, Sindbis virus, porcine parvovirus, and encephalomyocarditis virus were reduced by more than 5.88 log10, 7.50 log10, 4.88 log10, and 5.63 log10, respectively, within 2 h. By pasteurization with 40% glucose, vesicular stomatitis virus was inactivated by more than 5.88 log10 within 1 h. Only a 2.71 log10 reduction was achieved for encephalomyocarditis virus after 10 h. 40% glucose protected α2-M activity and viruses simultaneously from pasteurization. Other viral inactivation methods need to be incorporated to ensure the viral safety of this manufacturing process of α2-Macroglobulin. Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  7. Does Internet voting make elections less social? Group voting patterns in Estonian e-voting log files (2013–2015)

    PubMed Central

    2017-01-01

    Remote Internet voting places the control and secrecy of the immediate voting environment on the shoulder of the individual voter but it also turns voting into yet another on-line activity thus endangering the well-known social nature of voting and possibly reducing the crucial sense of civic duty that is important for a healthy democracy. There is however a complete lack of evidence to what degree this actually materializes once electronic voting is introduced. This paper uses individual level log data on Internet voting in Estonian elections between 2013–2015 to inspect if Internet voting retains the social nature of the voting act. We do so by examining if Internet voting in groups takes place and what implications it has for voting speed. We find strong evidence of e-voting in pairs. Same aged male-female pairs seem to be voting in close proximity to each other, consistent with spouses or partners voting together. Also, female-female and female-male pairs with large age differences seem to be voting together, consistent with a parent voting with an adult aged offspring. With regards to voting speed we see the second vote in a vote pair being considerably faster than the first vote, again indicating a shared voting act. We end with a discussion of how the onset of electronic voting does not make elections less social, but does make vote secrecy more a choice rather than a requirement. PMID:28542348
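
    A hedged sketch of the pair-detection idea: treat consecutive votes in the time-ordered log that fall within a short window as candidate "voting together" events (the field names and the 10-minute window are assumptions, not the study's actual criteria).

      from datetime import timedelta

      def candidate_pairs(votes, window_minutes=10):
          """votes: list of dicts with 'timestamp' (datetime), 'age', 'sex'."""
          votes = sorted(votes, key=lambda v: v["timestamp"])
          pairs = []
          for a, b in zip(votes, votes[1:]):
              if b["timestamp"] - a["timestamp"] <= timedelta(minutes=window_minutes):
                  pairs.append((a, b))
          return pairs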

  8. Streamlining CASTOR to manage the LHC data torrent

    NASA Astrophysics Data System (ADS)

    Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.

    2014-06-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.

  9. Selective logging and its relation to deforestation

    Treesearch

    Gregory P. Asner; Michael Keller; Marco Lentini; Frank Merry; Souza Jr. Carlos

    2009-01-01

    Selective logging is a major contributor to the social, economic, and ecological dynamics of Brazilian Amazonia. Logging activities have expanded from low-volume floodplain harvests in past centuries to high-volume operations today that take about 25 million m3 of wood from the forest each year. The most common high-impact conventional and often illegal logging...

  10. Remote Environmental Monitoring and Diagnostics in the Perishables Supply Chain - Phase 1

    DTIC Science & Technology

    2011-12-12

    The table below displays the raw data from the tests. Each cell contains a number between 0 and 5 corresponding to the number of successful... along with the raw temperature data to the email addresses specified in the configuration file. As mentioned previously, for the CAEN... the Intelleflex system. The user also has the option to save the data log, which contains the raw temperature data, to a file on the Windows...

  11. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various terms of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in exploratory and experimental settings was registered as the shift towards more complex stages of problem representations in genetics. The effect of different types of reasoning could be observed as the divergent development of problem representations within hierarchical categories.

  12. Mechanical reduction of the intracanal Enterococcus faecalis population by Hyflex CM, K3XF, ProTaper Next, and two manual instrument systems: an in vitro comparative study.

    PubMed

    Tewari, Rajendra K; Ali, Sajid; Mishra, Surendra K; Kumar, Ashok; Andrabi, Syed Mukhtar-Un-Nisar; Zoya, Asma; Alam, Sharique

    2016-05-01

    In the present study, the effectiveness of three rotary and two manual nickel titanium instrument systems on mechanical reduction of the intracanal Enterococcus faecalis population was evaluated. Mandibular premolars with straight roots were selected. Teeth were decoronated, instrumented up to a size 20 K-file, and irrigated with physiological saline. After sterilization by ethylene oxide gas, root canals were inoculated with Enterococcus faecalis. The specimens were randomly divided into five groups for canal instrumentation: manual Nitiflex and Hero Shaper nickel titanium files, and rotary Hyflex CM, ProTaper Next, and K3XF nickel titanium files. Intracanal bacterial sampling was done before and after instrumentation. After serial dilution, samples were plated onto Mitis Salivarius agar. The c.f.u. grown were counted, and the log10 transformation was calculated. All instrumentation systems significantly reduced the intracanal bacterial population after root canal preparation. ProTaper Next was found to be significantly more effective than Hyflex CM and manual Nitiflex and Hero Shaper. However, ProTaper Next showed no significant difference from K3XF. Canal instrumentation by all the file systems significantly reduced the intracanal Enterococcus faecalis counts. ProTaper Next was found to be more effective in reducing the number of bacteria than the other rotary or hand instruments. © 2014 Wiley Publishing Asia Pty Ltd.

  13. Control of Cryptosporidium with wastewater treatment to prevent its proliferation in the water cycle.

    PubMed

    Suwa, M; Suzuki, Y

    2003-01-01

    The outbreak of Cryptosporidiosis in 1996 in Japan is thought to have been enlarged by the proliferation of Cryptosporidium in the water cycle from wastewater to drinking water through the river system. Given this experience, wastewater systems must have functions to remove Cryptosporidium oocysts effectively. The efficiencies of wastewater treatment processes in removing oocysts were investigated using pilot plants receiving municipal wastewater. An activated sludge process and a following sand filter showed removal efficiencies of 2 log and 0.5 log, respectively. Poly-aluminium chloride dosing improved the efficiencies by 3 log for the activated sludge process and by 2 log for the sand filter. Chemical precipitation of raw wastewater with poly-aluminium chloride could achieve 1 to 3 log removal depending on the coagulant concentration.
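
    The log-removal credits quoted above combine additively; a short worked example in Python:

      def overall_removal(log_credits):
          total_log = sum(log_credits)
          return total_log, 100.0 * (1.0 - 10.0 ** (-total_log))

      # Activated sludge (2 log) followed by a sand filter (0.5 log): 2.5 log, about 99.7% removal.
      print(overall_removal([2.0, 0.5]))
      # With poly-aluminium chloride dosing the credits rise to roughly 5 log and 2.5 log.
      print(overall_removal([2.0 + 3.0, 0.5 + 2.0]))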

  14. Could LogP be a principal determinant of biological activity in 18-crown-6 ethers? Synthesis of biologically active adamantane-substituted diaza-crowns.

    PubMed

    Supek, Fran; Ramljak, Tatjana Šumanovac; Marjanović, Marko; Buljubašić, Maja; Kragol, Goran; Ilić, Nataša; Smuc, Tomislav; Zahradka, Davor; Mlinarić-Majerski, Kata; Kralj, Marijeta

    2011-08-01

    18-crown-6 ethers are known to exert their biological activity by transporting K(+) ions across cell membranes. Using non-linear Support Vector Machines regression, we searched for structural features that influence antiproliferative activity in a diverse set of 19 known oxa-, monoaza- and diaza-18-crown-6 ethers. Here, we show that the logP of the molecule is the most important molecular descriptor, among ∼1300 tested descriptors, in determining biological potency (R(2)(cv) = 0.704). The optimal logP was at 5.5 (Ghose-Crippen ALOGP estimate) while both higher and lower values were detrimental to biological potency. After controlling for logP, we found that the antiproliferative activity of the molecule was generally not affected by side chain length, molecular symmetry, or presence of side chain amide links. To validate this QSAR model, we synthesized six novel, highly lipophilic diaza-18-crown-6 derivatives with adamantane moieties attached to the side arms. These compounds have near-optimal logP values and consequently exhibit strong growth inhibition in various human cancer cell lines and a bacterial system. The bioactivities of different diaza-18-crown-6 analogs in Bacillus subtilis and cancer cells were correlated, suggesting conserved molecular features may be mediating the cytotoxic response. We conclude that relying primarily on the logP is a sensible strategy in preparing future 18-crown-6 analogs with optimized biological activity. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
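
    A schematic sketch of the modelling approach described above, non-linear support vector regression of activity on molecular descriptors with cross-validation; the data here are random placeholders, not the published crown-ether set, and the hyperparameters are arbitrary.

      import numpy as np
      from sklearn.svm import SVR
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(19, 20))   # 19 compounds x 20 descriptors (logP would be one column)
      y = rng.normal(size=19)         # placeholder activity values

      model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
      r2_cv = cross_val_score(model, X, y, cv=5, scoring="r2")
      print("cross-validated R^2:", r2_cv.mean())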

  15. Log transfer and storage facilities in Southeast Alaska: a review.

    Treesearch

    Tamra L. Faris; Kenneth D. Vaughan

    1985-01-01

    The volume of timber harvested in southeast Alaska between 1909 and 1983 was 14,689 million board feet; nearly all was transported on water to various destinations for processing. In 1971 there were 69 active log transfer and storage facilities and 38 raft collecting and storage facilities in southeast Alaska. In 1983 there were 90 log transfer sites, 49 log storage...

  16. SwampLog II: A Structured Journal for Personal and Professional Inquiry within a Collaborative Environment.

    ERIC Educational Resources Information Center

    Nicassio, Frank J.

    SwampLog is a type of journal keeping that records the facts of daily activities as experienced and perceived by practitioners. The label, "SwampLog," was inspired by Donald Schon's metaphor used to distinguish the "swamplands of practice" from the "high, hard ground of research." Keeping a SwampLog consists of recording four general types of…

  17. Individual and Group-Based Engagement in an Online Physical Activity Monitoring Program in Georgia.

    PubMed

    Smith, Matthew Lee; Durrett, Nicholas K; Bowie, Maria; Berg, Alison; McCullick, Bryan A; LoPilato, Alexander C; Murray, Deborah

    2018-06-07

    Given the rising prevalence of obesity in the United States, innovative methods are needed to increase physical activity (PA) in community settings. Evidence suggests that individuals are more likely to engage in PA if they are given a choice of activities and have support from others (for encouragement, motivation, and accountability). The objective of this study was to describe the use of the online Walk Georgia PA tracking platform according to whether the user was an individual user or group user. Walk Georgia is a free, interactive online tracking platform that enables users to log PA by duration, activity, and perceived difficulty, and then converts these data into points based on metabolic equivalents. Users join individually or in groups and are encouraged to set weekly PA goals. Data were examined for 6,639 users (65.8% were group users) over 28 months. We used independent sample t tests and Mann-Whitney U tests to compare means between individual and group users. Two linear regression models were fitted to identify factors associated with activity logging. Users logged 218,766 activities (15,119,249 minutes of PA spanning 592,714 miles [41,858,446 points]). On average, group users had created accounts more recently than individual users (P < .001); however, group users logged more activities (P < .001). On average, group users logged more minutes of PA (P < .001) and earned more points (P < .001). Being in a group was associated with a larger proportion of weeks in which 150 minutes or more of weekly PA was logged (B = 20.47, P < .001). Use of Walk Georgia was significantly higher among group users than among individual users. To expand use and dissemination of online tracking of PA, programs should target naturally occurring groups (eg, workplaces, schools, faith-based groups).
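
    A minimal sketch of MET-based point scoring of the kind described (the MET values and the points formula are illustrative assumptions, not Walk Georgia's actual algorithm):

      MET = {"walking_moderate": 3.5, "cycling_leisure": 4.0, "running": 8.0}

      def points(activity, minutes, difficulty_factor=1.0):
          # points proportional to metabolic equivalents times duration
          return MET[activity] * minutes * difficulty_factor

      weekly_minutes = 3 * 50                    # three 50-minute walks
      print(points("walking_moderate", weekly_minutes))
      print("meets 150-minute weekly guideline:", weekly_minutes >= 150)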

  18. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much, much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: an industry standard for log messages (the Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages, with a format based on work at NASA GSFC); open system architectures (the DoD, NASA, and others are moving towards common open system architectures for mission ground data systems, based on work at NASA GSFC, with the full support of the commercial product industry and major integration contractors); and text analytics (a specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources). This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format and the provision of display and analytics tools for in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
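
    A hedged sketch of a simple query layer over time-tagged event messages of the kind described above; the message format matched here is a generic assumption, not the OMG SDTF standard itself.

      import re
      from datetime import datetime

      MSG_RE = re.compile(r"^(\S+ \S+)\s+(\w+)\s+(\S+)\s+(.*)$")   # time level source text

      def load_messages(path):
          out = []
          with open(path, errors="replace") as f:
              for line in f:
                  m = MSG_RE.match(line)
                  if m:
                      ts = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
                      out.append({"time": ts, "level": m.group(2),
                                  "source": m.group(3), "text": m.group(4)})
          return out

      def query(messages, level=None, contains=None):
          for msg in messages:
              if level and msg["level"] != level:
                  continue
              if contains and contains.lower() not in msg["text"].lower():
                  continue
              yield msg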

  19. Estradiol and inflammatory markers in older men.

    PubMed

    Maggio, Marcello; Ceda, Gian Paolo; Lauretani, Fulvio; Bandinelli, Stefania; Metter, E Jeffrey; Artoni, Andrea; Gatti, Elisa; Ruggiero, Carmelinda; Guralnik, Jack M; Valenti, Giorgio; Ling, Shari M; Basaria, Shehzad; Ferrucci, Luigi

    2009-02-01

    Aging is characterized by a mild proinflammatory state. In older men, low testosterone levels have been associated with increasing levels of proinflammatory cytokines. It is still unclear whether estradiol (E2), which generally has biological activities complementary to testosterone, affects inflammation. We analyzed data obtained from 399 men aged 65-95 yr enrolled in the Invecchiare in Chianti study with complete data on body mass index (BMI), serum E2, testosterone, IL-6, soluble IL-6 receptor, TNF-alpha, IL-1 receptor antagonist, and C-reactive protein. The relationship between E2 and inflammatory markers was examined using multivariate linear models adjusted for age, BMI, smoking, physical activity, chronic disease, and total testosterone. In age-adjusted analysis, log (E2) was positively associated with log (IL-6) (r = 0.19; P = 0.047), and the relationship was statistically significant (P = 0.032) after adjustments for age, BMI, smoking, physical activity, chronic disease, and serum testosterone levels. Log (E2) was not significantly associated with log (C-reactive protein), log (soluble IL-6 receptor), or log (TNF-alpha) in both age-adjusted and fully adjusted analyses. In older men, E2 is weakly positively associated with IL-6, independent of testosterone and other confounders including BMI.

  20. Assessment of feasibility of running RSNA's MIRC on a Raspberry Pi: a cost-effective solution for teaching files in radiology.

    PubMed

    Pereira, Andre; Atri, Mostafa; Rogalla, Patrik; Huynh, Thien; O'Malley, Martin E

    2015-11-01

    The value of a teaching case repository in radiology training programs is immense. The allocation of resources for putting one together is a complex issue, given the factors that have to be coordinated: hardware, software, infrastructure, administration, and ethics. Costs may be significant, and cost-effective solutions are desirable. We chose Medical Imaging Resource Center (MIRC) to build our teaching file. It is offered by RSNA for free. For the hardware, we chose the Raspberry Pi, developed by the Raspberry Pi Foundation: a small control board developed as a low-cost computer for schools, also used in alternative projects such as robotics and environmental data collection. Its performance and reliability as a file server were unknown to us. For the operating system, we chose Raspbian, a variant of Debian Linux, along with Apache (web server), MySql (database server) and PHP, which enhance the functionality of the server. A USB hub and an external hard drive completed the setup. Installation of software was smooth. The Raspberry Pi was able to handle very well the task of hosting the teaching file repository for our division. Uptime was logged at 100%, and loading times were similar to other MIRC sites available online. We set up two servers (one for backup), each costing just below $200.00 including external storage and USB hub. It is feasible to run RSNA's MIRC off a low-cost control board (Raspberry Pi). Performance and reliability are comparable to full-size servers for the intended purpose of hosting a teaching file within an intranet environment.

  1. A compendium of P- and S-wave velocities from surface-to-borehole logging; summary and reanalysis of previously published data and analysis of unpublished data

    USGS Publications Warehouse

    Boore, David M.

    2003-01-01

    For over 28 years, the U.S. Geological Survey (USGS) has been acquiring seismic velocity and geologic data at a number of locations in California, many of which were chosen because strong ground motions from earthquakes were recorded at the sites. The method for all measurements involves picking first arrivals of P- and S-waves from a surface source recorded at various depths in a borehole (as opposed to noninvasive methods, such as the SASW method [e.g., Brown et al., 2002]). The results from most of the sites are contained in a series of U.S. Geological Survey Open-File Reports (see References). Until now, none of the results have been available as computer files, and before 1992 the interpretation of the arrival times was in terms of piecemeal interval velocities, with no attempt to derive a layered model that would fit the travel times in an overall sense (the one exception is Porcella, 1984). In this report I reanalyze all of the arrival times in terms of layered models for P- and for S-wave velocities at each site, and I provide the results as computer files. In addition to the measurements reported in the open-file reports, I also include some borehole results from other reports, as well as some results never before published. I include data for 277 boreholes (at the time of this writing; more will be added to the web site as they are obtained), all in California (I have data from boreholes in Washington and Utah, but these will be published separately). I am also in the process of interpreting travel time data obtained using a seismic cone penetrometer at hundreds of sites; these data can be interpreted in the same way of those obtained from surface-to-borehole logging. When available, the data will be added to the web site (see below for information on obtaining data from the World Wide Web (WWW)). In addition to the basic borehole data and results, I provide information concerning strong-motion stations that I judge to be close enough to the boreholes that the borehole velocity models can be used as the velocity models beneath the stations.
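
    As a hedged illustration of the forward problem behind such layered models, the sketch below computes the simplest (vertical-incidence) travel time to a receiver at depth z in a stack of constant-velocity layers; the thicknesses and velocities are made-up numbers, and real surface-to-borehole analyses must also account for slant source-receiver paths.

      def travel_time(z, thicknesses, velocities):
          """Vertical travel time (s) to depth z (m) through constant-velocity layers."""
          t, top = 0.0, 0.0
          for h, v in zip(thicknesses, velocities):
              if z <= top + h:
                  return t + (z - top) / v
              t += h / v
              top += h
          return t + (z - top) / velocities[-1]   # below the deepest interface

      # e.g. S-wave-like velocities (m/s) for a three-layer model
      print(travel_time(35.0, [5.0, 20.0, 50.0], [200.0, 450.0, 800.0]))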

  2. Water Log.

    ERIC Educational Resources Information Center

    Science Activities, 1995

    1995-01-01

    Presents a Project WET water education activity. Students use a Water Log (journal or portfolio) to write or illustrate their observations, feelings, and actions related to water. The log serves as an assessment tool to monitor changes over time in knowledge of and attitudes toward the water. (LZ)

  3. Developments in Quantitative Structure-Activity Relationships (QSAR). A Review

    DTIC Science & Technology

    1976-07-01

    [Extraction residue from the report's tables and index; the recoverable content indicates QSAR correlations of inhibitory activity for nitrostyrene analogs against fungal growth (Botrytis cinerea, Aspergillus niger), phenyl methacrylates and RR'NCSS Na+ compounds, binding to bovine hemoglobin and bovine serum albumin, and correlations of log(1/activity) versus log P and log kw.]

  4. Active Brownian particles escaping a channel in single file.

    PubMed

    Locatelli, Emanuele; Baldovin, Fulvio; Orlandini, Enzo; Pierno, Matteo

    2015-02-01

    Active particles may happen to be confined in channels so narrow that they cannot overtake each other (single-file conditions). This interesting situation reveals nontrivial physical features as a consequence of the strong interparticle correlations developed in collective rearrangements. We consider a minimal two-dimensional model for active Brownian particles with the aim of studying the modifications introduced by activity with respect to the classical (passive) single-file picture. Depending on whether their motion is dominated by translational or rotational diffusion, we find that active Brownian particles in single file may arrange into clusters that are continuously merging and splitting (active clusters) or merely reproduce passive-motion paradigms, respectively. We show that activity conveys to self-propelled particles a strategic advantage for trespassing narrow channels against external biases (e.g., the gravitational field).

  5. Active Brownian particles escaping a channel in single file

    NASA Astrophysics Data System (ADS)

    Locatelli, Emanuele; Baldovin, Fulvio; Orlandini, Enzo; Pierno, Matteo

    2015-02-01

    Active particles may happen to be confined in channels so narrow that they cannot overtake each other (single-file conditions). This interesting situation reveals nontrivial physical features as a consequence of the strong interparticle correlations developed in collective rearrangements. We consider a minimal two-dimensional model for active Brownian particles with the aim of studying the modifications introduced by activity with respect to the classical (passive) single-file picture. Depending on whether their motion is dominated by translational or rotational diffusion, we find that active Brownian particles in single file may arrange into clusters that are continuously merging and splitting (active clusters) or merely reproduce passive-motion paradigms, respectively. We show that activity conveys to self-propelled particles a strategic advantage for trespassing narrow channels against external biases (e.g., the gravitational field).

  6. Silanols, a New Class of Antimicrobial Agent

    DTIC Science & Technology

    2006-04-01

    carbinols against the four bacteria was log(1/MLC) = 0.670 log P + 0.0035 Δν − 1.836, n = 282, r = 0.96, s = 0.22. This equation and a significantly... activity relationship of antimicrobial agents by means of equations [8] based on a method proposed by Hansch and Fujita in 1964 [1]. This multiple... correlation equations between their antimicrobial activities and structural properties, log P and H-bond acidity, were created by a multiple regression...

  7. VizieR Online Data Catalog: Nearby B-type stars abundances (Morel+, 2008)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Butler, K.

    2008-06-01

    This table gives the adopted log gf values, EW measurements (in mA) and line-by-line abundances (on the scale in which log[epsilon(H)]=12). A blank indicates that the EW was not reliably measurable, the line was considered blended for the relevant temperature range or yielded a discrepant abundance. The accuracy of the EW measurements is discussed in Sect.3 of the paper. The wing of HeI 4387.9 was taken as pseudo continuum in the case of NeII 4391.99. (2 data files).

  8. VizieR Online Data Catalog: Hubble Tarantula Treasury Project (HTTP). III. (Sabbi+, 2016)

    NASA Astrophysics Data System (ADS)

    Sabbi, E.; Lennon, D. J.; Anderson, J.; Cignoni, M.; van der Marel, R. P.; Zaritsky, D.; de Marchi, G.; Panagia, N.; Gouliermis, D. A.; Grebel, E. K.; Gallagher, J. S., III; Smith, L. J.; Sana, H.; Aloisi, A.; Tosi, M.; Evans, C. J.; Arab, H.; Boyer, M.; de Mink, S. E.; Gordon, K.; Koekemoer, A. M.; Larsen, S. S.; Ryon, J. E.; Zeidler, P.

    2016-02-01

    Hubble Tarantula Treasury Project (HTTP; HST 12939, PI Elena Sabbi + HST 12499, PI Danny Lennon) was awarded 60 orbits of HST time in cycle 20 to survey the entire Tarantula Nebula (30 Doradus), using both the UVIS and the IR channels of the Wide Field Camera 3 (WFC3), and, in parallel, the Wide Field Channel (WFC) of the Advanced Camera for Surveys (ACS). See log of the observations (from 2011 Oct 03 to 2013 Sep 17) in table 1. (2 data files).

  9. N2C2M2 Experimentation and Validation: Understanding Its C2 Approaches and Implications

    DTIC Science & Technology

    2010-06-01

    [Extraction residue from the report's tables; the recoverable content indicates C2 approach levels (Conflicted, De-Conflicted, Coordinated, Collaborative) and interaction (shares and posts) measures derived from log files, tabulated by factoid set and trial with totals and per-subject values.]

  10. Long-Term file activity patterns in a UNIX workstation environment

    NASA Technical Reports Server (NTRS)

    Gibson, Timothy J.; Miller, Ethan L.

    1998-01-01

    As mass storage technology becomes more affordable for sites smaller than supercomputer centers, understanding their file access patterns becomes crucial for developing systems to store rarely used data on tertiary storage devices such as tapes and optical disks. This paper presents a new way to collect and analyze file system statistics for UNIX-based file systems. The collection system runs in user-space and requires no modification of the operating system kernel. The statistics package provides details about file system operations at the file level: creations, deletions, modifications, etc. The paper analyzes four months of file system activity on a university file system. The results confirm previously published results gathered from supercomputer file systems, but differ in several important areas. Files in this study were considerably smaller than those at supercomputer centers, and they were accessed less frequently. Additionally, the long-term creation rate on workstation file systems is sufficiently low so that all data more than a day old could be cheaply saved on a mass storage device, allowing the integration of time travel into every file system.
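
    A minimal user-space sketch of collecting file-level statistics of this kind, binning files by size and by days since last access (the root path and bin choices are arbitrary):

      import os, time
      from collections import Counter

      size_bins, age_bins = Counter(), Counter()
      now = time.time()

      for root, _dirs, files in os.walk("/home"):
          for name in files:
              try:
                  st = os.stat(os.path.join(root, name))
              except OSError:
                  continue
              size_bins[min(st.st_size // 4096, 64)] += 1        # 4 KiB buckets, capped
              age_bins[int((now - st.st_atime) // 86400)] += 1   # days since last access

      print("files not accessed in more than 30 days:",
            sum(v for days, v in age_bins.items() if days > 30))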

  11. Line-driven disc wind model for ultrafast outflows in active galactic nuclei - scaling with luminosity

    NASA Astrophysics Data System (ADS)

    Nomura, M.; Ohsuga, K.

    2017-03-01

    In order to reveal the origin of the ultrafast outflows (UFOs) that are frequently observed in active galactic nuclei (AGNs), we perform two-dimensional radiation hydrodynamics simulations of the line-driven disc winds, which are accelerated by the radiation force due to the spectral lines. The line-driven winds are successfully launched for the range of M_BH = 10^6-10^9 M⊙ and ε = 0.1-0.5, and the resulting mass outflow rate (Ṁ_w), momentum flux (ṗ_w), and kinetic luminosity (Ė_w) are in the region containing 90 per cent of the posterior probability distribution in the Ṁ_w-L_bol plane, ṗ_w-L_bol plane, and Ė_w-L_bol plane shown in Gofford et al., where M_BH is the black hole mass, ε is the Eddington ratio, and L_bol is the bolometric luminosity. The best-fitting relations in Gofford et al., d log Ṁ_w/d log L_bol ≈ 0.9, d log ṗ_w/d log L_bol ≈ 1.2, and d log Ė_w/d log L_bol ≈ 1.5, are roughly consistent with our results, d log Ṁ_w/d log L_bol ≈ 9/8, d log ṗ_w/d log L_bol ≈ 10/8, and d log Ė_w/d log L_bol ≈ 11/8. In addition, our model predicts that no UFO features are detected for the AGNs with ε ≲ 0.01, since the winds do not appear. Also, only AGNs with M_BH ≲ 10^8 M⊙ exhibit the UFOs when ε ∼ 0.025. These predictions nicely agree with the X-ray observations. These results support that the line-driven disc wind is the origin of the UFOs.

  12. Reliability and relative validity of three physical activity questionnaires in Taizhou population of China: the Taizhou Longitudinal Study.

    PubMed

    Hu, B; Lin, L F; Zhuang, M Q; Yuan, Z Y; Li, S Y; Yang, Y J; Lu, M; Yu, S Z; Jin, L; Ye, W M; Wang, X F

    2015-09-01

    To examine the test-retest reliabilities and relative validities of the Chinese version of short International Physical Activity Questionnaire (IPAQ-S-C), the Global Physical Activity Questionnaire (GPAQ-C), and the Total Energy Expenditure Questionnaire (TEEQ-C) in a population-based prospective study, the Taizhou Longitudinal Study (TZLS). A longitudinal comparative study. A total of 205 participants (male: 38.54%) aged 30-70 years completed three questionnaires twice (day one and day nine) and physical activity log (PA-log) over seven consecutive days. The test-retest reliabilities were evaluated using intra-class correlation coefficients (ICCs) and the relative validities were estimated by comparing the data from physical activity questionnaires (PAQs) and PA-log. Good reliabilities were observed between the repeated PAQs. The ICCs ranged from 0.51 to 0.80 for IPAQ-C, 0.67 to 0.85 for GPAQ-C, and 0.74 to 0.94 for TEEQ-C, respectively. Energy expenditure of most PA domains estimated by the three PAQs correlated moderately with the results recorded by PA-log except the walking domain of IPAQ-S-C. The partial correlation coefficients between the PAQs and PA-log ranged from 0.44 to 0.58 for IPAQ-S-C, 0.26 to 0.52 for GPAQ-C, and 0.41 to 0.72 for TEEQ-C, respectively. Bland-Altman plots showed acceptable agreement between the three PAQs and PA-log. The three PAQs, especially TEEQ-C, were relatively reliable and valid for assessment of physical activity and could be used in TZLS. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  13. Logging Activity in the Trinational Amazonian Region of Pando/Bolivia, Acre and Rondônia/Brazil, and Madre de Dios/Peru: Analysis of Existing Data

    NASA Astrophysics Data System (ADS)

    Mendoza, E.; Brilhante, S. H.; Brown, I.; Peralta, R.; Rivero, S.; Melendez, N.

    2002-12-01

    Logging activity in trinational southwestern Amazonia will grow in importance as a driver of regional land-use change as expanding road access facilitates both timber extraction and transport to international markets. Official data on current activity in this ~50 million ha region are limited and inconsistent, with differences of as much as twenty-fold between official estimates; nevertheless, they serve as guides for understanding the relative magnitude of logging activities. For 2000, an estimated 5 million m3 of timber were commercialized in Rondônia, 400,000 m3 in Acre, Brazil, and 200,000 m3 for the combined departments of Pando, Bolivia and Madre de Dios, Peru. About 70% of this timber originates from clear-cutting done for pasture and agricultural activities, nearly a third from unregulated selective logging, and only 2% from managed selective logging. Eight timber species are preferentially extracted. The total area for timber concessions in Acre, Pando and Madre de Dios extends to about 4 million ha, for a potential timber supply of 65 million m3. About 150,000 m3/yr of illegal timber is confiscated by federal and state agencies in Acre, Pando and Madre de Dios. Problems of enforcement in the region are due principally to a lack of trained personnel and limited cooperation among the agencies of the three countries. Proposed development plans indicate a 3- to >10-fold increase in logging activity in the Acre and Pando regions during the coming decade. More detailed studies are urgently needed to guide sustainable development of this resource in southwestern Amazonia.

  14. Michigan Saw Log Production and Sawmill Industry, 1978

    Treesearch

    James E. Blyth; Jack Zollner; W. Brad Smith

    1982-01-01

    Michigan's saw log production climbed to 563 million board feet in 1978 from 514 million board feet in 1977. Eight percent was shipped to out-of-state mills. Michigan's 341 active sawmills received 525 million board feet of logs; only 1 percent came from other States.

  15. 76 FR 42130 - Agency Information Collection Activities: BioWatch Filter Holder Log

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-18

    ... DEPARTMENT OF HOMELAND SECURITY Agency Information Collection Activities: BioWatch Filter Holder...) assigned responsibility for installing and removing filters from aerosol collection devices and transportation to local laboratories for sample analysis. A standard filter log form is completed for each sample...

  16. 76 FR 24504 - Agency Information Collection Activities: BioWatch Filter Holder Log

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-02

    ... DEPARTMENT OF HOMELAND SECURITY Agency Information Collection Activities: BioWatch Filter Holder...) assigned responsibility for installing and removing filters from aerosol collection devices and transportation to local laboratories for sample analysis. A standard filter log form is completed for each sample...

  17. 20 CFR 10.528 - What action will OWCP take if the employee fails to file a report of activity indicating an...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... fails to file a report of activity indicating an ability to work? 10.528 Section 10.528 Employees... employee fails to file a report of activity indicating an ability to work? OWCP periodically requires each... indicating an ability to work, which the employee has performed for the prior 15 months. If an employee who...

  18. 20 CFR 10.528 - What action will OWCP take if the employee fails to file a report of activity indicating an...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... fails to file a report of activity indicating an ability to work? 10.528 Section 10.528 Employees... employee fails to file a report of activity indicating an ability to work? OWCP periodically requires each... indicating an ability to work, which the employee has performed for the prior 15 months. If an employee who...

  19. 20 CFR 10.528 - What action will OWCP take if the employee fails to file a report of activity indicating an...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... fails to file a report of activity indicating an ability to work? 10.528 Section 10.528 Employees... employee fails to file a report of activity indicating an ability to work? OWCP periodically requires each... indicating an ability to work, which the employee has performed for the prior 15 months. If an employee who...

  20. 20 CFR 10.528 - What action will OWCP take if the employee fails to file a report of activity indicating an...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... fails to file a report of activity indicating an ability to work? 10.528 Section 10.528 Employees... employee fails to file a report of activity indicating an ability to work? OWCP periodically requires each... indicating an ability to work, which the employee has performed for the prior 15 months. If an employee who...

  1. 20 CFR 10.528 - What action will OWCP take if the employee fails to file a report of activity indicating an...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... fails to file a report of activity indicating an ability to work? 10.528 Section 10.528 Employees... employee fails to file a report of activity indicating an ability to work? OWCP periodically requires each... indicating an ability to work, which the employee has performed for the prior 15 months. If an employee who...

  2. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  3. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  4. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  5. 41 CFR 105-54.203-2 - Active charters file.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Active charters file... Administration 54-ADVISORY COMMITTEE MANAGEMENT 54.2-Establishment of Advisory Committees § 105-54.203-2 Active... active charters. ...

  6. Analyzing Information Seeking and Drug-Safety Alert Response by Health Care Professionals as New Methods for Surveillance

    PubMed Central

    Pernek, Igor; Stiglic, Gregor; Leskovec, Jure; Strasberg, Howard R; Shah, Nigam Haresh

    2015-01-01

    Background: Patterns in general consumer online search logs have been used to monitor health conditions and to predict health-related activities, but the multiple contexts within which consumers perform online searches make significant associations difficult to interpret. Physician information-seeking behavior has typically been analyzed through survey-based approaches and literature reviews. Activity logs from health care professionals using online medical information resources are thus a valuable yet relatively untapped resource for large-scale medical surveillance. Objective: To analyze health care professionals’ information-seeking behavior and assess the feasibility of measuring drug-safety alert response from the usage logs of an online medical information resource. Methods: Using two years (2011-2012) of usage logs from UpToDate, we measured the volume of searches related to medical conditions with significant burden in the United States, as well as the seasonal distribution of those searches. We quantified the relationship between searches and resulting page views. Using a large collection of online mainstream media articles and Web log posts we also characterized the uptake of a Food and Drug Administration (FDA) alert via changes in UpToDate search activity compared with general online media activity related to the subject of the alert. Results: Diseases and symptoms dominate UpToDate searches. Some searches result in page views of only short duration, while others consistently result in longer-than-average page views. The response to an FDA alert for Celexa, characterized by a change in UpToDate search activity, differed considerably from general online media activity. Changes in search activity appeared later and persisted longer in UpToDate logs. The volume of searches and page view durations related to Celexa before the alert also differed from those after the alert. Conclusions: Understanding the information-seeking behavior associated with online evidence sources can offer insight into the information needs of health professionals and enable large-scale medical surveillance. Our Web log mining approach has the potential to monitor responses to FDA alerts at a national level. Our findings can also inform the design and content of evidence-based medical information resources such as UpToDate. PMID:26293444
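
    As a rough illustration of the log mining described above, the sketch below counts searches matching a drug term before and after an alert date. The tab-separated log format, the file name, and the alert date are assumptions for illustration and do not reflect the actual UpToDate log schema.

        import csv
        from collections import Counter
        from datetime import date

        ALERT_DATE = date(2011, 8, 24)    # illustrative alert date
        TERM = "celexa"

        daily = Counter()
        with open("usage_log.tsv", newline="") as f:   # assumed format: YYYY-MM-DD<TAB>query
            for day_str, query in csv.reader(f, delimiter="\t"):
                if TERM in query.lower():
                    daily[date.fromisoformat(day_str)] += 1

        before = sum(n for d, n in daily.items() if d < ALERT_DATE)
        after = sum(n for d, n in daily.items() if d >= ALERT_DATE)
        print(f"searches mentioning '{TERM}': {before} before the alert, {after} after")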

  7. Analyzing Information Seeking and Drug-Safety Alert Response by Health Care Professionals as New Methods for Surveillance.

    PubMed

    Callahan, Alison; Pernek, Igor; Stiglic, Gregor; Leskovec, Jure; Strasberg, Howard R; Shah, Nigam Haresh

    2015-08-20

    Patterns in general consumer online search logs have been used to monitor health conditions and to predict health-related activities, but the multiple contexts within which consumers perform online searches make significant associations difficult to interpret. Physician information-seeking behavior has typically been analyzed through survey-based approaches and literature reviews. Activity logs from health care professionals using online medical information resources are thus a valuable yet relatively untapped resource for large-scale medical surveillance. To analyze health care professionals' information-seeking behavior and assess the feasibility of measuring drug-safety alert response from the usage logs of an online medical information resource. Using two years (2011-2012) of usage logs from UpToDate, we measured the volume of searches related to medical conditions with significant burden in the United States, as well as the seasonal distribution of those searches. We quantified the relationship between searches and resulting page views. Using a large collection of online mainstream media articles and Web log posts we also characterized the uptake of a Food and Drug Administration (FDA) alert via changes in UpToDate search activity compared with general online media activity related to the subject of the alert. Diseases and symptoms dominate UpToDate searches. Some searches result in page views of only short duration, while others consistently result in longer-than-average page views. The response to an FDA alert for Celexa, characterized by a change in UpToDate search activity, differed considerably from general online media activity. Changes in search activity appeared later and persisted longer in UpToDate logs. The volume of searches and page view durations related to Celexa before the alert also differed from those after the alert. Understanding the information-seeking behavior associated with online evidence sources can offer insight into the information needs of health professionals and enable large-scale medical surveillance. Our Web log mining approach has the potential to monitor responses to FDA alerts at a national level. Our findings can also inform the design and content of evidence-based medical information resources such as UpToDate.

  8. Estradiol and Inflammatory Markers in Older Men

    PubMed Central

    Maggio, Marcello; Ceda, Gian Paolo; Lauretani, Fulvio; Bandinelli, Stefania; Metter, E. Jeffrey; Artoni, Andrea; Gatti, Elisa; Ruggiero, Carmelinda; Guralnik, Jack M.; Valenti, Giorgio; Ling, Shari M.; Basaria, Shehzad; Ferrucci, Luigi

    2009-01-01

    Background: Aging is characterized by a mild proinflammatory state. In older men, low testosterone levels have been associated with increasing levels of proinflammatory cytokines. It is still unclear whether estradiol (E2), which generally has biological activities complementary to testosterone, affects inflammation. Methods: We analyzed data obtained from 399 men aged 65–95 yr enrolled in the Invecchiare in Chianti study with complete data on body mass index (BMI), serum E2, testosterone, IL-6, soluble IL-6 receptor, TNF-α, IL-1 receptor antagonist, and C-reactive protein. The relationship between E2 and inflammatory markers was examined using multivariate linear models adjusted for age, BMI, smoking, physical activity, chronic disease, and total testosterone. Results: In age-adjusted analysis, log (E2) was positively associated with log (IL-6) (r = 0.19; P = 0.047), and the relationship was statistically significant (P = 0.032) after adjustments for age, BMI, smoking, physical activity, chronic disease, and serum testosterone levels. Log (E2) was not significantly associated with log (C-reactive protein), log (soluble IL-6 receptor), or log (TNF-α) in both age-adjusted and fully adjusted analyses. Conclusions: In older men, E2 is weakly positively associated with IL-6, independent of testosterone and other confounders including BMI. PMID:19050054

  9. Structural characteristics of novel symmetrical diaryl derivatives with nitrogenated functions. Requirements for cytotoxic activity.

    PubMed

    Font, María; Ardaiz, Elena; Cordeu, Lucia; Cubedo, Elena; García-Foncillas, Jesús; Sanmartin, Carmen; Palop, Juan-Antonio

    2006-03-15

    In an attempt to discover the essential features that would explain the differences in cytotoxic activity shown by a series of symmetrical diaryl derivatives with nitrogenated functions, we have used molecular modelling techniques to study the variation in Log P and the conformational behaviour as a function of structural modifications. The Log P data, although they provide few clues concerning the observed variability in activity, suggest that an initial separation of active and inactive compounds is possible based on this parameter. The subsequent study of the conformational behaviour of the compounds, selected according to their Log P values, showed that the active compounds preferentially display an extended conformation, whereas the inactive ones are associated with a certain type of folding, with a triangular-type conformation adopted in these cases.

  10. Design and implementation of wireless dose logger network for radiological emergency decision support system.

    PubMed

    Gopalakrishnan, V; Baskaran, R; Venkatraman, B

    2016-08-01

    A decision support system (DSS) is implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used for estimating the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate, and the details are presented in the paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView-based program is developed to receive the data, display it on a Google Map, plot the data over the time scale, and register the data in a file to share with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.
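
    A minimal base-station receiver along these lines can be sketched in Python rather than the LabView program the authors describe; the serial port, baud rate, packet format, and file names below are assumptions for illustration only.

        import time

        import serial  # pyserial

        PORT, BAUD = "/dev/ttyUSB0", 9600   # assumed XBee coordinator settings
        LOG_FILE = "dose_rates.csv"         # file shared with the DSS software

        with serial.Serial(PORT, BAUD, timeout=5) as radio, \
                open(LOG_FILE, "a", buffering=1) as log:
            while True:
                raw = radio.readline().decode("ascii", errors="replace").strip()
                if not raw:
                    continue                # read timed out with no packet
                # Assumed packet format: "<node_id>,<dose_rate_in_uSv_per_h>"
                node_id, dose_rate = raw.split(",", 1)
                log.write(f"{time.time():.0f},{node_id},{dose_rate}\n")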

  11. Design and implementation of wireless dose logger network for radiological emergency decision support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.

    A decision support system (DSS) is implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used for estimating the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate, and the details are presented in the paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView-based program is developed to receive the data, display it on a Google Map, plot the data over the time scale, and register the data in a file to share with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.

  12. VizieR Online Data Catalog: Radiative forces for stellar envelopes (Seaton, 1997)

    NASA Astrophysics Data System (ADS)

    Seaton, M. J.; Yan, Y.; Mihalas, D.; Pradhan, A. K.

    2000-02-01

    (1) Primary data files, stages.zz. These files give data for the calculation of radiative accelerations, GRAD, for elements with nuclear charge zz. Data are available for zz = 06, 07, 08, 10, 11, 12, 13, 14, 16, 18, 20, 24, 25, 26 and 28. Calculations are made using data from the Opacity Project (see papers SYMP and IXZ). The data are given for each ionisation stage, j. They are tabulated on a mesh of (T, Ne, CHI), where T is temperature, Ne is electron density and CHI is an abundance multiplier. The files include data for ionisation fractions for each (T, Ne). The file contents are described in the paper ACC and as comments in the code add.f.
    (2) Code add.f. This reads a file stages.zz and creates a file acc.zz giving radiative accelerations averaged over ionisation stages. The code prompts for names of input and output files. The code, as provided, gives equal weights (as defined in the paper ACC) to all stages. The weights are set in SUBROUTINE WEIGHTS, which can be changed to give any weights preferred by the user. The dependence of diffusion coefficients on ionisation stage is given by a function ZET, which is defined in SUBROUTINE ZETA. The expressions used for ZET are as given in the paper. The user can change that subroutine if other expressions are preferred. The output file contains values, ZETBAR, of ZET averaged over ionisation stages.
    (3) Files acc.zz. Radiative accelerations computed using add.f as provided. The user will need to run the code add.f only if the subroutines WEIGHTS or ZETA are to be changed. The contents of the files acc.zz are described in the paper ACC and in comments contained in the code add.f.
    (4) Code accfit.f. This code gives radiative accelerations, and some related data, for a stellar model. Methods used to interpolate data to the values of (T, RHO) for the stellar model are based on those used in the code opfit.for (see the paper OPF). The executable file accfit.com runs accfit.f. It uses a list of files given in accfit.files (see that file for further description). The mesh used for the abundance multiplier CHI on the output file will generally be finer than that used in the input files acc.zz. The mesh to be used is specified in the file chi.dat. For a test run, the stellar model used is given in the file 10000_4.2 (Teff = 10000 K, LOG10(g) = 4.2). The output file from that test run is acc100004.2. The contents of the output file are described in the paper ACC and as comments in the code accfit.f.
    (5) The code diff.f. This code reads the output file (e.g. acc1000004.2) created by accfit.f. For any specified depth point in the model and value of CHI, it gives values of radiative accelerations, the quantity ZETBAR required for calculation of diffusion coefficients, and Rosseland-mean opacities. The code prompts for input data. It creates a file recording all data calculated. The code diff.f is intended for incorporation, as a set of subroutines, in codes for diffusion calculations. (1 data file).

  13. Inquiry and Aquifers.

    ERIC Educational Resources Information Center

    Leuenberger, Ted; Shepardson, Daniel; Harbor, Jon; Bell, Cheryl; Meyer, Jason; Klagges, Hope; Burgess, Willie

    2001-01-01

    Presents inquiry-oriented activities that acquaint students with groundwater sources, movement of water through aquifers, and contamination of groundwater by pollution. In one activity, students use well log data from web-based resources to explore groundwater systems. Provides sample well log data for those not having access to local information.…

  14. 75 FR 5066 - Commission Information Collection Activities (FERC Form 60,1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-01

    ... corresponding dockets and collection numbers.) Comments may be filed either electronically or in paper format. Those persons filing electronically do not need to make a paper filing. Documents filed electronically... acknowledgement to the sender's e-mail address upon receipt of comments. For paper filings, the comments should...

  15. 12 CFR 5.4 - Filing required.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Filing required. 5.4 Section 5.4 Banks and... CORPORATE ACTIVITIES Rules of General Applicability § 5.4 Filing required. (a) Filing. A depository... filings are available in the Manual and from each district office. (c) Other applications accepted. At the...

  16. Cotton textiles modified with citric acid as efficient anti-bacterial agent for prevention of nosocomial infections

    PubMed Central

    Bischof Vukušić, Sandra; Flinčec Grgac, Sandra; Budimir, Ana; Kalenić, Smilja

    2011-01-01

    Aim: To study the antimicrobial activity of citric acid (CA) and sodium hypophosphite monohydrate (SHP) against gram-positive and gram-negative bacteria, and to determine the influence of conventional and microwave thermal treatments on the effectiveness of antimicrobial treatment of cotton textiles. Method: Textile material was impregnated with CA and SHP solution and thermally treated by either conventional or microwave drying/curing treatment. Antibacterial effectiveness was tested according to the ISO 20743:2009 standard, using absorption method. The surfaces were morphologically observed by scanning electron microscopy, while physical characteristics were determined by wrinkle recovery angles method (DIN 53 891), tensile strength (DIN 53 837), and whiteness degree method (AATCC 110-2000). Results: Cotton fabric treated with CA and SHP showed significant antibacterial activity against MRSA (6.38 log10 treated by conventional drying and 6.46 log10 treated by microwave drying before washing, and 6.90 log10 and 7.86 log10, respectively, after 1 cycle of home domestic laundering washing [HDLW]). Antibacterial activity was also remarkable against S. aureus (4.25 log10 by conventional drying, 4.58 log10 by microwave drying) and against P. aeruginosa (1.93 log10 by conventional and 4.66 log10 by microwave drying). Antibacterial activity against P. aeruginosa was higher in samples subjected to microwave drying/curing than in those subjected to conventional drying/curing. As expected, antibacterial activity was reduced after 10 HDLW cycles but the compound was still effective. The surface of the untreated cotton polymer was smooth, while minor erosion stripes appeared on the surfaces treated with antimicrobial agent, and long and deep stripes were found on the surface of the washed sample. Conclusion: CA can be used both for the disposable (non-durable) materials (gowns, masks, and cuffs for blood pressure measurement) and the materials that require durability to laundering. The current protocols and initiatives in infection control could be improved by the use of antimicrobial agents applied on cotton carbohydrate polymer. PMID:21328723

  17. Building Specialized Multilingual Lexical Graphs Using Community Resources

    NASA Astrophysics Data System (ADS)

    Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud

    We describe methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as a main source; our approach therefore depends on acquiring contributions from volunteers (explicit approach) and on analyzing users' behavior to extract interesting patterns and facts (implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs) and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a special domain. We call it preterminological because it is raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it requires huge effort from domain specialists and terminologists. In our approach, we build such a graph by analyzing the access log files of the community's website and by finding the important terms that have been used to search that website, together with their associations with each other. We aim to make this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach on the Digital Silk Road Project; using its access log files since its beginning in 2003, we obtained an initial graph of around 116,000 terms. As an application, we used this graph to obtain a preterminological multilingual database that serves a CLIR system for the DSR project.
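
    A much simplified version of the implicit approach can be sketched as follows: extract search terms from the query strings recorded in a web-server access log and link terms that co-occur in the same query. The log file name, the "q" query parameter, and the plain adjacency-dictionary representation are assumptions for illustration, not details of the DSR implementation.

        import re
        from collections import defaultdict
        from itertools import combinations
        from urllib.parse import parse_qs, urlparse

        graph = defaultdict(lambda: defaultdict(int))   # term -> co-occurring term -> weight

        with open("access.log", encoding="utf-8", errors="replace") as f:
            for line in f:
                m = re.search(r'"GET (\S+) HTTP', line)   # request field of a common-format log
                if not m:
                    continue
                for q in parse_qs(urlparse(m.group(1)).query).get("q", []):
                    terms = sorted({t.lower() for t in q.split() if len(t) > 2})
                    for a, b in combinations(terms, 2):
                        graph[a][b] += 1
                        graph[b][a] += 1

        # Strongest associations for a seed term
        seed = "silk"
        print(sorted(graph[seed].items(), key=lambda kv: -kv[1])[:10])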

  18. Fast skin dose estimation system for interventional radiology

    PubMed Central

    Takata, Takeshi; Kotoku, Jun’ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-01-01

    To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient’s computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7–7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods. PMID:29136194
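
    The calibration step described above amounts to fitting a straight line between simulated and measured doses. A minimal NumPy sketch is shown below; the paired readings are invented for illustration and are not the values reported in the paper.

        import numpy as np

        # Hypothetical paired readings in the water-equivalent phantom (mGy)
        simulated = np.array([10.2, 20.5, 41.0, 80.3, 158.7])   # Monte Carlo output
        measured = np.array([11.0, 21.9, 43.8, 86.1, 170.4])    # RPL glass dosimeter values

        # Least-squares fit of measured = a * simulated + b
        a, b = np.polyfit(simulated, measured, 1)

        def calibrate(sim_dose):
            """Map a simulated skin dose onto the measured (absorbed) dose scale."""
            return a * sim_dose + b

        print(f"measured = {a:.3f} * simulated + {b:+.3f}")
        print("calibrated dose for a simulated 120.0 mGy:", round(calibrate(120.0), 1))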

  19. Fast skin dose estimation system for interventional radiology.

    PubMed

    Takata, Takeshi; Kotoku, Jun'ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-03-01

    To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient's computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7-7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods.

  20. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements through suggested query modifications. For example, many queries contain only a few terms and are therefore not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One goal of this report is to predict the number of results a query will have, since such a model allows a search engine to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88% and can thus be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and the data on reformulations made by users in the past can aid the development of better search systems, particularly to improve results for novice users. This paper therefore offers insights into how people search and how this knowledge can be used to improve the performance of specialized medical search engines.
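
    One way to realise the prediction described above is a classifier over simple query-level features. The sketch below uses scikit-learn with a handful of invented training queries (term count, mean term length, and digit presence as features); the real GoldMiner log and feature set are not reproduced here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def features(query):
            """Term count, mean term length, and whether the query contains digits."""
            terms = query.lower().split()
            return [len(terms),
                    sum(len(t) for t in terms) / max(len(terms), 1),
                    int(any(c.isdigit() for c in query))]

        # Invented examples: label 1 = query returned an empty result list
        queries = ["mri brain glioma", "xzqv", "chest ct nodule 3mm", "fractur tibbia spiraal",
                   "ct abdomen", "asdfgh 123", "knee mri meniscal tear", "qq"]
        labels = [0, 1, 0, 1, 0, 1, 0, 1]

        X = np.array([features(q) for q in queries])
        clf = LogisticRegression().fit(X, labels)

        query = "pneumothorax xray"
        print("probability of an empty result list:",
              round(clf.predict_proba(np.array([features(query)]))[0, 1], 2))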
