Sample records for system log files

  1. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
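
    The conversion step described above can be illustrated with a short sketch. The following Python fragment splits a checkpoint file into fixed-size objects and builds a log-structured index; the chunk size, key scheme, and put_object callable are illustrative assumptions, not the patented PLFS interface.

      # Minimal sketch of a decoupling middleware that converts checkpoint
      # files into objects for a cloud object store. All names (put_object,
      # CHUNK) are illustrative, not the patented PLFS interface.
      import os
      import hashlib

      CHUNK = 4 * 1024 * 1024  # assumed object size: 4 MiB

      def file_to_objects(path, put_object):
          """Split one checkpoint file into chunk-sized objects.

          put_object(key, data) is any callable that stores bytes under a
          key in an object store (e.g. an S3 client wrapper).
          """
          index = []  # log-structured index: (object key, logical offset, length)
          with open(path, "rb") as f:
              offset = 0
              while True:
                  data = f.read(CHUNK)
                  if not data:
                      break
                  key = "%s/%016x" % (hashlib.sha1(path.encode()).hexdigest(), offset)
                  put_object(key, data)
                  index.append((key, offset, len(data)))
                  offset += len(data)
          return index

      if __name__ == "__main__":
          store = {}  # stand-in for a cloud object store
          idx = file_to_objects(__file__, lambda k, v: store.__setitem__(k, v))
          print(len(idx), "objects written")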

  2. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  3. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and that the metadata server has already handled, so the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service, and metadata processing performance improves as well. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server crashes or otherwise becomes non-operational.
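
    A minimal sketch of the log-less idea follows: the client keeps handled metadata requests in memory so the MDS can skip journaling, and recovery replays the clients' backups. All class and method names are assumptions for illustration, not the paper's implementation.

      # Sketch of the log-less scheme: the client backs up handled metadata
      # requests in memory; on an MDS crash, clients replay their backups.
      import itertools

      class Client:
          def __init__(self, mds):
              self.mds = mds
              self.backup = []          # in-memory backup of sent requests

          def send(self, request):
              seqno = self.mds.handle(request)
              self.backup.append((seqno, request))   # back up only once the MDS handled it
              return seqno

      class MDS:
          """Metadata server that keeps its table only in volatile memory."""
          def __init__(self):
              self.table = {}
              self.seq = itertools.count()

          def handle(self, request):
              op, path, value = request
              if op == "create":
                  self.table[path] = value
              elif op == "remove":
                  self.table.pop(path, None)
              return next(self.seq)

          def recover(self, clients):
              # Replay every client's backup in global sequence order.
              replay = sorted(e for c in clients for e in c.backup)
              self.table.clear()
              for _, request in replay:
                  self.handle(request)

      mds = MDS()
      c = Client(mds)
      c.send(("create", "/a", 1))
      c.send(("create", "/b", 2))
      mds.table.clear()          # simulate a crash losing volatile state
      mds.recover([c])
      print(mds.table)           # {'/a': 1, '/b': 2}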

  4. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent and that the metadata server has already handled, so the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service, and metadata processing performance improves as well. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead the metadata server incurs when it adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server crashes or otherwise becomes non-operational. PMID:24892093

  5. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprising 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans – significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.
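
    The 2%/2mm gamma comparison used above can be sketched in a few lines. The following simplified global 2D gamma computation assumes a 1 mm grid spacing and a brute-force search window; it is illustrative, not the clinical tool used in the study.

      # Simplified global 2D gamma (2%/2 mm) between a reference and an
      # evaluated dose grid; brute-force search over a small window.
      import numpy as np

      def gamma_pass_rate(ref, ev, dd=0.02, dta=2.0, spacing=1.0, cutoff=0.1):
          norm = ref.max() * dd                      # global dose criterion
          r = int(np.ceil(dta / spacing)) + 1        # search radius in pixels
          ny, nx = ref.shape
          gammas = []
          for i in range(ny):
              for j in range(nx):
                  if ref[i, j] < cutoff * ref.max():  # ignore low-dose points
                      continue
                  best = np.inf
                  for di in range(-r, r + 1):
                      for dj in range(-r, r + 1):
                          ii, jj = i + di, j + dj
                          if not (0 <= ii < ny and 0 <= jj < nx):
                              continue
                          dist2 = (di * spacing) ** 2 + (dj * spacing) ** 2
                          dose2 = (ev[ii, jj] - ref[i, j]) ** 2
                          best = min(best, dose2 / norm ** 2 + dist2 / dta ** 2)
                  gammas.append(np.sqrt(best))
          gammas = np.array(gammas)
          return 100.0 * (gammas <= 1.0).mean()

      ref = np.random.rand(40, 40) * 2.0 + 1.0
      ev = ref + np.random.normal(0, 0.01, ref.shape)   # small perturbation
      print("pass rate: %.1f%%" % gamma_pass_rate(ref, ev))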

  6. Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi

    2018-04-01

    The log file-based method cannot display dosimetric changes due to linac component miscalibration, because log files are insensitive to such miscalibration. The purpose of this study was to quantify the dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases were included in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, the change from the TPS dose to the miscalibration-simulated log file dose in Dmean for the planning target volume (PTV) was 0.9 Gy, and the change in tumor control probability was 1.4%. For organs at risk (OARs), the change in Dmean was <0.7 Gy and the change in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by Dmean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For both the PTV and OARs, the log file-based estimate of patient dose for double-arc VMAT has accuracy comparable to that for single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Ho, M; Chen, C

    Purpose: The use of log files to perform patient specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, before direct measurements are performed; this approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers, which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian, and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (RayStation). While the dual Gaussian model often gave better agreement, overall the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA in an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter and will be investigated further.

  8. Clinical impact of dosimetric changes for volumetric modulated arc therapy in log file-based patient dose calculations.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2017-10-01

    A log file-based method cannot detect dosimetric changes due to linac component miscalibration because log files are insensitive to miscalibration. Herein, the clinical impacts of dosimetric changes on a log file-based method were determined. Five head-and-neck and five prostate plans were used. Miscalibration-simulated log files were generated by inducing a linac component miscalibration into the log file. Miscalibration magnitudes for leaf, gantry, and collimator were ±0.5mm, ±1°, and ±1°, respectively, at the general tolerance level, and ±0.3mm, ±0.5°, and ±0.5°, respectively, at a tighter tolerance level achievable on current linacs. Re-calculations were performed on patient anatomy using log file data. The changes in tumor control probability/normal tissue complication probability from the treatment planning system dose to the re-calculated dose at the general tolerance level were 1.8% for the planning target volume (PTV) and 2.4% for organs at risk (OARs) in both plan types. At the tighter tolerance level, these changes improved to 1.0% for the PTV and to 1.5% for OARs, a statistically significant difference. We determined the clinical impacts of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration, and found that the tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  9. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    PubMed

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

    Purpose: To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed detailed logging of all gimbals rotations applied during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect systematic tracking errors down to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling between gimbals and gantry rotation during dynamic arc dynamic tracking. Submillimetric agreement between the clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file and x-ray verification images with complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.

  10. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA 10.0.28, VMS). Results: All leaf positions are within ±0.10mm: 57% within ±0.01mm; 89% within ±0.05mm. The mean leaf position deviation is 0.02mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, with a mean of 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
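
    The overnight statistics reported above (fraction of leaves within tolerance, RMS deviation, gantry angle range) reduce to simple array arithmetic once a log has been decoded. A rough Python sketch follows; the real TrueBeam binary trajectory format requires a vendor-specific reader, so decoded planned/actual arrays are assumed as inputs.

      # Sketch of the overnight analysis step: summarize leaf and gantry
      # deviations from a trajectory log already decoded into NumPy arrays.
      import numpy as np

      def summarize(planned_leaf, actual_leaf, planned_gantry, actual_gantry):
          leaf_dev = actual_leaf - planned_leaf             # mm
          gantry_dev = actual_gantry - planned_gantry       # degrees
          return {
              "leaf_mean_dev_mm": float(np.abs(leaf_dev).mean()),
              "leaf_within_0.05mm_pct": 100.0 * (np.abs(leaf_dev) <= 0.05).mean(),
              "leaf_within_0.10mm_pct": 100.0 * (np.abs(leaf_dev) <= 0.10).mean(),
              "leaf_rms_mm": float(np.sqrt((leaf_dev ** 2).mean())),
              "gantry_dev_range_deg": (float(gantry_dev.min()), float(gantry_dev.max())),
          }

      rng = np.random.default_rng(0)
      pl = rng.uniform(-100, 100, (5000, 120))          # 120 leaves, 5000 samples
      report = summarize(pl, pl + rng.normal(0, 0.02, pl.shape),
                         np.linspace(180, -180, 5000),
                         np.linspace(180, -180, 5000) + rng.normal(0, 0.05, 5000))
      for k, v in report.items():
          print(k, v)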

  11. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns, and access patterns in the web server's log files. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and to help system administrators improve system performance. Web logs also provide invaluable help in creating adaptive web sites and in analyzing network traffic. This paper presents the design and implementation of a web usage mining agent for digging into web log files.
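
    A minimal version of such an agent can be sketched with the standard library: parse Common Log Format lines and tally page popularity and hourly traffic. The sample lines and field handling below are illustrative.

      # Minimal web-usage-mining sketch: parse Common Log Format lines and
      # tally page popularity and hourly traffic.
      import re
      from collections import Counter

      LOG_RE = re.compile(
          r'(?P<host>\S+) \S+ \S+ \[(?P<day>[^:]+):(?P<hour>\d{2})[^\]]*\] '
          r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+')

      def mine(lines):
          pages, hours = Counter(), Counter()
          for line in lines:
              m = LOG_RE.match(line)
              if m and m.group("status").startswith("2"):   # successful hits only
                  pages[m.group("path")] += 1
                  hours[m.group("hour")] += 1
          return pages, hours

      sample = [
          '1.2.3.4 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326',
          '1.2.3.5 - - [10/Oct/2000:14:01:02 -0700] "GET /news.html HTTP/1.0" 200 512',
          '1.2.3.4 - - [10/Oct/2000:14:03:10 -0700] "GET /index.html HTTP/1.0" 200 2326',
      ]
      pages, hours = mine(sample)
      print(pages.most_common(2), dict(hours))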

  12. Geophysical log database for the Floridan aquifer system and southeastern Coastal Plain aquifer system in Florida and parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.

    2013-04-04

    A database of borehole geophysical logs and other types of data files was compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.

  13. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)

  14. Replication in the Harp File System

    DTIC Science & Technology

    1991-07-01

    Liskov, Barbara; Ghemawat, Sanjay; Gruber, Robert; Johnson, Paul; Shrira, Liuba; Williams, Michael. July 1991. © Massachusetts Institute of Technology. To appear in the Proceedings of the Thirteenth ACM Symposium on Operating Systems Principles. Cited works include: Daniels, D. S., Spector, A. Z., and Thompson, D. S., "Distributed Logging for Transaction Processing," ACM Special Interest Group on Management of Data 1987 Annual Conference; and Hagmann, R., "Reimplementing the Cedar File System Using Logging and Group Commit," USENIX Conference Proceedings, June 1990, pp. 63-71.

  15. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  16. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
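
    The block-size rule quoted above (total data divided by the number of processes) and the accompanying exchange can be sketched as follows; plain byte strings stand in for MPI-style communication.

      # Sketch of the dynamic block-size rule: block size is the total data
      # volume divided by the number of parallel processes, and each process
      # assembles one such block before writing. The data exchange is modeled
      # with a simple concatenation; any remainder bytes beyond
      # nprocs * block are ignored in this toy version.
      def blocks_for_write(per_process_data):
          nprocs = len(per_process_data)
          total = sum(len(d) for d in per_process_data)
          block = total // nprocs                       # dynamically determined block size
          stream = b"".join(per_process_data)           # models the data exchange
          return [stream[i * block:(i + 1) * block] for i in range(nprocs)], block

      data = [b"a" * 10, b"b" * 30, b"c" * 20]          # unbalanced process outputs
      blocks, size = blocks_for_write(data)
      print(size, [len(b) for b in blocks])             # 20, [20, 20, 20]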

  17. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
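
    The index-entry structure described above (logical offset, physical offset, length) can be sketched directly. In the toy store below, data portions are appended to a hole-free log and holes are restored as zeros on read; the names are illustrative, not the patent's interface.

      # Data-structure sketch: only data portions are stored (appended at the
      # end of a log), each with an index entry (logical offset, physical
      # offset, length) from which holes can be restored on read.
      from dataclasses import dataclass

      @dataclass
      class IndexEntry:
          logical_offset: int
          physical_offset: int
          length: int

      class SparseStore:
          def __init__(self):
              self.log = bytearray()       # hole-free, log-structured storage
              self.index = []

          def write(self, logical_offset, data):
              self.index.append(IndexEntry(logical_offset, len(self.log), len(data)))
              self.log.extend(data)

          def read(self, size):
              """Reconstruct the sparse file, restoring holes as zero bytes."""
              out = bytearray(size)
              for e in self.index:
                  out[e.logical_offset:e.logical_offset + e.length] = \
                      self.log[e.physical_offset:e.physical_offset + e.length]
              return bytes(out)

      s = SparseStore()
      s.write(0, b"head")
      s.write(1_000_000, b"tail")          # the large hole is never stored
      print(len(s.log), len(s.read(1_000_004)))   # 8 stored bytes, full sparse view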

  18. Monte Carlo based, patient-specific RapidArc QA using Linac log files.

    PubMed

    Teke, Tony; Bergman, Alanah M; Kwa, William; Gill, Bradford; Duzenli, Cheryl; Popescu, I Antoniu

    2010-01-01

    A Monte Carlo (MC) based QA process to validate the dynamic beam delivery accuracy for Varian RapidArc (Varian Medical Systems, Palo Alto, CA) using Linac delivery log files (DynaLog) is presented. Using DynaLog file analysis and MC simulations, the goal of this article is (a) to confirm that adequate sampling is used in the RapidArc optimization algorithm (177 static gantry angles) and (b) to assess the physical machine performance [gantry angle and monitor unit (MU) delivery accuracy]. Ten clinically acceptable RapidArc treatment plans were generated for various tumor sites and delivered to a water-equivalent cylindrical phantom on the treatment unit. Three Monte Carlo simulations were performed to calculate dose to the CT phantom image set: (a) one using a series of static gantry angles defined by 177 control points with treatment planning system (TPS) MLC control files (planning files), (b) one using continuous gantry rotation with TPS generated MLC control files, and (c) one using continuous gantry rotation with actual Linac delivery log files. Monte Carlo simulated dose distributions are compared to both ionization chamber point measurements and with RapidArc TPS calculated doses. The 3D dose distributions were compared using a 3D gamma-factor analysis, employing a 3%/3 mm distance-to-agreement criterion. The dose difference between MC simulations, TPS, and ionization chamber point measurements was less than 2.1%. For all plans, the MC calculated 3D dose distributions agreed well with the TPS calculated doses (gamma-factor values were less than 1 for more than 95% of the points considered). Machine performance QA was supplemented with an extensive DynaLog file analysis. A DynaLog file analysis showed that leaf position errors were less than 1 mm for 94% of the time and there were no leaf errors greater than 2.5 mm. The mean standard deviation in MU and gantry angle were 0.052 MU and 0.355 degrees, respectively, for the ten cases analyzed. The accuracy and flexibility of the Monte Carlo based RapidArc QA system were demonstrated. Good machine performance and accurate dose distribution delivery of RapidArc plans were observed. The sampling used in the TPS optimization algorithm was found to be adequate.

  19. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
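
    A rough sketch of this write/read checksum flow, with CRC-32 standing in for whatever checksum the real system uses and threads standing in for parallel clients:

      # Each client checksums its chunk and ships the value with the data;
      # the read path re-computes and verifies the checksum.
      import zlib
      from concurrent.futures import ThreadPoolExecutor

      def store_chunk(storage, key, chunk):
          storage[key] = (chunk, zlib.crc32(chunk))     # data + checksum together

      def read_chunk(storage, key):
          chunk, stored = storage[key]
          if zlib.crc32(chunk) != stored:
              raise IOError("checksum mismatch for %r" % (key,))
          return chunk

      storage = {}
      chunks = {("obj", i): bytes([i]) * 4096 for i in range(8)}   # one shared object
      with ThreadPoolExecutor() as pool:                            # parallel clients
          for key, chunk in chunks.items():
              pool.submit(store_chunk, storage, key, chunk)
      print(all(read_chunk(storage, k) == c for k, c in chunks.items()))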

  20. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  1. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    When reporting a problem, include the environment details (operating system, shown under Help > About), the metronome.log file located in your MAT 7.1.0 installation folder, and any other relevant log files. System requirements to run the Model Analyst's Toolkit: Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed. Installation options include creating a MAT application icon on your desktop and a Quick Launch icon, which places a MAT application icon on the taskbar for supported operating systems.

  2. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual Meeting of the American Association of Physicists in Medicine.

  3. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  4. 20 CFR 401.85 - Exempt systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... subsection (k)(2) of the Privacy Act: (A) The General Criminal Investigation Files, SSA; (B) The Criminal Investigations File, SSA; and, (C) The Program Integrity Case Files, SSA. (D) Civil and Administrative Investigative Files of the Inspector General, SSA/OIG. (E) Complaint Files and Log. SSA/OGC. (iii) Pursuant to...

  5. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and is thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
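
    Instrumentation in the NetLogger style can be sketched in a few lines: timestamped key=value text records written at the interesting points of the code. The field layout below imitates the ULM-like text format described above but is an assumption, not the exact NetLogger wire format or API.

      # Sketch of NetLogger-style instrumentation: timestamped text-format
      # event records written at the points being measured. The field names
      # imitate the ULM-like format but are illustrative, not the exact
      # NetLogger wire format.
      import sys
      import time
      import socket

      def log_event(stream, event, **fields):
          ts = time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime()) \
              + ".%06dZ" % (time.time() % 1 * 1e6)
          parts = ["DATE=" + ts, "HOST=" + socket.gethostname(), "NL.EVNT=" + event]
          parts += ["%s=%s" % (k, v) for k, v in fields.items()]
          stream.write(" ".join(parts) + "\n")

      def transfer(data):
          log_event(sys.stdout, "transfer.start", SIZE=len(data))
          time.sleep(0.01)                      # the work being measured
          log_event(sys.stdout, "transfer.end", SIZE=len(data))

      transfer(b"x" * 1024)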

  6. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment.

  7. SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, H; Jacobson, T; Gu, X

    2015-06-15

    Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.

  8. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D-mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.

  9. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  10. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  11. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) The transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) User access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
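
    The log-normal fit mentioned for blog article sizes is straightforward to reproduce: the maximum-likelihood parameters are the mean and standard deviation of the log-transformed sizes. The sketch below uses synthetic data; a truncated Pareto fit for non-multimedia files would proceed along similar lines.

      # Maximum-likelihood log-normal fit for (synthetic) article sizes,
      # using plain NumPy.
      import numpy as np

      def fit_lognormal(sizes):
          logs = np.log(sizes)
          mu, sigma = logs.mean(), logs.std(ddof=1)     # MLE for a log-normal
          return mu, sigma

      rng = np.random.default_rng(1)
      sizes = rng.lognormal(mean=9.0, sigma=1.2, size=10_000)   # synthetic sizes
      mu, sigma = fit_lognormal(sizes)
      print("mu=%.2f sigma=%.2f median=%.0f bytes" % (mu, sigma, np.exp(mu)))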

  12. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
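
    The reported decision rule reduces to two thresholds on log-file quantities. The sketch below encodes them as stated in the abstract (rate of power increase <1.25, percent of expected power <200%); it is purely illustrative and not a clinical tool.

      # Threshold rule from the abstract: tPA success is likely when the
      # percent-of-expected power stays below 200% and the rate of power
      # increase stays below 1.25 (units as defined in the study).
      def tpa_success_likely(percent_expected_power, power_increase_rate):
          return percent_expected_power < 200.0 and power_increase_rate < 1.25

      print(tpa_success_likely(130.9, 0.61))   # successful-group pattern -> True
      print(tpa_success_likely(196.1, 2.87))   # unsuccessful-group pattern -> False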

  13. A clinically observed discrepancy between image-based and log-based MLC positions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal

    2016-06-15

    Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf to be consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured and log-files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log-file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual position by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.

  14. Use patterns of health information exchange through a multidimensional lens: conceptual framework and empirical validation.

    PubMed

    Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior

    2014-12-01

    Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.
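
    The framework lends itself to a compact sketch: each session becomes a point in the five-dimensional space (volume, diversity, granularity, duration, content), and archetypes are found by clustering. The feature values below are synthetic, k=5 mirrors the five reported patterns, and scikit-learn is assumed available.

      # Sessions as points in (volume, diversity, granularity, duration,
      # content) space; common use patterns found by k-means clustering.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      sessions = np.vstack([rng.normal(c, 0.3, (40, 5)) for c in range(5)])  # synthetic
      sessions = (sessions - sessions.mean(0)) / sessions.std(0)             # standardize

      archetypes = KMeans(n_clusters=5, n_init=10, random_state=0).fit(sessions)
      print(np.bincount(archetypes.labels_))          # session count per archetype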

  15. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components’ integrity is paramount. ElektaLog files are generated every 40 milliseconds and can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, the aim of the study was to evaluate whether ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%/2 mm criterion for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual errors recorded during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for the MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: ElektaLog files can be used for day-to-day evaluation of linac integrity and patient QA, allowing for reliable analysis of system accuracy and performance.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.

  17. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  18. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  19. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

    The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files simulating leaf miscalibration were generated for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5mm in opposite directions and systematic leaf shifts: ±1.0mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5mm, the residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between the non-modified and modified log file doses per unit leaf gap error, were 1.32±0.27% and 0.82±0.17Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14Gy, and 0.45±0.08Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determined the residual dose estimation errors for VMAT delivery using the log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  20. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from (1) digitized suspended-sediment-concentration traces, (2) linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and (3) nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
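
    The transport-relation option can be illustrated with a short sketch; this is not SEDCALC itself, the data pairs are hypothetical, and the regression direction (log concentration on log streamflow) and the 0.0027 unit-conversion factor (Q in ft³/s, C in mg/L, discharge in tons/day) are stated assumptions.

      import numpy as np

      # Hypothetical paired observations: streamflow (ft^3/s) and
      # suspended-sediment concentration (mg/L).
      q = np.array([120.0, 340.0, 560.0, 910.0, 1500.0])
      c = np.array([15.0, 42.0, 80.0, 150.0, 300.0])

      # Transport relation: fit log10(C) = a + b*log10(Q).
      b, a = np.polyfit(np.log10(q), np.log10(c), 1)

      # Apply the relation to a new flow and convert to suspended-sediment
      # discharge in tons/day.
      q_new = 700.0
      c_est = 10.0 ** (a + b * np.log10(q_new))
      print(f"suspended-sediment discharge: {0.0027 * q_new * c_est:.1f} tons/day")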

  1. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), which is responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes the data and exports it to Elasticsearch. ES is responsible for centralized data storage. Data accumulated in ES can be viewed using dedicated software, Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks and the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
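
    The collection step can be sketched as below, assuming the elasticsearch-py 8.x client and a local ES instance; the line format, field names, and index name are illustrative, not the PanDA schema.

      import re
      from elasticsearch import Elasticsearch

      es = Elasticsearch("http://localhost:9200")
      pattern = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>\w+) (?P<msg>.*)$")

      line = "2018-05-01 12:00:03 INFO pilot: job 4711 finished status=success"
      m = pattern.match(line)
      if m:
          # Index one parsed log record so it can be queried and charted in Kibana.
          es.index(index="panda-logs", document={
              "timestamp": m.group("ts"),
              "level": m.group("level"),
              "message": m.group("msg"),
          })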

  2. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background Digital health information is available on a wide variety of platforms including PC access to the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
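
    The first of those metrics can be computed from either platform's log as in the sketch below; the records and the 30-minute inactivity cutoff used to split sessions are assumptions for illustration, not values from the paper.

      from collections import defaultdict
      from datetime import datetime, timedelta

      records = [  # hypothetical (user, timestamp) log entries
          ("u1", "2001-03-01 09:01"), ("u1", "2001-03-01 09:05"),
          ("u1", "2001-03-01 10:40"), ("u2", "2001-03-01 09:10"),
      ]
      by_user = defaultdict(list)
      for user, ts in records:
          by_user[user].append(datetime.strptime(ts, "%Y-%m-%d %H:%M"))

      session_starts = []
      for times in by_user.values():
          times.sort()
          start = times[0]
          for prev, cur in zip(times, times[1:]):
              if cur - prev > timedelta(minutes=30):  # inactivity gap => new session
                  session_starts.append(start)
                  start = cur
          session_starts.append(start)

      sessions_per_hour = defaultdict(int)
      for s in session_starts:
          sessions_per_hour[s.replace(minute=0)] += 1
      for hour, n in sorted(sessions_per_hour.items()):
          print(hour, n)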

  3. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real- Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, the fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from the TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. For this patient, log files of the three-dimensional coordinates of the fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E_log). The fiducial marker used as an internal surrogate in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm. The differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID images using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating using the RTRT showed mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining the TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.
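
    A minimal sketch of the log-file side of such an analysis, assuming the RTRT logs have already been parsed into arrays of residual marker displacement during beam-on; the synthetic samples below merely stand in for real log contents.

      import numpy as np

      rng = np.random.default_rng(0)
      residual_lr_mm = np.abs(rng.normal(0.0, 0.6, 5000))  # stand-in for log data
      residual_si_mm = np.abs(rng.normal(0.0, 0.5, 5000))

      # E_log as described above: the 95th percentile of residual motion.
      print(f"E_log LR: {np.percentile(residual_lr_mm, 95):.1f} mm")
      print(f"E_log SI: {np.percentile(residual_si_mm, 95):.1f} mm")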

  4. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and at www.fdsys.gov. ...

  5. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and on GPO Access. ...

  6. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
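
    In Python rather than Perl, the core of such a parser might look like the sketch below; the "timestamp;record-type;id;key=value ..." line format follows PBS accounting logs, and the file name in the usage comment is hypothetical.

      from datetime import datetime

      def parse_pbs_accounting(path, want_types=("E",)):
          """Yield (time, record_type, job_id, attrs) for selected record types."""
          with open(path) as fh:
              for line in fh:
                  ts, rtype, job_id, rest = line.rstrip("\n").split(";", 3)
                  if rtype not in want_types:  # filter on record type
                      continue
                  attrs = dict(kv.split("=", 1) for kv in rest.split() if "=" in kv)
                  yield datetime.strptime(ts, "%m/%d/%Y %H:%M:%S"), rtype, job_id, attrs

      # Usage: list job-end ("E") records from one day's accounting log.
      # for t, rtype, job, attrs in parse_pbs_accounting("20070404"):
      #     print(t, job, attrs.get("user"))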

  7. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
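
    Both kinds of results can be sketched with a few lines over an event stream, as below; the event names are hypothetical, not actual WebCIS features.

      from collections import Counter

      clicks = ["labs", "notes", "labs", "meds", "labs", "notes"]  # one session
      feature_counts = Counter(clicks)                # which features are used most
      transitions = Counter(zip(clicks, clicks[1:]))  # two-step traversal patterns

      print(feature_counts.most_common(2))
      print(transitions.most_common(2))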

  8. Sawmill: A Logging File System for a High-Performance RAID Disk Array

    DTIC Science & Technology

    1995-01-01

    from limiting disk performance, new controller architectures connect the disks directly to the network so that data movement bypasses the file server...These developments raise two questions for file systems: how to get the best performance from a RAID, and how to use such a controller architecture ...the RAID-II storage system; this architecture provides a fast data path that moves data rapidly among the disks, high-speed controller memory, and the

  9. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  10. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos normally offers only character user interfaces (CUI) under operating systems such as OS X or Linux, which is not user-friendly: users of Eos must be expert at image processing of electron micrographs and must also know some computer science, yet not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the browser. Eos has more than 400 commands related to image processing for electron microscopy, each with its own usage. Since the beginning of development, Eos has managed its user interface through an interface definition file, the "OptionControlFile", written in CSV (comma-separated value) format: each command has an OptionControlFile that records the information needed to generate its interface. Because this mechanism is mature and convenient, the GUI system we developed, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the OptionControlFile and produces a web user interface automatically. The basic actions of the client-side system were implemented properly and support auto-generation of web forms with functions for execution, image preview, and file upload to a web server. The system can thus execute Eos commands with the options unique to each command and carry out image analysis. Two problems remained: image file formats for visualization and a workspace for analysis. File format information is useful for checking whether input/output files are correct, and a common workspace is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses it through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform for heterogeneous distributed environments. Users can place their resources, such as microscope images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions that specify an image processing workflow. PIONE runs each processing step on a suitable computer, following the defined rules. PIONE also supports interactive manipulation, so a user can try a command with various setting values; here, too, we contribute auto-generation of the GUI for a PIONE workflow. As an advanced function, we developed a module to log user actions. The logs include information such as the setting values used in image processing and the sequence of commands. Used effectively, these logs offer many advantages: for example, when an expert discovers some know-how in image processing, other users can share the logs containing that know-how, and by analyzing logs we may derive recommended image analysis workflows. We have also developed the system infrastructure to implement a social platform of image processing for electron microscopists. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
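
    The auto-generation idea can be sketched as below; the three-column layout (flag, label, type) is purely illustrative and is not Eos's actual OptionControlFile schema.

      import csv
      import io

      definition = "-i,input image,file\n-s,scale factor,float\n"

      # Emit one HTML form field per option row of the CSV definition.
      for flag, label, ftype in csv.reader(io.StringIO(definition)):
          itype = {"file": "file", "float": "number"}.get(ftype, "text")
          print(f'<label>{label} <input name="{flag}" type="{itype}"></label>')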

  11. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  12. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  13. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  14. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  15. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Mason, B; Kirsner, S

    2015-06-15

    Purpose: Ion chamber and film (ICAF) measurement is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown to be an alternative to measurement-based QA. In this study, we delivered VMAT plans with and without errors to determine whether ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with introduced delivery errors were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture did not move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. The passing criteria were, for the ion chamber, less than 5% deviation and, for film, 90% of pixels passing a 3 mm/3% gamma analysis (GA). For the log file analysis, the criteria were 90% of voxels passing a 3 mm/3% 3D GA and beam parameters matching the plan. Results: The two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and the plan. The 8 plans that did not meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors.

  16. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights, no longer limited to the realm of reactive IT management but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.

  17. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  18. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  19. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  20. Analysis of the request patterns to the NSSDC on-line archive

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1994-01-01

    NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth of the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of the service, they have maintained log files which record all accesses to the archive. In this report, we present an analysis of the NDADS log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
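
    One of the caching questions can be explored by replaying a request trace from such logs through a simulated cache, as in this sketch; the trace and cache size are hypothetical.

      from collections import OrderedDict

      def lru_hit_ratio(trace, capacity):
          cache, hits = OrderedDict(), 0
          for f in trace:
              if f in cache:
                  hits += 1
                  cache.move_to_end(f)           # mark as most recently used
              else:
                  cache[f] = True
                  if len(cache) > capacity:
                      cache.popitem(last=False)  # evict least recently used
          return hits / len(trace)

      trace = ["a", "b", "a", "c", "a", "b", "d", "a", "e", "b"]
      print(f"LRU hit ratio: {lru_hit_ratio(trace, capacity=3):.2f}")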

  1. Zebra: A striped network file system

    NASA Technical Reports Server (NTRS)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
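
    The parity maintenance reduces to a bytewise XOR across the fragments of a stripe, as the sketch below illustrates with toy fragments.

      def parity(fragments):
          """Bytewise XOR of equal-sized fragments."""
          out = bytearray(len(fragments[0]))
          for frag in fragments:
              for i, b in enumerate(frag):
                  out[i] ^= b
          return bytes(out)

      frags = [b"ABCD", b"EFGH", b"IJKL"]  # stripe fragments from three clients
      p = parity(frags)

      # Any single lost fragment can be rebuilt from the survivors plus parity.
      rebuilt = parity([frags[0], frags[2], p])
      assert rebuilt == frags[1]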

  2. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…

  3. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
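
    A toy monitor in the spirit of that rule-based approach (not LogScope's actual specification language) might check a start/complete pairing requirement over a log of events:

      def check(log):
          """Require every 'start' event to be matched by a later 'complete'."""
          pending = set()
          for event, arg in log:
              if event == "start":
                  pending.add(arg)
              elif event == "complete":
                  if arg not in pending:
                      return f"complete without start: {arg}"
                  pending.remove(arg)
          return f"unfinished: {sorted(pending)}" if pending else "OK"

      log = [("start", "img1"), ("complete", "img1"), ("start", "img2")]
      print(check(log))  # -> unfinished: ['img2']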

  4. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of using trajectory log files from the linear accelerator for Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software package (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step and shoot (SS), sliding window (SW), and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive: a 0.2 mm systematic error produced a 0.7% dose deviation on average. MLC random errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  5. A Prototype Implementation of a Time Interval File Protection System in Linux

    DTIC Science & Technology

    2006-09-01

    when a user logs in, the /etc/passwd file is read by the system to get the user's home directory. The user's login shell then changes the directory...and don. • Users can be added with the command: # useradd -m <username> • Set the password by: # passwd <username> • Make a copy of the

  6. Parallel file system with metadata distributed across partitioned key-value store

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).

  7. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from log files. Results showed that the dose rate varied among six discrete levels, and that gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from the log files. A quality assurance procedure should be carried out for VMAT-related parameters.

  8. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  9. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  10. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    For logging board angle values during balance training, it was necessary to develop a measurement system. This study provides data for a balance study using a smartcard, with data acquisition performed automatically. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected to a notebook via universal serial bus (USB). The data acquisition and smartcard reading programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved to a log file for exact documentation. This system makes study development easy and time-saving.

  11. The key image and case log application: new radiology software for teaching file creation and case logging that incorporates elements of a social network.

    PubMed

    Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David

    2014-07-01

    To create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communication systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface were inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). One hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm, with efforts made to maximize its ease of use and to include characteristics inspired by social networking Web sites that give the system additional functionality, such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  12. An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets

    NASA Astrophysics Data System (ADS)

    Özkaya, Sait Ismail

    1996-02-01

    An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line by line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy-disk format for storage and transfer of log data as text files, proposed by the Canadian Well Logging Society. The present EXCEL macro decodes the different sections of a LAS file, separates the fields, and places them into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
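
    In Python rather than an EXCEL macro, the same decoding task might be sketched as follows; this simplified version ignores wrapped data lines and other corner cases of the full LAS 2.0 specification.

      def parse_las(path):
          """Split a LAS file into sections; parse the ~A data section to floats."""
          sections, current = {}, None
          with open(path) as fh:
              for line in fh:
                  line = line.rstrip("\n")
                  if line.startswith("~"):         # section header, e.g. ~C, ~A
                      current = line[1].upper()
                      sections[current] = []
                  elif current and line.strip() and not line.startswith("#"):
                      sections[current].append(line)
          data = [[float(v) for v in row.split()] for row in sections.get("A", [])]
          return sections, data

      # sections, data = parse_las("well.las")
      # data[i] -> [depth, curve1, curve2, ...] in the order declared in ~C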

  13. 75 FR 27051 - Privacy Act of 1974: System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-13

    ... address and appears below: DOT/FMCSA 004 SYSTEM NAME: National Consumer Complaint Database (NCCDB.... A system, database, and procedures for filing and logging consumer complaints relating to household... are stored in an automated system operated and maintained at the Volpe National Transportation Systems...

  14. 75 FR 76426 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ..., access control lists, file system permissions, intrusion detection and prevention systems and log..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN...

  15. Visual behavior characterization for intrusion and misuse detection

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah

    2001-05-01

    As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for visually analyzing network and computer log information based on the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing users' behavior, however, is much more adaptable and difficult to counteract.

  16. Online Courses Assessment through Measuring and Archetyping of Usage Data

    ERIC Educational Resources Information Center

    Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros

    2016-01-01

    Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data which usually remain unexploited. A new methodology is proposed in order to analyse these data both on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…

  17. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
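
    The threshold check at the heart of the first tool can be sketched as below; the axis names, tolerances, and positions are hypothetical, not the consortium's values.

      import numpy as np

      TOLERANCE_MM = {"mlc": 1.0, "jaw": 1.0, "gantry_sag": 0.5}  # assumed values

      def flag_deviation(axis, expected_mm, actual_mm):
          """Return the worst deviation and whether it is within tolerance."""
          dev = np.abs(np.asarray(actual_mm) - np.asarray(expected_mm))
          worst = float(dev.max())
          return worst, worst <= TOLERANCE_MM[axis]

      worst, ok = flag_deviation("mlc", [10.0, 10.5, 11.0], [10.02, 10.61, 11.0])
      print(f"max MLC deviation {worst:.2f} mm -> {'pass' if ok else 'FAIL'}")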

  18. Linking log files with dosimetric accuracy--A multi-institutional study on quality assurance of volumetric modulated arc therapy.

    PubMed

    Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar

    2015-12-01

    To systematically evaluate machine-specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1 SD) was 0.3±0.2 mm for all linacs. Large maximum MLC errors of up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated in general accurate plan reproducibility, with γ(mean) (±1 SD)=0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations that corresponded well with the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).

  20. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be de-compressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
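
    On the client side the described step amounts to compressing before the write and decompressing after the read, for example with zlib; the choice of compressor here is an assumption, since the text does not name one.

      import zlib

      def write_chunk(chunk: bytes) -> bytes:
          return zlib.compress(chunk, 6)    # compress before shipping to storage

      def read_chunk(stored: bytes) -> bytes:
          return zlib.decompress(stored)    # decompress on read

      chunk = b"checkpoint data " * 1024
      stored = write_chunk(chunk)
      assert read_chunk(stored) == chunk
      print(f"{len(chunk)} bytes -> {len(stored)} bytes stored")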

  1. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  2. 75 FR 69644 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ..., organization, phone, fax, mobile, pager, Defense Switched Network (DSN) phone, other fax, other mobile, other.../Transport Layer Security (SSL/ TLS) connections, access control lists, file system permissions, intrusion detection and prevention systems and log monitoring. Complete access to all records is restricted to and...

  3. Index map of cross sections through parts of the Appalachian basin (Kentucky, New York, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia): Chapter E.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.

  4. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-04-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  6. Development of a Methodology for Customizing Insider Threat Auditing on a Linux Operating System

    DTIC Science & Technology

    2010-03-01

    information /etc/group, passwd, gshadow, shadow, /security/opasswd 16 User A attempts to access User B directory 17 User A attempts to access User B file w/o...configuration Handled by audit rules for root actions Audit user write attempts to system files -w /etc/group -p wxa -w /etc/passwd -p wxa -w /etc/gshadow -p...information (/etc/group, /etc/passwd, /etc/gshadow, /etc/shadow, /etc/sudoers, /etc/security/opasswd) Procedure: 1. User2 logs into the system

  7. The medium is NOT the message or Indefinitely long-term file storage at Leeds University

    NASA Technical Reports Server (NTRS)

    Holdsworth, David

    1996-01-01

    Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.

  8. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Park, S

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were (1) to develop a respiratory motion analysis tool using log files, and (2) to evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients with fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and the tumor motion PDFs in the left-right (LR), anterior-posterior (AP), and superior-inferior (SI) directions were calculated using the log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of the tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32), and 0.16 (0.09–0.28) in the LR, AP, and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm), and 11.26 mm (3.80–21.27 mm) in the LR, AP, and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
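
    The reproducibility index can be sketched as a KL divergence between binned motion PDFs; the position samples and binning below are synthetic, and this is not the authors' in-house software.

      import numpy as np

      def motion_pdf(positions_mm, bins):
          hist, _ = np.histogram(positions_mm, bins=bins, density=True)
          hist = hist + 1e-12               # avoid division by zero / log(0)
          return hist / hist.sum()

      rng = np.random.default_rng(42)
      bins = np.linspace(-15, 15, 61)
      pdf1 = motion_pdf(rng.normal(0, 4, 20000), bins)  # fraction 1
      pdfn = motion_pdf(rng.normal(1, 5, 20000), bins)  # fraction n

      r_n = float(np.sum(pdf1 * np.log(pdf1 / pdfn)))   # R_n = KL(PDF1 || PDFn)
      print(f"R_n = {r_n:.3f}")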

  9. The RIACS Intelligent Auditing and Categorizing System

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1988-01-01

    The organization of the RIACS auditing package is described, along with installation instructions and guidance on interpreting the output. Instructions are given for setting up both local and remote file system auditing. Logging is done on a time-driven basis, and auditing in a passive mode.

  10. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    The Product Archive System (PAS), a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of updated methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, in order to give the system several desirable characteristics, such as ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (Network Communicator module, File Collector module, File Copy module, Task Collector module, Metadata Extractor module, Product data Archive module, Metadata catalogue import module) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  11. Comparing image search behaviour in the ARRS GoldMiner search engine and a clinical PACS/RIS.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Do, Bao; Rubin, Daniel; Kahn, Charles E; Müller, Henning

    2015-08-01

    Information search has changed the way we manage knowledge, and the ubiquity of information access has made search a frequent activity, whether via Internet search engines or, increasingly, via mobile devices. Medical information search is in this respect no different, and much research has been devoted to analyzing the way in which physicians aim to access information. Medical image search is a much smaller domain but has gained much attention as it has different characteristics than search for text documents. While web search log files have been analysed many times to better understand user behaviour, the log files of hospital-internal systems for search in a PACS/RIS (Picture Archival and Communication System, Radiology Information System) have rarely been analysed. Such a comparison between a hospital PACS/RIS search and a web system for searching images of the biomedical literature is the goal of this paper. The objectives are to identify similarities and differences in the search behaviour of the two systems, which could then be used to optimize existing systems and build new search engines. Log files of the ARRS GoldMiner medical image search engine (freely accessible on the Internet) containing 222,005 queries, and log files of Stanford's internal PACS/RIS search called radTF containing 18,068 queries, were analysed. Each query was preprocessed and all query terms were mapped to the RadLex (Radiology Lexicon) terminology, a comprehensive lexicon of radiology terms created and maintained by the Radiological Society of North America, so that the semantic content in the queries and the links between terms could be analysed, and synonyms for the same concept could be detected. RadLex was mainly created for use in radiology reports, to aid structured reporting and the preparation of educational material (Langlotz, 2006) [1]. In standard medical vocabularies such as MeSH (Medical Subject Headings) and UMLS (Unified Medical Language System), specific radiology terms are often underrepresented; therefore, RadLex was considered the best option for this task. The results show a surprising similarity between the usage behaviour in the two systems, but several subtle differences can also be noted. The average number of terms per query is 2.21 for GoldMiner and 2.07 for radTF; the axes of RadLex used (anatomy, pathology, findings, …) have almost the same distribution, with clinical findings being the most frequent and the anatomical entity second; also, combinations of RadLex axes are extremely similar between the two systems. Differences include longer sessions in radTF than in GoldMiner (3.4 versus 1.9 queries per session on average). Several frequent search terms overlap, but some strong differences exist in the details. In radTF the term "normal" is frequent, whereas in GoldMiner it is not. This makes intuitive sense, as in the literature normal cases are rarely described, whereas in clinical work the comparison with normal cases is often a first step. The general similarity in many points is likely due to the fact that users of the two systems are influenced by their daily behaviour in using standard web search engines and follow this behaviour in their professional search. This means that many results and insights gained from standard web search can likely be transferred to more specialized search systems. Still, specialized log files can be used to find out more about reformulations and the detailed strategies users employ to find the right content.
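
    The term-mapping step described above is easy to prototype. The Python sketch below uses a tiny hypothetical lexicon fragment in place of RadLex and made-up queries in place of the real logs; it maps query terms to concepts, tallies axis usage, and computes the average number of terms per query:

        from collections import Counter

        LEXICON = {  # hypothetical fragment: term -> (concept, RadLex-style axis)
            "pneumothorax": ("pneumothorax", "clinical finding"),
            "ptx": ("pneumothorax", "clinical finding"),   # synonym mapping
            "chest": ("thorax", "anatomical entity"),
            "thorax": ("thorax", "anatomical entity"),
        }

        def analyze(queries):
            axis_counts, total_terms = Counter(), 0
            for q in queries:
                terms = q.lower().split()
                total_terms += len(terms)
                for t in terms:
                    if t in LEXICON:
                        _concept, axis = LEXICON[t]
                        axis_counts[axis] += 1
            return total_terms / len(queries), axis_counts

        terms_per_query, axes = analyze(["chest ptx", "pneumothorax", "thorax normal"])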

  12. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate the translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points. Delivered beams can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of the MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages versus standalone software are: i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.

  13. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  14. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  15. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy-specific and overall system parameters. A total of 36 system parameters were monitored, including RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24 V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is supported by Varian Medical Systems, Inc.
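
    For readers unfamiliar with I/MR charts, the following Python sketch computes the textbook 3σ limits for a single logged parameter (the readings are invented, and the hybrid specification-based limits described above are not reproduced):

        import numpy as np

        def imr_limits(x):
            # Individual chart limits derived from the average moving range.
            x = np.asarray(x, dtype=float)
            mr = np.abs(np.diff(x))        # moving ranges of successive readings
            sigma = mr.mean() / 1.128      # d2 constant for subgroup size 2
            i_limits = (x.mean() - 3 * sigma, x.mean() + 3 * sigma)
            mr_ucl = 3.267 * mr.mean()     # D4 constant for subgroup size 2
            return i_limits, mr_ucl

        readings = [24.02, 24.05, 23.98, 24.01, 24.07, 23.99]  # e.g. a DC rail (V)
        (i_lcl, i_ucl), mr_ucl = imr_limits(readings)
        out_of_control = [v for v in readings if not i_lcl <= v <= i_ucl]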

  16. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable storage solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability roughly 10-15% higher than those of other algorithms and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
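
    A minimal sketch of the two steps, using assumed names and toy data (the paper's actual heuristic is more elaborate): build a co-access correlation matrix from logged sessions, then greedily place each block on the node where it is least correlated with the blocks already stored there, so correlated blocks can be fetched in parallel:

        from collections import defaultdict
        from itertools import combinations

        sessions = [                      # stand-in for parsed access-log sessions
            ["tile_a", "tile_b", "tile_c"],
            ["tile_a", "tile_b"],
            ["tile_c", "tile_d"],
        ]

        corr = defaultdict(int)
        for s in sessions:
            for x, y in combinations(sorted(set(s)), 2):
                corr[(x, y)] += 1         # co-access count as correlation measure

        def assign(blocks, n_nodes):
            # Greedy: place each block where accumulated correlation is lowest.
            nodes = [[] for _ in range(n_nodes)]
            for b in blocks:
                cost = [sum(corr.get(tuple(sorted((b, o))), 0) for o in node)
                        for node in nodes]
                nodes[cost.index(min(cost))].append(b)
            return nodes

        placement = assign(["tile_a", "tile_b", "tile_c", "tile_d"], n_nodes=2)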

  17. The Added Value of Log File Analyses of the Use of a Personal Health Record for Patients With Type 2 Diabetes Mellitus

    PubMed Central

    Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.

    2014-01-01

    The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided only little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), to increase the efficiency of the system and improve the long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users and the long-term adherence has the potential to increase. PMID:24876574
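
    The navigation-pattern extraction is straightforward to sketch. The Python fragment below (the record layout and service names are hypothetical, not taken from e-Vita) orders each user's first-session events and counts identical routes:

        from collections import Counter

        records = [                 # hypothetical (user, timestamp, service) rows
            ("u1", 10, "education"),
            ("u2", 5, "glucose_diary"), ("u2", 9, "messages"),
            ("u3", 2, "education"),
        ]

        routes = {}
        for user, ts, service in sorted(records, key=lambda r: (r[0], r[1])):
            routes.setdefault(user, []).append(service)

        pattern_counts = Counter(tuple(r) for r in routes.values())
        # e.g. how many first sessions ended on the education service
        ended_on_education = sum(n for route, n in pattern_counts.items()
                                 if route[-1] == "education")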

  18. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N

    2016-06-15

    Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct relative fluence maps and perform gamma analysis to compare the planned and executed MLC movements. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding-window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLC movements, then calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives results similar to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT QA workload. The author would like to thank King Fahd University of Petroleum and Minerals for the support.
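
    As a rough illustration of the pipeline (not the authors' program), the Python sketch below accumulates a one-dimensional relative fluence profile for a single leaf pair from planned versus actual gap positions, then scores agreement with a simplified global gamma at 3%/3 mm; all snapshot values are invented:

        import numpy as np

        grid = np.arange(-50.0, 50.0, 1.0)          # mm along one leaf-pair axis

        def fluence(snapshots):
            # Each snapshot is a (left_edge_mm, right_edge_mm) opening with
            # equal time weight; exposure accumulates where the gap is open.
            f = np.zeros_like(grid)
            for left, right in snapshots:
                f += (grid >= left) & (grid <= right)
            return f / f.max()

        def gamma_pass_rate(ref, ev, dd=0.03, dta=3.0):
            # Fraction of reference points with (global) gamma <= 1.
            passed = 0
            for i, r in enumerate(ref):
                g = np.sqrt(((ev - r) / dd) ** 2 +
                            ((grid - grid[i]) / dta) ** 2).min()
                passed += g <= 1.0
            return passed / len(ref)

        planned = fluence([(-20, 20), (-15, 25), (-10, 30)])
        actual = fluence([(-19.9, 20.1), (-15.2, 24.9), (-10.1, 30.0)])
        rate = gamma_pass_rate(planned, actual)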

  19. 20 CFR 658.410 - Establishment of State agency JS complaint system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... system. At the local office level, the local office manager shall be responsible for the management of... related), the local office manager shall transmit a copy of that portion of the log containing the... established for the handling of complaints and files relating to the handling of complaints. The Manager or...

  20. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn (ALICE Environment, http://alien.cern.ch) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets, and a number of collaborating Web services which implement the authentication, job execution, file transport, performance monitoring and event logging. In the paper we present the architecture and components of the system.

  1. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction: The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001; Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
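
    LAS is a plain-text format, so the conversion result is easy to inspect programmatically. The following minimal Python reader (illustrative only; production work would use a dedicated LAS library or a viewer such as the ones named above) pulls curve names from the ~C section and data rows from the ~A section of a LAS 2.0 file:

        def read_las_curves(path):
            names, rows = [], []
            in_curves = in_data = False
            with open(path) as fh:
                for line in fh:
                    line = line.strip()
                    if line.startswith("~"):          # section marker
                        in_curves = line.upper().startswith("~C")
                        in_data = line.upper().startswith("~A")
                        continue
                    if not line or line.startswith("#"):
                        continue
                    if in_curves:                     # e.g. "GR .GAPI : Gamma Ray"
                        names.append(line.split(".")[0].strip())
                    elif in_data:
                        rows.append([float(v) for v in line.split()])
            return names, rows

        # names, rows = read_las_curves("well_0001.las")
        # The first curve is conventionally depth (DEPT); the rest are logs
        # such as the gamma ray (GR) traces used in the cross sections.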

  2. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
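
    The handshaking protocol lends itself to a small sketch. The Python fragment below builds a data notice like the one described (the JSON layout, field names, and MD5 choice are assumptions for illustration; the abstract does not specify the notice format):

        import hashlib
        import json
        from datetime import datetime, timezone

        def make_data_notice(path, start, stop):
            # Stream the file through MD5 in 1 MiB chunks.
            md5 = hashlib.md5()
            with open(path, "rb") as fh:
                for chunk in iter(lambda: fh.read(1 << 20), b""):
                    md5.update(chunk)
            return json.dumps({
                "file": path,
                "data_start": start.isoformat(),
                "data_stop": stop.isoformat(),
                "checksum_md5": md5.hexdigest(),
                "sent": datetime.now(timezone.utc).isoformat(),
            })

        with open("obs_sample.dat", "wb") as fh:   # stand-in observatory file
            fh.write(b"telemetry packets ...")
        t0 = datetime(2013, 1, 1, 0, 0, tzinfo=timezone.utc)
        t1 = datetime(2013, 1, 1, 0, 30, tzinfo=timezone.utc)
        notice = make_data_notice("obs_sample.dat", t0, t1)

    The notice and the data file would then both be pushed over SFTP, with the receipt retrieved later to decide whether a retransmission is needed.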

  3. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  4. SU-E-T-325: The New Evaluation Method of the VMAT Plan Delivery Using Varian DynaLog Files and Modulation Complexity Score (MCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tateoka, K; Graduate School of Medicine, Sapporo Medical University, Sapporo, JP; Fujimomo, K

    2014-06-01

    Purpose: The aim of this study was to evaluate the use of Varian DynaLog files to verify VMAT plan delivery, and to assess plan complexity with the modulation complexity score (MCS). Methods: Delivery accuracy of machine performance was quantified by multileaf collimator (MLC) position errors, gantry angle errors and fluence delivery accuracy for volumetric modulated arc therapy (VMAT). The relationship between machine performance and plan complexity was also investigated using the MCS. Planned and actual MLC positions, gantry angles and delivered fractions of monitor units were extracted from Varian DynaLog files; these factors were taken from the MLC control file of the record-and-verify system. Planned and delivered beam data were compared to determine leaf position errors and gantry angle errors. Analysis was also performed on planned and actual fluence maps reconstructed from the DynaLog files. This analysis was performed for all treatment fractions of 5 prostate VMAT plans, using in-house software written in Visual C++. Results: The root mean square values of the leaf position and gantry angle errors were about 0.12 and 0.15, respectively. The gamma passing rate between planned and actual fluence maps at the 3%/3 mm criterion was about 99.21%. The leaf position errors were not directly related to plan complexity as determined by the MCS, whereas the gantry angle errors were. Conclusion: This study shows that Varian DynaLog files can be used to diagnose VMAT delivery errors that are not detectable with phantom-based quality assurance. Furthermore, the MCS of a VMAT plan can be used to evaluate delivery accuracy for patients receiving VMAT. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.

  5. VizieR Online Data Catalog: GOALS sample PACS and SPIRE fluxes (Chu+, 2017)

    NASA Astrophysics Data System (ADS)

    Chu, J. K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Diaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.

    2017-06-01

    The IRAS RBGS contains 179 LIRGs (11.00<=log(LIR/L☉)<12.0) and 22 ultra-luminous infrared galaxies (ULIRGs: log(LIR/L☉)>=12.0); these 201 total objects comprise the GOALS sample (Armus et al. 2009), a statistically complete flux-limited sample of infrared-luminous galaxies in the local universe. This paper presents imaging and photometry for all 201 LIRGs and LIRG systems in the IRAS RBGS that were observed during our GOALS Herschel OT1 program. (4 data files).

  6. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

    This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets: MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on the local file system; (2) pushing pixel arrays from image files into a single HDF5 file on the local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements of training time, this study provides an in-depth analysis of the causes of the performance advantages/disadvantages of each back-end for training deep neural networks. We envision that the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
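
    The key-value options are simple to exercise. A minimal sketch with the lmdb Python binding (the key naming and map size are arbitrary choices here, not values from the study):

        import lmdb

        png_bytes = b"stand-in encoded image payload"   # would be real PNG bytes

        env = lmdb.open("train_db", map_size=1 << 30)   # 1 GiB memory map
        with env.begin(write=True) as txn:              # batch writes per txn
            txn.put(b"img_000001", png_bytes)
        with env.begin() as txn:
            blob = txn.get(b"img_000001")
        env.close()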

  7. Comparison of fracture and deformation in the rotary endodontic instruments: Protaper versus K-3 system.

    PubMed

    Nagi, Sana Ehsen; Khan, Farhan Raza; Rahman, Munawar

    2016-03-01

    This experimental study was done on extracted human teeth to compare the fracture and deformation of two rotary endodontic file systems, namely K-3 and Protaper. It was conducted at the dental clinics of the Aga Khan University Hospital, Karachi. A log of file deformation or fracture during root canal preparation was kept. The location of fracture was noted along with the identity of the canal in which the fracture took place. The fractures in the two rotary systems were compared. SPSS 20 was used for data analysis. Of the 172 (80.4%) teeth possessing more than 15 degrees of curvature, fracture occurred in 7 (4.1%) cases and deformation in 10 (5.8%). Of the 42 (19.6%) teeth possessing less than 15 degrees of curvature, fracture occurred in none, while deformation was seen in 1 (2.4%). There was no difference between K-3 and Protaper files with respect to file deformation and fracture. Most of the fractures occurred in the mesiobuccal canals of maxillary molars, n=3 (21.4%). The likelihood of file fracture increased 5.65-fold when the same file was used more than 3 times. Irrespective of the rotary system, the apical third of the root canal space was the most common site of file fracture.

  8. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2010-10-01 2010-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  9. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2011-10-01 2011-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  10. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  11. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  12. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, displaying well log curves, etc.) but can also run on different operating systems and devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work, the resulting software product's interface is described.

  13. Perceived Task-Difficulty Recognition from Log-File Information for the Use in Adaptive Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars

    2016-01-01

    Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…

  14. SU-E-T-144: Effective Analysis of VMAT QA Generated Trajectory Log Files for Medical Accelerator Predictive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator-generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axis positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed, and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on TG-142 and published analyses of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2 mm for the collimator jaw and MLC carriage exceeded the limits of the I/MR chart. Gantry speed and each MLC speed are analyzed at two different points in the delivery. Simulated gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. A gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.

  15. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
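
    The flavor of such rule systems can be conveyed with a toy checker (plain Python, not RuleR or LogScope syntax): the rule "every DOWNLINK must be preceded by an UPLINK with the same id" is evaluated over a parsed event trace:

        def check(events):
            # events: iterable of (kind, message_id) pairs in log order.
            seen_uplinks, failures = set(), []
            for i, (kind, msg_id) in enumerate(events):
                if kind == "UPLINK":
                    seen_uplinks.add(msg_id)
                elif kind == "DOWNLINK" and msg_id not in seen_uplinks:
                    failures.append((i, msg_id))
            return failures

        trace = [("UPLINK", 7), ("DOWNLINK", 7), ("DOWNLINK", 9)]
        print(check(trace))   # [(2, 9)]: DOWNLINK 9 had no prior UPLINK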

  16. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format inspired in part by the W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as from other scientific applications that log provenance-related information.

  17. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as the reference, and all records for subsequent days were compared against it. In-house MATLAB software was used for the comparisons. Each MLC log file was converted to a fluence map (FM), and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with signal of 10% or lower of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (range 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and, at the same time, reduce the image processing time. We found that comparison of the highest-resolution images (768×1024) yielded, on average, a lower γ (99.1%) than the low-resolution ones (192×256; γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such a tool can be valuable for assessing the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.

  18. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.
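
    The temporal binning step reduces the ~360 fps raw stream to ~72 fps by summing groups of five frames, which is easy to express with NumPy (synthetic Poisson frames stand in for real EPID data):

        import numpy as np

        raw = np.random.poisson(4.0, size=(360, 64, 64)).astype(float)  # 1 s of frames
        n_bin = 5
        usable = (raw.shape[0] // n_bin) * n_bin
        binned = raw[:usable].reshape(-1, n_bin, *raw.shape[1:]).sum(axis=1)
        # For Poisson-limited frames, summing n_bin frames improves the
        # per-frame SNR by roughly sqrt(n_bin).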

  19. Geologic cross section E-E' through the Appalachian basin from the Findlay arch, Wood County, Ohio, to the Valley and Ridge province, Pendleton County, West Virginia: Chapter E.4.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.

  20. Wave-Ice Interaction and the Marginal Ice Zone

    DTIC Science & Technology

    2013-09-30

    concept, using a high-quality attitude and heading reference system (AHRS) together with an accurate twin-antennae GPS compass. The instruments logged... the AHRS parameters at 50 Hz, together with GPS-derived fixes, heading (accurate to better than 1°) and velocities at 10 Hz. The 30 MB hourly files

  1. INSPIRE and SPIRES Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
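
    Sessionization is at the core of such scripts. A hedged Python sketch (the 30-minute timeout and the log tuple layout are common web-analytics conventions, not details taken from this project):

        def sessionize(hits, timeout=1800):
            # hits: (timestamp_s, client_key, query) tuples sorted by timestamp.
            sessions, current, sid = {}, {}, 0
            for ts, client, query in hits:
                last = current.get(client)
                if last is None or ts - last[0] > timeout:
                    sid += 1
                    sessions[sid] = []
                    current[client] = (ts, sid)
                current[client] = (ts, current[client][1])
                sessions[current[client][1]].append(query)
            return sessions

        hits = [(0, "a", "higgs"), (60, "a", "higgs boson"), (4000, "a", "susy")]
        sessions = sessionize(hits)
        queries_per_session = sum(map(len, sessions.values())) / len(sessions)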

  2. 78 FR 40474 - Sustaining Power Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  3. 78 FR 34371 - Longfellow Wind, LLC: Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  4. The new idea of transporting tailings-logs in tailings slurry pipeline and the innovation of technology of mining waste-fill method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin Yu; Wang Fuji; Tao Yan

    2000-07-01

    This paper introduced a new idea of transporting mine tailings-logs in a mine tailings-slurry pipeline and a new technology of mine cemented filling using tailings-logs with tailings-slurry. The hydraulic principles, the compaction of tailings-logs, and the mechanical function of the fill body of tailings-logs cemented by tailings-slurry are discussed.

  5. Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.

    PubMed

    Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene

    2012-04-21

    Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion can result in unintuitive dose changes. We have developed a treatment reconstruction simulation code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans, a concave target and an integrated boost, were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%, 2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.

  6. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
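
    The clustering step can be sketched in a few lines: a 1-D k-means over log-scaled file sizes followed by per-cluster statistics (the sizes and k are invented; the tool's actual algorithms and tightness measures are richer):

        import numpy as np

        def kmeans_1d(x, k, iters=50, seed=0):
            # Plain Lloyd iterations on scalar data.
            rng = np.random.default_rng(seed)
            centers = rng.choice(x, size=k, replace=False)
            for _ in range(iters):
                labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = x[labels == j].mean()
            return labels, centers

        sizes = np.array([2e3, 3e3, 5e5, 6e5, 2e9, 3e9])   # bytes, from the log
        labels, _ = kmeans_1d(np.log10(sizes), k=3)
        for j in range(3):
            cluster = sizes[labels == j]
            print(j, len(cluster), cluster.mean(), cluster.min(), cluster.max())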

  7. 78 FR 54888 - Guzman Power Markets, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... the eFiling link to log on and submit the intervention or protests. Persons unable to file... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for...

  8. 78 FR 28835 - Salton Sea Power Generation Company; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  9. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13-C10P07R05-075, 2013. [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS

  10. 77 FR 55817 - Delek Crude Logistics, LLC; Notice of Petition for Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests... number. eFiling is encouraged. More detailed information relating to filing requirements, interventions...'') grant a temporary waiver of the filing and reporting requirements of sections 6 and 201 of the...

  11. Geologic cross section D-D' through the Appalachian basin from the Findlay arch, Sandusky County, Ohio, to the Valley and Ridge province, Hardy County, West Virginia: Chapter E.4.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.

  12. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  13. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  14. 76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...

  15. SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousuf, A; Hussain, A

    2014-06-01

    Purpose: A robust quality assurance (QA) program for the computer-controlled enhanced dynamic wedge (EDW) has been designed and tested. The calculations behind this QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record-and-verify system generates dynamic log (dynalog) files during dynamic dose delivery. These files contain information such as the date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected (calculated) segmented treatment tables (STT) and the actual delivered STT for the treatment delivery as a verification record, and can be used to assess the integrity and precision of the treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment; they can also be used to calculate the EDW factors. Independent verification of the fractional MUs per segment was performed against those generated from the dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a Semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against practical measurements. The measurements agreed with the calculated EDW data to within about 1%. Variation between the MUs per segment obtained from the dynalog files and those manually calculated was found to be less than 2%. Conclusion: An efficient and easy tool to perform a routine QA procedure for EDW is suggested. The method can be easily implemented in any institution without the need for expensive QA equipment. An error of the order of ≥2% can be easily detected.
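
    The per-segment MU arithmetic is simple enough to show directly. In the Python sketch below (the STT weights and total MU are made up), each segment's MU is the difference of successive cumulative STT weights times the total MU, which is what gets cross-checked against the dynalog record:

        stt = [0.0, 0.12, 0.31, 0.55, 0.78, 1.0]   # cumulative fractional weights
        total_mu = 200.0

        segment_mu = [(b - a) * total_mu for a, b in zip(stt, stt[1:])]
        assert abs(sum(segment_mu) - total_mu) < 1e-9
        # Flag any segment whose dynalog-derived MU deviates by more than 2%.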

  16. 78 FR 70299 - Capacity Markets Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  17. 78 FR 59923 - Buffalo Dunes Wind Project, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  18. 78 FR 28833 - Lighthouse Energy Group, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  19. 78 FR 29366 - Wheelabrator Baltimore, LP; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 77 FR 64978 - Sunbury Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  1. 78 FR 62300 - Burgess Biopower LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  2. 78 FR 75561 - South Bay Energy Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  3. 78 FR 28833 - Ebensburg Power Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  4. 78 FR 72673 - Yellow Jacket Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  5. 78 FR 44557 - Guttman Energy Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  6. 78 FR 68052 - Covanta Haverhill Association, LP; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  7. 78 FR 49506 - Source Power & Gas LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  8. 77 FR 64980 - Noble Americas Energy Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE...://www.ferc.gov . To facilitate electronic service, persons with Internet access who will eFile a... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests...

  9. 78 FR 46939 - DWP Energy Holdings, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 78 FR 28833 - CE Leathers Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  11. 78 FR 59014 - Lakeswind Power Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  12. 78 FR 75560 - Green Current Solutions, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 77 FR 64980 - Collegiate Clean Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 77 FR 64977 - Frontier Utilities New York LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  15. 78 FR 62299 - West Deptford Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  16. 78 FR 52913 - Allegany Generating Station LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  17. Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao

    NASA Astrophysics Data System (ADS)

    Arredondo, I.; del Campo, M.; Echevarria, P.; Jugo, J.; Etxebarria, V.

    2013-10-01

    This work presents a multipurpose configurable control system that can be integrated into an EPICS control network, with its functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of control hardware management, set-up and communication with the EPICS network, and data storage. The reconfigurable nature of the controller is based on a single XML file, allowing any end user to easily modify and adjust the control system to any specific requirement. The selected Java development environment ensures multiplatform operation and broad versatility, even with regard to the hardware to be controlled. Specifically, this paper focuses on fast control based on a high-performance FPGA and also describes an application of the approach to ESS Bilbao's Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, together with a general description of the Multipurpose Controller itself.
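
    Since the controller's behaviour is driven entirely by one XML file, loading it is a small parsing step. A minimal sketch in Python (the original system is written in Java; Python is used here only for brevity, and the element and attribute names are invented for illustration, not the actual ESS Bilbao schema):

        import xml.etree.ElementTree as ET

        def load_config(path):
            """Parse a hypothetical controller configuration file."""
            root = ET.parse(path).getroot()
            pvs = [{"name": pv.get("name"), "type": pv.get("type")}
                   for pv in root.iter("pv")]
            return {"device": root.get("device"), "pvs": pvs}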

  18. SedMob: A mobile application for creating sedimentary logs in the field

    NASA Astrophysics Data System (ADS)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source mobile software package for creating sedimentary logs, targeted at tablets and smartphones. The user can create an unlimited number of logs, save data for each bed in a log, and export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog, a free multiplatform package for drawing graphic logs that runs on PCs. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.
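
    The CSV interchange between SedMob and SedLog can be illustrated in a few lines of Python; the column names below are assumptions for illustration, not SedMob's actual schema:

        import csv

        beds = [
            {"thickness_m": 0.4, "lithology": "sandstone", "grain_size": "medium"},
            {"thickness_m": 1.1, "lithology": "mudstone",  "grain_size": "clay"},
        ]

        with open("log01.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["thickness_m", "lithology", "grain_size"])
            writer.writeheader()
            writer.writerows(beds)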

  19. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  20. Users' information-seeking behavior on a medical library Website

    PubMed Central

    Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco

    2002-01-01

    In 1997, the Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contain information on all user accesses to the Website and provide a great opportunity to learn more about the behavior of its visitors. Most of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to such reports, we wanted to be able to perform interactive exploration and ad hoc analysis and to discover trends in a user-friendly way. We therefore developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179
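
    Before any warehouse-style aggregation, each raw log line has to be parsed into dimensions (page, status, time). A small Python sketch of that first step, assuming NCSA/Apache "combined"-format lines (the CMK server's actual log format is not stated in the abstract):

        import re
        from collections import Counter

        LINE = re.compile(r'\S+ \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)[^"]*" (\d{3}) \S+')

        def top_pages(log_lines, n=10):
            """Count successful page requests and return the n most visited."""
            hits = Counter()
            for line in log_lines:
                m = LINE.match(line)
                if m and m.group(3) == "200":
                    hits[m.group(2)] += 1
            return hits.most_common(n)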

  1. 18 CFR 270.304 - Tight formation gas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...

  2. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). Version 3.5, Quick Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  3. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  4. SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Clasie, B; Lu, H

    Purpose: In pencil beam scanning (PBS), the delivered spot MU, position, and size are slightly different at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree gantry angle spread were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and at the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MUs in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to an excellent agreement in dose calculations, with an average γ pass rate for all fields of approximately 99.7%. When each calculation is compared to the measurement, a high correlation in γ is also found. Conclusion: Using machine log files, we verified that deviations in PBS beam delivery at different gantry angles are sufficiently small relative to the planned spot positions and MUs. This study brings us one step closer to simplifying our patient-specific QA.
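
    The spot-by-spot statistics above fall out of a simple array difference once the log files are parsed. A minimal sketch, assuming the parsed logs yield one (x, y) position per spot at each gantry angle (the array layout is an assumption, not the vendor's log format):

        import numpy as np

        def spot_deviation_stats(pos_a, pos_b):
            """pos_a, pos_b: (n_spots, 2) arrays of x/y positions in mm,
            recorded at the two gantry angles. Returns per-axis mean and std."""
            diff = np.asarray(pos_b) - np.asarray(pos_a)
            return diff.mean(axis=0), diff.std(axis=0)

        # Toy example with 3 spots:
        a = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]
        b = [[0.3, 0.4], [10.2, 0.5], [0.4, 10.3]]
        mean_xy, std_xy = spot_deviation_stats(a, b)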

  5. 78 FR 28834 - Salton Sea Power L.L.C.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  6. 78 FR 28835 - Del Ranch Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  7. 78 FR 28835 - Patua Project LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  8. 78 FR 75561 - Great Bay Energy V, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  9. 77 FR 64981 - Homer City Generation, L.P.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 77 FR 69819 - Cirrus Wind 1, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  11. 77 FR 64979 - Great Bay Energy IV, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 77 FR 53195 - H.A. Wagner LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 78 FR 59923 - Mammoth Three LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 78 FR 61945 - Tuscola Wind II, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-07

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 77 FR 69819 - QC Power Strategies Fund LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  16. 78 FR 75561 - Astral Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  17. Capabilities Report 2012, West Desert Test Center

    DTIC Science & Technology

    2012-03-12

    132 FT- IR Spectrometer...electronic system files, paper logs, production batch records, QA/QC data, and PCR data generated during a test. Data analysts also track and QC raw data...Advantage +SL bench-top freeze dryers achieve shelf temperatures as low as -57°C and condenser temperatures to -67°C. The bulk milling facility produces

  18. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  19. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS) which we previously developed to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure, and the data in the log file is input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
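
    The grouping step that makes the run time acceptable can be sketched in a few lines. The original implementation is a MATLAB program; the Python below is a hedged illustration, and the bucketing tolerances are invented, not the authors' published values:

        from collections import defaultdict

        def group_exposures(events, kv_step=5.0, angle_step=5.0):
            """Bucket x-ray pulses with similar technique and geometry so each
            bucket can be simulated once and weighted by its summed exposure."""
            groups = defaultdict(list)
            for e in events:
                key = (round(e["kv"] / kv_step), round(e["gantry_deg"] / angle_step))
                groups[key].append(e)
            return list(groups.values())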

  20. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  1. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  2. 78 FR 28834 - Elmore Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  3. 78 FR 49507 - OriGen Energy LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... securities and assumptions of liability. Any person desiring to intervene or to protest should file with the... with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log...

  4. 78 FR 49507 - ORNI 47 LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  5. 77 FR 64981 - BITHENERGY, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  6. 78 FR 40473 - eBay Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  7. 78 FR 28832 - CalEnergy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  8. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV), operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center, has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Input from each port is displayed in a separate window, with binary display selectable. A number of other features are provided, including binary log files, screen capture to files, and a full range of communication parameters.

  9. Real-Time Population Health Detector

    DTIC Science & Technology

    2004-11-01

    military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA...via the sequence of one-step-ahead forecast errors from the Kalman recursions: \(e_t = y_t - H\mu_{t|t-1}\). The log-likelihood then follows by treating the... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files GDAIS received historic CHCS data from all

  10. Sediment data collected in 2013 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Bernier, Julie C.; Flocks, James G.; Miselis, Jennifer L.; DeWitt, Nancy T.

    2014-01-01

    This data series serves as an archive of sediment data collected in July 2013 from the Chandeleur Islands sand berm and adjacent barrier-island environments. Data products include descriptive core logs, core photographs and x-radiographs, results of sediment grain-size analyses, sample location maps, and Geographic Information System data files with accompanying formal Federal Geographic Data Committee metadata.

  11. VizieR Online Data Catalog: Reference Catalogue of Bright Galaxies (RC1; de Vaucouleurs+ 1964)

    NASA Astrophysics Data System (ADS)

    de Vaucouleurs, G.; de Vaucouleurs, A.

    1995-11-01

    The Reference Catalogue of Bright Galaxies lists for each entry the following information: NGC number, IC number, or A number; A, B, or C designation; B1950.0 positions, position at 100 year precession; galactic and supergalactic positions; revised morphological type and source; type and color class in Yerkes list 1 and 2; Hubble-Sandage type; revised Hubble type according to Holmberg; logarithm of mean major diameter (log D) and ratio of major to minor diameter (log R) and their weights; logarithm of major diameter; sources of the diameters; David Dunlap Observatory type and luminosity class; Harvard photographic apparent magnitude; weight of V, B-V(0), U-B(0); integrated magnitude B(0) and its weight in the B system; mean surface brightness in magnitude per square minute of arc and sources for the B magnitude; mean B surface brightness derived from corrected Harvard magnitude; the integrated color index in the standard B-V system; "intrinsic" color index; sources of B-V and/or U-B; integrated color in the standard U-B system; observed radial velocity in km/sec; radial velocity corrected for solar motion in km/sec; sources of radial velocities; solar motion correction; and direct photographic source. The catalog was created by concatenating four files side by side. (1 data file).

  12. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging, and on-demand report generation. The main challenges for such a system are: coping with the diversity of CASTOR's log formats and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements, and the group-based GUI visualization. For this purpose, we have designed, developed, and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers onto a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions, and finally the Web UI layer for accessing the information. With flexibility, extensibility, and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy, and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation, and malfunction detection, and by managers for report generation.
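
    The Data Mining layer's job, stitching scattered log lines into one record per transaction, can be conveyed with a short sketch; the key=value line layout and the REQID field are assumptions for illustration, not CASTOR's actual log format:

        from collections import defaultdict

        def build_transactions(log_lines):
            """Group parsed log lines by a shared transaction identifier."""
            transactions = defaultdict(list)
            for line in log_lines:
                fields = dict(p.split("=", 1) for p in line.split() if "=" in p)
                tid = fields.get("REQID")
                if tid:
                    transactions[tid].append(fields)
            return transactions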

  13. Expansion of the roadway reference log : KYSPR-99-201.

    DOT National Transportation Integrated Search

    2000-05-01

    The objectives of this study were to: 1) expand the current route log to include milepoints for all intersections on state maintained roads and 2) recommend a procedure for establishing milepoints and maintaining the file with up-to-date information....

  14. 78 FR 52524 - Sunoco Pipeline LP; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... described in their petition. Any person desiring to intervene or to protest in this proceedings must file in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 78 FR 62349 - Sunoco Pipeline L.P.; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-18

    ... to log on and submit the intervention or protests. Persons unable to file electronically should... petition. Any person desiring to intervene or to protest in this proceeding must file in accordance with..., persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor...

  16. 78 FR 77155 - Grant Program To Assess, Evaluate, and Promote Development of Tribal Energy and Mineral Resources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...

  17. Fort Bliss Geothermal Area Data: Temperature profile, logs, schematic model and cross section

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This dataset contains a variety of data about the Fort Bliss geothermal area, part of the southern portion of the Tularosa Basin, New Mexico. It includes schematic models of the McGregor Geothermal System; Century OH logs; a full temperature profile; and complete logs from well RMI 56-5, including resistivity and porosity data, drill logs with drill rate, depth, lithology, mineralogy, fractures, temperature, pit total, gases, and descriptions, among other measurements, as well as CDL, CNL, DIL, GR Caliper, and Temperature files. A shallow (2-meter depth) temperature survey of the Fort Bliss geothermal area with 63 data points is also included. Two cross sections through the Fort Bliss area, also included, show well position and depth. The surface map shows faults and the spatial distribution of wells, together with inferred and observed fault distributions from gravity surveys around the Fort Bliss geothermal area.

  18. 20 CFR 658.414 - Referral of non-JS-related complaints.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... applicable, were referred on the complaint log specified in § 658.410(c)(1). The JS official shall also prepare and keep the file specified in § 658.410(c)(3) for the complaints filed pursuant to paragraph (a...

  19. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system.

    PubMed

    Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H

    2011-02-01

    Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source of verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the Accreditation Council for Graduate Medical Education (ACGME). We conducted a systematic review of the ACGME requirements and our AIMS record and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free-text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories, generated AIMS-based case logs, and compared these to assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
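
    The CPT4-mapping method can be illustrated with a toy lookup; the mapping entries below are placeholders for illustration, not a validated CPT-to-ACGME crosswalk:

        # Hypothetical CPT4 -> ACGME category crosswalk (illustrative only).
        CPT_TO_ACGME = {
            "00560": "cardiac",
            "00840": "intra-abdominal",
            "01967": "obstetric",
        }

        def categorize(records):
            """records: iterable of dicts with 'case_id' and 'cpt' keys
            taken from an AIMS export."""
            return [(r["case_id"], CPT_TO_ACGME.get(r["cpt"], "uncategorized"))
                    for r in records]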

  20. 78 FR 49506 - E.ON Global Commodities North America LLC; Supplemental Notice That Initial Market-Based Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  1. 78 FR 63977 - Enable Bakken Crude Services, LLC; Notice of Request For Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... person desiring to intervene or to protest in this proceedings must file in accordance with Rules 211 and... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  2. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  3. Addressing fluorogenic real-time qPCR inhibition using the novel custom Excel file system 'FocusField2-6GallupqPCRSet-upTool-001' to attain consistently high fidelity qPCR reactions

    PubMed Central

    Ackermann, Mark R.

    2006-01-01

    The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a Pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time – calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699

  4. The Feasibility of Using Cluster Analysis to Examine Log Data from Educational Video Games. CRESST Report 790

    ERIC Educational Resources Information Center

    Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.

    2011-01-01

    Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…
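
    As a generic illustration of the clustering step (the paper's actual feature set and algorithm are not specified in this excerpt), per-player features extracted from log files can be fed to an off-the-shelf algorithm such as k-means:

        import numpy as np
        from sklearn.cluster import KMeans

        # Hypothetical per-player features derived from log files:
        # [attempts, hints_used, mean_seconds_per_level]
        features = np.array([
            [3, 0, 42.0],
            [9, 4, 120.5],
            [2, 1, 38.2],
            [8, 5, 110.0],
        ])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)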

  5. TraceContract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events and can, for example, be generated by a running program instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics, including Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a novel combination of data-parameterized state machines and temporal logic. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
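
    The flavor of such trace monitoring, independent of TraceContract's actual Scala DSL, can be conveyed by a tiny hand-rolled monitor in Python; the event shapes are invented for illustration:

        def check_trace(events):
            """Flag a 'close' event with no preceding 'open' for the same resource."""
            open_resources, violations = set(), []
            for kind, resource in events:
                if kind == "open":
                    open_resources.add(resource)
                elif kind == "close":
                    if resource in open_resources:
                        open_resources.discard(resource)
                    else:
                        violations.append(("close-without-open", resource))
            return violations

        print(check_trace([("open", "f1"), ("close", "f1"), ("close", "f2")]))
        # [('close-without-open', 'f2')]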

  6. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  7. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  8. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  9. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  10. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  11. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  12. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  13. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  14. 40 CFR 60.288a - Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test to generate a submission package file, which documents performance test data. You must then submit the file generated by the ERT through the EPA's Compliance and Emissions Data Reporting Interface (CEDRI), which can be accessed by logging in to the EPA's Central Data Exchange (CDX) (https://cdx.epa...

  15. Developing a Complete and Effective ACT-R Architecture

    DTIC Science & Technology

    2008-01-01

    of computational primitives, as contrasted with the predominant "one-off" and "grab-bag" cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an

  16. Mechanical reduction of the intracanal Enterococcus faecalis population by Hyflex CM, K3XF, ProTaper Next, and two manual instrument systems: an in vitro comparative study.

    PubMed

    Tewari, Rajendra K; Ali, Sajid; Mishra, Surendra K; Kumar, Ashok; Andrabi, Syed Mukhtar-Un-Nisar; Zoya, Asma; Alam, Sharique

    2016-05-01

    In the present study, the effectiveness of three rotary and two manual nickel-titanium instrument systems in the mechanical reduction of the intracanal Enterococcus faecalis population was evaluated. Mandibular premolars with straight roots were selected. Teeth were decoronated, instrumented up to a size 20 K-file, and irrigated with physiological saline. After sterilization with ethylene oxide gas, root canals were inoculated with Enterococcus faecalis. The specimens were randomly divided into five groups for canal instrumentation: manual Nitiflex and Hero Shaper nickel-titanium files, and rotary Hyflex CM, ProTaper Next, and K3XF nickel-titanium files. Intracanal bacterial sampling was done before and after instrumentation. After serial dilution, samples were plated onto Mitis Salivarius agar. The c.f.u. grown were counted, and the log10 transformation was calculated. All instrumentation systems significantly reduced the intracanal bacterial population after root canal preparation. ProTaper Next was found to be significantly more effective than Hyflex CM and the manual Nitiflex and Hero Shaper files, but showed no significant difference from K3XF. Canal instrumentation with all the file systems significantly reduced intracanal Enterococcus faecalis counts; among the rotary and hand instruments tested, ProTaper Next was the most effective in reducing the number of bacteria. © 2014 Wiley Publishing Asia Pty Ltd.

  17. Fast skin dose estimation system for interventional radiology

    PubMed Central

    Takata, Takeshi; Kotoku, Jun’ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-01-01

    To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient’s computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7–7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods. PMID:29136194

  18. Fast skin dose estimation system for interventional radiology.

    PubMed

    Takata, Takeshi; Kotoku, Jun'ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru

    2018-03-01

    To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient's computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7-7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods.
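
    The calibration described above is a first-degree polynomial fit of measured against simulated dose. A minimal numpy sketch (the sample dose values are invented, not the authors' measurements):

        import numpy as np

        simulated = np.array([0.10, 0.25, 0.50, 1.00])  # Gy, Monte Carlo output
        measured  = np.array([0.12, 0.29, 0.55, 1.08])  # Gy, RPL glass dosimeters

        slope, intercept = np.polyfit(simulated, measured, 1)

        def calibrate(sim_dose):
            """Map a simulated dose to the calibrated absorbed-dose estimate."""
            return slope * sim_dose + intercept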

  19. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  20. Archive of digital chirp subbottom profile data collected during USGS Cruise 13GFP01, Brownlee Dam and Hells Canyon Reservoir, Idaho and Oregon, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.

    2014-01-01

    From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.

  1. Archive of digital chirp subbottom profile data collected during USGS cruise 11BIM01 Offshore of the Chandeleur Islands, Louisiana, June 2011

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Miselis, Jennifer L.; Flocks, James G.; Wiese, Dana S.

    2013-01-01

    From June 3 to 13, 2011, the U.S. Geological Survey conducted a geophysical survey to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, LA. This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided.

  2. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  3. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  4. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  5. 29 CFR 1960.28 - Employee reports of unsafe or unhealthful working conditions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... report of an existing or potential unsafe or unhealthful working condition should be recorded on a log maintained at the establishment. If an agency finds it inappropriate to maintain a log of written reports at... sequentially numbered case file, coded for identification, should be assigned for purposes of maintaining an...

  6. 20 CFR 658.422 - Handling of non-JS-related complaints by the Regional Administrator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... non-JS-related complaints alleging violations of employment related laws shall be logged. The... which the complainant (or complaint) was referred on a complaint log, similar to the one described in § 658.410(c)(1). The appropriate regional official shall also prepare and keep the file specified in...

  7. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  8. Analysis of the access patterns at GSFC distributed active archive center

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore; Bedet, Jean-Jacques

    1996-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion and distribution. In addition, several log files which record transactions on Unitree are maintained and periodically examined. This study identifies some of the users' requests and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) system Unitree. The results show that most of the data volume ordered was for two data products. The volume was also mostly made up of level 3 and 4 data, and most of the volume was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America, although there was significant world-wide use. There was a wide range of request sizes in terms of volume and number of files ordered. On average, 78.6 files were ordered per request. Using the data managed by Unitree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2 bin was found to be the best for this workload, but the STbin algorithm also worked well.
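
    The caching evaluation described above replays the file-request trace against candidate algorithms and compares hit rates. As a simplified stand-in for the LRU/2 bin and STbin algorithms the study tested, the sketch below replays a hypothetical trace through a plain LRU cache.

      # Simplified stand-in for the cache evaluation described above: replay
      # a file-request trace through a plain LRU cache (not the LRU/2 bin or
      # STbin variants the study tested) and report the hit rate.
      from collections import OrderedDict

      def lru_hit_rate(trace, capacity):
          cache = OrderedDict()          # file name -> None, ordered by recency
          hits = 0
          for f in trace:
              if f in cache:
                  hits += 1
                  cache.move_to_end(f)   # mark as most recently used
              else:
                  cache[f] = None
                  if len(cache) > capacity:
                      cache.popitem(last=False)  # evict least recently used
          return hits / len(trace)

      trace = ["a", "b", "a", "c", "a", "b", "d", "a"]  # hypothetical request log
      print(lru_hit_rate(trace, capacity=2))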

  9. Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor

    NASA Astrophysics Data System (ADS)

    Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael

    2006-11-01

    The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low cost system which collects summary implant statistics at the conclusion of each implant run. The data is collected by either downloading implant data log files from the implant tool workstation, or by exporting summary implant statistics through the tool's automation interface. Compared to the traditional FDC systems which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data is collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which are not detected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FD system fall into two categories: PM trigger rules which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system does have tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.
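
    A PM trigger rule of the kind described above can be sketched as a rolling-mean threshold on a per-run summary statistic. The signal, window, and limit below are hypothetical and are not Freescale's actual rules.

      # Minimal sketch of a PM-trigger-style rule: watch a summary statistic
      # from each implant run and flag the tool when its rolling mean drifts
      # past a limit. Names and limits are hypothetical.
      from collections import deque

      def make_trigger(limit, window):
          recent = deque(maxlen=window)
          def check(value):
              recent.append(value)
              return sum(recent) / len(recent) > limit  # True -> schedule PM
          return check

      flood_gun_rule = make_trigger(limit=4.2, window=5)
      for run_value in [3.9, 4.0, 4.1, 4.3, 4.5, 4.6]:
          if flood_gun_rule(run_value):
              print("PM trigger fired at run value", run_value)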

  10. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data drop-outs or system failures, and much, much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies now make it appropriate to develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: an industry standard for log messages (the Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages, with a format based on work at NASA GSFC); open system architectures (the DoD, NASA, and others are moving towards common open system architectures for mission ground data systems, based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors); and text analytics (a specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources). This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format, together with display and analytics tools that provide in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.

  11. The Wettzell System Monitoring Concept and First Realizations

    NASA Technical Reports Server (NTRS)

    Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher

    2010-01-01

    Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software, SysMon, which is based on a reliable, remotely controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external, serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital-conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell to address safety issues and at the VLBI station O'Higgins as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.

  12. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    DTIC Science & Technology

    2001-11-01

    that there were no target misses. The Hellfire missile does not have a depleted uranium head... 2.2.2.3 Tank movement During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log

  13. Evaluation of an interactive case simulation system in dermatology and venereology for medical students

    PubMed Central

    Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona

    2006-01-01

    Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972

  14. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system of system simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes (project/directories) BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.

  15. 25 CFR 215.23 - Cooperation between superintendent and district mining supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... notices, reports, drill logs, maps, and records, and all other information relating to mining operations required by said regulations to be submitted by lessees, and shall maintain a file thereof for the superintendent. (b) The files of the Geological Survey supervisor relating to lead and zinc leases of Quapaw...

  16. Agentless Cloud-Wide Monitoring of Virtual Disk State

    DTIC Science & Technology

    2015-10-01

    packages include Apache, MySQL, PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL, PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many

  17. Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).

    DTIC Science & Technology

    1985-01-01

    QUEUE_BASE.LAST_KEY(QUEUE_NAME). LAST_RELATION(QUEUE_NAME). FILE_NODE. FORM. ATTRIBUTES. ACCESS_CONTROL. LEVEL); CLOSE(QUEUE_BASE); CLOSE(FILE_NODE...PROPOSED MIL-STD-CAIS 31 JANUARY 1985 logs procedure ITERATE (ITERATOR : out NODE_ITERATOR; NAME : NAME_STRING; KIND : NODE_KIND; KEY : RELATIONSHIP_KEY; PATTERN : R

  18. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  19. 49 CFR Appendix A to Part 225 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... $1,000 $2,000 225.11 Reports of accidents/incidents 2,500 5,000 225.12(a): Failure to file Railroad... noncompliance: (1) a missing or incomplete log entry for a particular employee's injury or illness; or (2) a missing or incomplete log record for a particular rail equipment accident or incident. Each day a...

  20. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  1. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  2. Consistency of Students' Pace in Online Learning

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2009-01-01

    The purpose of this study is to investigate the consistency of students' behavior regarding their pace of actions over sessions within an online course. Pace in a session is defined as the number of logged actions divided by session length (in minutes). Log files of 6,112 students were collected, and datasets were constructed for examining pace…
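
    The pace measure defined above (logged actions divided by session length in minutes) can be computed directly from a session's action timestamps; a minimal sketch with hypothetical data follows.

      # Pace as defined above: number of logged actions divided by session
      # length in minutes. Timestamps are hypothetical seconds-offsets of a
      # student's actions within one session.
      def pace(action_timestamps):
          length_min = (action_timestamps[-1] - action_timestamps[0]) / 60.0
          return float("nan") if length_min == 0 else len(action_timestamps) / length_min

      session = [0, 40, 95, 180, 300, 540]  # six actions over nine minutes
      print(round(pace(session), 2))        # ~0.67 actions per minute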

  3. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  4. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  5. Archive of Side Scan Sonar and Swath Bathymetry Data collected during USGS Cruise 10CCT02 Offshore of Petit Bois Island Including Petit Bois Pass, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.

    2011-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  6. A Kinect-based system for automatic recording of some pigeon behaviors.

    PubMed

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M

    2015-12-01

    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.
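
    The sensitivity and precision reported above follow the standard definitions over true-positive, false-positive, and false-negative detection counts; the worked example below uses made-up counts chosen to give figures of the same magnitude.

      # Worked example of the reported metrics with hypothetical counts:
      # sensitivity = TP / (TP + FN), precision = TP / (TP + FP).
      tp, fp, fn = 950, 94, 50   # made-up pecking-detection counts

      sensitivity = tp / (tp + fn)
      precision = tp / (tp + fp)
      print(f"sensitivity = {sensitivity:.0%}, precision = {precision:.0%}")
      # -> sensitivity = 95%, precision = 91%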

  7. Well construction information, lithologic logs, water level data, and overview of research in Handcart Gulch, Colorado: an alpine watershed affected by metalliferous hydrothermal alteration

    USGS Publications Warehouse

    Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin

    2006-01-01

    Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
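
    The simulation loop described above follows the classic discrete-event pattern: pop the earliest event from a time-ordered queue, execute it (which may schedule further events), and stop when the queue is empty. The toy sketch below shows that generic schema, not the patented tool's design.

      # Toy discrete-event loop: execute events in time order until the
      # queue empties; each action may schedule follow-on events.
      import heapq

      def simulate(initial_events):
          queue = list(initial_events)       # (time, name, action) tuples
          heapq.heapify(queue)
          while queue:
              time, name, action = heapq.heappop(queue)
              print(f"t={time}: {name}")
              for event in action(time):     # actions may schedule more events
                  heapq.heappush(queue, event)

      def pump_on(t):
          return [(t + 5.0, "pump_off", lambda t2: [])]  # delayed effect

      simulate([(0.0, "pump_on", pump_on)])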

  9. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the

  10. Archive of digital chirp subbottom profile data collected during USGS Cruise 13CCT04 offshore of Petit Bois Island, Mississippi, August 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Kindinger, Jack G.; Bernier, Julie C.; Kelso, Kyle W.; Wiese, Dana S.

    2015-01-01

    From August 13-23, 2013, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE) conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport offshore of Petit Bois Island, Mississippi. This investigation is part of a broader USGS study on Coastal Change and Transport (CCT). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP) with partial funding provided by the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained-showing a relative increase in signal amplitude-digital images of the seismic profiles are provided.

  11. Archive of Digital Chirp Sub-bottom Profile Data Collected During USGS Cruises 08CCT02 and 08CCT03, Mississippi Gulf Islands, July and September 2008

    USGS Publications Warehouse

    Barry, K.M.; Cavers, D.A.; Kneale, C.W.

    2011-01-01

    In July and September of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, MS, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. This project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the sub-bottom profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  12. Examining the Return on Investment of a Security Information and Event Management Solution in a Notional Department of Defense Network Environment

    DTIC Science & Technology

    2013-06-01

    collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms

  13. 32 CFR 776.80 - Initial screening and Rules Counsel.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Director, JA Division, HQMC, to JAR. (b) JAG(13) and JAR shall log all complaints received and will ensure... within 30 days of the date of its return, the Rules Counsel may close the file without further action... action to close the file. (2) Complaints that comply with the requirements shall be further reviewed by...

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijesooriya, K; Seitter, K; Desai, V

    Purpose: To present our single-institution experience of catching errors with trajectory log file analysis. The reported causes of failures, probability of occurrences (O), severity of effects (S), and the probability of the failures to be undetected (D) could be added to guidelines of FMEA analysis. Methods: From March 2013 to March 2014, 19569 patient treatment fields/arcs were analyzed. This work includes checking all 131 treatment delivery parameters for all patients, all treatment sites and all treatment delivery fractions. TrueBeam trajectory log files for all treatment field types as well as all imaging types were accessed, read in every 20 ms, and every control point (a total of 37 million parameters) compared to the physician-approved plan in the planning system. Results: Couch angle outlier occurrence: N = 327, range = −1.7 – 1.2 deg; gantry angle outlier occurrence: N = 59, range = 0.09 – 5.61 deg; collimator angle outlier occurrence: N = 13, range = −0.2 – 0.2 deg. VMAT cases have slightly larger variations in mechanical parameters. MLC: 3D single control point fields have a maximum deviation of 0.04 mm, 39 step-and-shoot IMRT cases have MLC −0.3 – 0.5 mm deviations, and all (1286) VMAT cases have −0.9 – 0.7 mm deviations. Two possible serious errors were found: 1) a 4 cm isocenter shift for the PA beam of an AP-PA pair, under-dosing a portion of the PTV by 25%; 2) delivery with MLC leaves abutted behind the jaws as opposed to the midline as planned, leading to an under-dosing of a small volume of the PTV by 25%, by just the boost plan. Due to their error origin, neither of these errors could have been detected by pre-treatment verification. Conclusion: Performing trajectory log file analysis could catch typically undetected errors to avoid potentially adverse incidents.
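
    The comparison described above can be sketched as a loop over logged samples that flags any delivery parameter deviating from its planned value by more than a tolerance. The record format, parameter names, and tolerances below are assumptions for illustration, not the authors' implementation.

      # Minimal sketch of a log-vs-plan check; the study compared 131
      # delivery parameters sampled every 20 ms against the approved plan.
      TOLERANCES = {"gantry_deg": 1.0, "collimator_deg": 0.5, "couch_deg": 1.0, "mlc_mm": 0.5}

      def find_outliers(samples, planned):
          outliers = []
          for t, actual in enumerate(samples):
              for name, tol in TOLERANCES.items():
                  dev = actual[name] - planned[name]
                  if abs(dev) > tol:
                      outliers.append((t, name, dev))
          return outliers

      planned = {"gantry_deg": 180.0, "collimator_deg": 30.0, "couch_deg": 0.0, "mlc_mm": 12.0}
      samples = [{"gantry_deg": 180.1, "collimator_deg": 30.0, "couch_deg": -1.7, "mlc_mm": 12.2}]
      print(find_outliers(samples, planned))  # the -1.7 deg couch deviation is flagged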

  15. SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Shimizu, E; Matsunaga, K

    2014-06-01

    Purpose: A successful VMAT plan delivery includes precise modulation of dose rate, gantry rotation and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze because they vary with MU delivered and leaf number. In this study, we calculated integrated fluence error images (IFEI) from log-files and evaluated plan quality in the areas scanned by all and by individual MLC leaves. Methods: The log-file reported the expected and actual positions for the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on Elekta Synergy. These data were imported to in-house software developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and the dose fraction at each time. The IFEI was obtained by adding all of the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) of the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5 and 3 MU were 98.5, 86.7 and 68.1%, and 95% of the area was covered with an error of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms values were 2.1 – 3.0 and 3.1 – 4.0 MU, the areas with errors below 10, 5 and 3 MU were 97.6 – 99.5, 81.7 – 89.5 and 51.2 – 72.8%, and 95% of the area was covered with an error of less than 6.6 – 8.2 MU. Conclusion: The analysis of the IFEI reconstituted from log-files provided detailed information about the delivery in the areas scanned by all and by individual MLC leaves.
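
    The IFEI described above is the pixel-wise sum, over all time samples, of the absolute difference between expected and actual fluence images. The sketch below reproduces that computation and the kind of coverage statistics quoted, using random placeholder arrays in place of log-file-derived fluence maps.

      # IFEI sketch: sum over time of |expected - actual| fluence, followed
      # by area-coverage statistics. Arrays are random placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      n_steps, shape = 200, (40, 40)
      expected = rng.random((n_steps, *shape))
      actual = expected + rng.normal(scale=0.01, size=(n_steps, *shape))

      ifei = np.abs(expected - actual).sum(axis=0)  # per-pixel error (MU)

      print("average error (MU):", ifei.mean())
      print("rms error (MU):", np.sqrt((ifei ** 2).mean()))
      for threshold in (3, 5, 10):
          print(f"area below {threshold} MU: {(ifei < threshold).mean():.1%}")
      print("95% of area covered below (MU):", np.percentile(ifei, 95))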

  16. 9 CFR 381.204 - Marking of poultry products offered for entry; official import inspection marks and devices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Import Inspection Division, is on file at the import inspection facility where the inspection is to be... stamping log containing the following information for each lot of product: the date of inspection, the... container marks, and the MP-410 number covering the product to be inspected. The daily stamping log must be...

  17. Request queues for interactive clients in a shared file system of a parallel computing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin

    Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.

  18. Analysis of the factors influencing healthcare professionals' adoption of mobile electronic medical record (EMR) using the unified theory of acceptance and use of technology (UTAUT) in a tertiary hospital.

    PubMed

    Kim, Seok; Lee, Kee-Hyuck; Hwang, Hee; Yoo, Sooyoung

    2016-01-30

    Although the factors that affect the end-user's intention to use a new system and technology have been researched, the previous studies have been theoretical and did not verify the factors that affected the adoption of a new system. Thus, this study aimed to confirm the factors that influence users' intentions to utilize a mobile electronic medical record (EMR) system using both a questionnaire survey and a log file analysis that represented the real use of the system. After observing the operation of a mobile EMR system in a tertiary university hospital for seven months, we performed an offline survey regarding the user acceptance of the system based on the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM). We surveyed 942 healthcare professionals over two weeks and performed a structural equation modeling (SEM) analysis to identify the intention to use the system among the participants. Next, we compared the results of the SEM analysis with the results of the analyses of the actual log files for two years to identify further insights into the factors that affected the intention of use. For these analyses, we used SAS 9.0 and AMOS 21. Of the 942 surveyed end-users, 48.3 % (23.2 % doctors and 68.3 % nurses) responded. After eliminating six subjects who completed the survey insincerely, we conducted the SEM analyses on the data from 449 subjects (65 doctors and 385 nurses). The newly suggested model satisfied the standards of model fitness, and the intention to use the system was especially high due to the influences of Performance Expectancy and Attitude. Based on the actual usage log analyses, both the doctors and nurses used the menus to view the inpatient lists, alerts, and patients' clinical data with high frequency. Specifically, the doctors frequently retrieved laboratory results, and the nurses frequently retrieved nursing notes and used the menu to assume the responsibilities of nursing work. In this study, the end-users' intentions to use the mobile EMR system were particularly influenced by Performance Expectancy and Attitude. In reality, the usage log revealed high-frequency use of the functions to improve the continuity of care and work efficiency. These results indicate the influence of the factor of performance expectancy on the intention to use the mobile EMR system. Consequently, we suggest that when determining the implementation of mobile EMR systems, the functions that are related to workflow and able to increase performance should be considered first.

  19. Ground-water data for the Hanna and Carbon basins, south-central Wyoming, through 1980

    USGS Publications Warehouse

    Daddow, P.B.

    1986-01-01

    Groundwater resources in the Hanna and Carbon Basins of Wyoming were assessed in a study from 1974 through 1980 because of the development of coal mining in the area. Data collected from 105 wells during that study, including well-completion records, lithologic logs, and water levels, are presented. The data are from stock wells and coal-test holes completed as observation wells by the U.S. Geological Survey. The data are mostly from the mined coal-bearing formations: the Tertiary Hanna Formation and the Tertiary and Cretaceous Ferris Formation. Well-completion data and lithologic logs were collected on-site during drilling of the wells or from U.S. Geological Survey files, company records, Wyoming State Engineer well-permit files, and published reports. (USGS)

  20. VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)

    NASA Astrophysics Data System (ADS)

    Association of Universities For Research in Astronomy

    2018-01-01

    This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).

  1. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the needed data to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the healthcare enterprise-Audit Trail and Node Authentication) as log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
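
    The extraction function described above can be approximated by loading audit-log triples into an ontology graph and querying it with SPARQL. In the sketch below the vocabulary (ex:Event, ex:user, ex:action) is invented for illustration and is not the paper's IHE-ATNA ontology.

      # Hedged sketch: load log triples into an RDF graph and filter events
      # with SPARQL. The vocabulary is invented, not IHE-ATNA.
      from rdflib import Graph

      g = Graph()
      g.parse(data="""
      @prefix ex: <http://example.org/log#> .
      ex:e1 a ex:Event ; ex:user "alice" ; ex:action "read" .
      ex:e2 a ex:Event ; ex:user "bob"   ; ex:action "delete" .
      """, format="turtle")

      query = """
      PREFIX ex: <http://example.org/log#>
      SELECT ?event ?user WHERE {
          ?event a ex:Event ; ex:user ?user ; ex:action "delete" .
      }"""
      for row in g.query(query):
          print(row.event, row.user)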

  2. Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs

    PubMed Central

    Chen, You; Malin, Bradley

    2014-01-01

    Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly-available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
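
    As a simplified stand-in for the community-based scoring idea (not the CADS algorithm itself), one can represent each user by the set of records they accessed and score users by their dissimilarity to their nearest neighbors, so that users with low affinity to any peer group receive high scores.

      # Simplified stand-in for community-based anomaly scoring: cosine
      # similarity between users' access vectors; a user far from even
      # their k most similar peers scores as anomalous.
      import numpy as np

      def anomaly_scores(access, k=2):
          norms = np.linalg.norm(access, axis=1, keepdims=True)
          sim = (access @ access.T) / (norms @ norms.T + 1e-12)
          np.fill_diagonal(sim, -np.inf)        # ignore self-similarity
          top_k = np.sort(sim, axis=1)[:, -k:]  # k most similar peers
          return 1.0 - top_k.mean(axis=1)       # low peer affinity -> high score

      access = np.array([[1, 1, 0, 0],   # users 0-2 form a community
                         [1, 1, 1, 0],
                         [1, 1, 0, 0],
                         [0, 0, 0, 1]], dtype=float)  # user 3 is isolated
      print(anomaly_scores(access).round(2))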

  3. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  4. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  5. Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  6. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  7. The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching

    ERIC Educational Resources Information Center

    Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix

    2007-01-01

    The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…

  8. Family Child Care Inventory-Keeper: The Complete Log for Depreciating and Insuring Your Property. Redleaf Business Series.

    ERIC Educational Resources Information Center

    Copeland, Tom

    Figuring depreciation can be the most difficult aspect of filing tax returns for a family child care program. This inventory log for family child care programs is designed to assist in keeping track of the furniture, appliances, and other property used in the child care business; once these items have been identified, they can be deducted as…

  9. Wister, CA Downhole and Seismic Data

    DOE Data Explorer

    Akerley, John

    2010-12-18

    This submission contains Downhole geophysical logs associated with Wister, CA Wells 12-27 and 85-20. The logs include Spontaneous Potential (SP), HILT Caliper (HCAL), Gamma Ray (GR), Array Induction (AIT), and Neutron Porosity (NPOR) data. Also included are a well log, Injection Test, Pressure Temperature Spinner log, shut in temperature survey, a final well schematic, and files about the well's location and drilling history. This submission also contains data from a three-dimensional (3D) multi-component (3C) seismic reflection survey on the Wister Geothermal prospect area in the northern portion of the Imperial Valley, California. The Wister seismic survey area was 13.2 square miles. (Resistivity image logs (Schlumberger FMI) in 85-20 indicate that maximum horizontal stress (Shmax) is oriented NNE but that open fractures are oriented suboptimally).

  10. PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas

    NASA Astrophysics Data System (ADS)

    Hauger, J. S.; Borton, D. N.

    1982-07-01

    A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.

  11. PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas

    NASA Technical Reports Server (NTRS)

    Hauger, J. S.; Borton, D. N.

    1982-01-01

    A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.

  12. Mobile terrestrial light detection and ranging (T-LiDAR) survey of areas on Dauphin Island, Alabama, in the aftermath of Hurricane Isaac, 2012

    USGS Publications Warehouse

    Kimbrow, Dustin R.

    2014-01-01

    Topographic survey data of areas on Dauphin Island on the Alabama coast were collected using a truck-mounted mobile terrestrial light detection and ranging system. This system is composed of a high frequency laser scanner in conjunction with an inertial measurement unit and a position and orientation computer to produce highly accurate topographic datasets. A global positioning system base station was set up on a nearby benchmark and logged vertical and horizontal position information during the survey for post-processing. Survey control points were also collected throughout the study area to determine residual errors. Data were collected 5 days after Hurricane Isaac made landfall in early September 2012 to document sediment deposits prior to clean-up efforts. Three data files in ASCII text format with the extension .xyz are included in this report, and each file is named according to both the acquisition date and the relative geographic location on Dauphin Island (for example, 20120903_Central.xyz). Metadata are also included for each of the files in both Extensible Markup Language with the extension .xml and ASCII text formats. These topographic data can be used to analyze the effects of storm surge on barrier island environments and also serve as a baseline dataset for future change detection analyses.

  13. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements through suggested query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One of the goals of this report is to predict the number of results a query will return, since such a model allows search engines to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88% and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and the data on reformulations made by users in the past can aid the development of better search systems, particularly to improve results for novice users. This paper therefore gives important ideas for better understanding how people search and for using this knowledge to improve the performance of specialized medical search engines.
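
    As a rough illustration of the result-count prediction described above, the sketch below trains a toy classifier to flag queries likely to return empty result sets. The training pairs, features, and threshold are illustrative only and are not those of the GoldMiner study.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Hypothetical training pairs parsed from a search log:
      # (query text, 1 if the query returned zero results).
      queries = ["chest xray pneumonia", "asdfgh", "mri brain tumor t2", "zzzz qqqq"]
      empty = [0, 1, 0, 1]

      # Character n-grams catch misspellings and non-medical strings
      # better than whole-word features for short queries.
      model = make_pipeline(
          TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
          LogisticRegression(),
      )
      model.fit(queries, empty)

      def likely_empty(query, threshold=0.5):
          """Flag queries that will probably return an empty result set."""
          return model.predict_proba([query])[0][1] >= threshold

      print(likely_empty("mri brain tumour"))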

  14. VizieR Online Data Catalog: Astron low resolution UV spectra (Boyarchuk+, 1994)

    NASA Astrophysics Data System (ADS)

    Boyarchuk, A. A.

    2017-05-01

    Astron was a Soviet spacecraft launched on 23 March 1983; it was operational for eight years and was the largest ultraviolet space telescope of its time. Astron's payload consisted of an 80 cm ultraviolet telescope, Spica, and an X-ray spectroscope. We present 159 low resolution spectra of stars obtained during the Astron space mission (Tables 4 and 5; hereafter, table numbers from Boyarchuk et al. 1994 are given). Table 4 (observational log, logs.dat) contains data on 142 sessions for 90 stars (sorted in ascending order of RA), for which the SED was obtained by the scanning method, followed by data on 17 sessions for 15 stars (also sorted in ascending order of RA), for which multicolor photometry was done. Kilpio et al. (2016, Baltic Astronomy 25, 23) presented results of a comparison of Astron data with modern UV stellar data, discussed Astron's precision and accuracy, and drew some conclusions on potential application areas for these data. Also presented are 34 sessions of observations of 27 stellar systems (galaxies and globular clusters); the observational log was published in Table 10 and the data in Table 11. Likewise, 16 sessions of observations of 12 nebulae are presented (Table 12 for the observational log and Table 13 for the data). Background radiation intensity data (observational log in Table 14) are presented in Table 15. Finally, data on comets are presented in several forms. Note that observational data for stars, stellar systems, and nebulae are expressed in log [erg/s/cm^2/A], while comet data use units of 10^-13 erg/s/cm^2/A, hydroxyl band photometric data for comets are expressed in log [erg/s/cm^2], and the background data are radiation intensities expressed in log [erg/s/cm^2/A/sr]. A scanned PDF version of the Boyarchuk et al. (1994) book is available at http://www.inasan.ru/~astron/astron.pdf (12 data files).

  15. Sediment data collected in 2010 from Cat Island, Mississippi

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Miselis, Jennifer L.; Kindinger, Jack G.

    2014-01-01

    Scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center, in collaboration with the U.S. Army Corps of Engineers, conducted geophysical and sedimentological surveys in 2010 around Cat Island, Mississippi, which is the westernmost island in the Mississippi-Alabama barrier island chain. The objective of the study was to understand the geologic evolution of Cat Island relative to other barrier islands in the northern Gulf of Mexico by identifying relationships between the geologic history, present day morphology, and sediment distribution. This data series serves as an archive of terrestrial and marine sediment vibracores collected August 4-6 and October 20-22, 2010, respectively. Geographic information system data products include marine and terrestrial core locations and 2007 shoreline data. Additional files include marine and terrestrial core description logs, core photos, results of sediment grain-size analyses, optically stimulated luminescence dating and carbon-14 dating locations and results, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  16. Design and implementation of wireless dose logger network for radiological emergency decision support system.

    PubMed

    Gopalakrishnan, V; Baskaran, R; Venkatraman, B

    2016-08-01

    A decision support system (DSS) has been implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used to estimate the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network was designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate, and the details are presented in this paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabVIEW-based program was developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file shared with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.
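
    As a rough Python stand-in for the LabVIEW receiver described above (the actual program is not published here), the sketch below reads lines from an XBee coordinator attached as a serial port and appends timestamped dose rates to a CSV file for the DSS. The port name and the "<station>,<dose>" frame layout are assumptions, and pyserial is assumed to be installed.

      import csv
      import time
      import serial  # pyserial; the XBee coordinator appears as a serial port

      PORT, BAUD = "/dev/ttyUSB0", 9600  # illustrative values

      with serial.Serial(PORT, BAUD, timeout=5) as xbee, \
              open("dose_log.csv", "a", newline="") as f:
          writer = csv.writer(f)
          while True:
              line = xbee.readline().decode("ascii", errors="replace").strip()
              if "," not in line:
                  continue  # timeout or malformed frame
              station, dose = line.split(",", 1)
              # Timestamped record shared with the DSS software via the file.
              writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), station, dose])
              f.flush()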

  17. Design and implementation of wireless dose logger network for radiological emergency decision support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.

    A decision support system (DSS) has been implemented in the Radiological Safety Division, Indira Gandhi Centre for Atomic Research, to provide guidance for emergency decision making in case of an inadvertent nuclear accident. Real-time gamma dose rate measurements around the stack are used to estimate the radioactive release rate (source term) by inverse calculation. A wireless gamma dose logging network was designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate, and the details are presented in this paper. The network uses XBee-Pro wireless modules and a PSoC controller for wireless interfacing, and the data are logged at the base station. A LabVIEW-based program was developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file shared with the DSS software. The DSS at the base station evaluates the real-time source term to assess radiation impact.

  18. Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in compressed .zip format, and a data inventory table (Excel spreadsheet) in the root folder serves as a guide to the data in the subfolders.

  19. Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in compressed .zip format, and a data inventory table (Excel spreadsheet) in the root folder serves as a guide to the data in the subfolders.

  20. Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in compressed .zip format, and a data inventory table (Excel spreadsheet) in the root folder serves as a guide to the data in the subfolders.

  1. Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe Moore

    This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in compressed .zip format, and a data inventory table (Excel spreadsheet) in the root folder serves as a guide to the data in the subfolders.

  2. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components; this makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. I also assisted hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
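
    A minimal sketch of the kind of post-upgrade error scan described above; the file pattern and error markers below are hypothetical stand-ins for whatever the upgrade process actually emits.

      import glob
      import re

      # Hypothetical error markers; substitute the strings the upgrade
      # process really writes into the MDL files.
      ERROR_PAT = re.compile(r"\b(ERROR|FAILED|Exception)\b")

      def scan_mdl_files(pattern="models/*.mdl"):
          """Report lines in upgraded MDL files that look like upgrade errors."""
          for path in glob.glob(pattern):
              with open(path, errors="replace") as f:
                  for lineno, line in enumerate(f, 1):
                      if ERROR_PAT.search(line):
                          print(f"{path}:{lineno}: {line.strip()}")

      scan_mdl_files()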

  3. Geohydrologic and water-quality characterization of a fractured-bedrock test hole in an area of Marcellus shale gas development, Bradford County, Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.

    2013-01-01

    Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.

  4. Archive of single-beam bathymetry data collected during USGS cruise 07CCT01 nearshore of Fort Massachusetts and within Camille Cut, West and East Ship Islands, Gulf Islands National Seashore, Mississippi, July 2007

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Reynolds, B.J.; Hansen, Mark

    2012-01-01

    The Gulf Islands National Seashore (GUIS) is composed of a series of barrier islands along the Mississippi-Alabama coastline. Historically, these islands have undergone long-term shoreline change. The devastation of Hurricane Katrina in 2005 prompted questions about the stability of the barrier islands and their potential response to future storm impacts. Additionally, there was concern from the National Park Service (NPS) about the preservation of the historical Fort Massachusetts, located on West Ship Island. During the early 1900s, Ship Island was an individual island. In 1969 Hurricane Camille breached Ship Island, widening the cut and splitting it into what is now known as West Ship Island and East Ship Island. In July of 2007, the U.S. Geological Survey (USGS) was able to provide the NPS with a small bathymetric survey of Camille Cut using high-resolution single-beam bathymetry. This provided GUIS with a post-Katrina assessment of the bathymetry in Camille Cut and along the northern shoreline directly in front of Fort Massachusetts. Ultimately, this survey became an initial bathymetry dataset toward a larger USGS effort included in the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project (http://ngom.usgs.gov/gomsc/mscip/). This report serves as an archive of the processed single-beam bathymetry. Data products herein include gridded and interpolated digital depth surfaces and x,y,z data products. Additional files include trackline maps, navigation files, geographic information system (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for descriptions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 07CCT01 tells us the data were collected in 2007 for the Coastal Change and Transport (CCT) study and the data were collected during the first (01) field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay catamaran. The single-beam transducers were sled mounted on a rail attached between the catamaran hulls. Navigation was acquired using HYPACK, Inc., Hypack version 4.3a.7.1 and differentially corrected using land-based GPS stations. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed systematically using NovAtel's Waypoint GrafNav version 7.6, SANDS version 3.7, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page.

  5. Navigating Streams of Paper.

    ERIC Educational Resources Information Center

    Bennett-Abney, Cheryl

    2001-01-01

    Three organizational tools for counselors are described: three-ring binder for notes, forms, and schedules; daily log of time and activities; and a tickler file with tasks arranged by days of the week. (SK)

  6. Archive of digital Chirp subbottom profile data collected during USGS cruise 08CCT01, Mississippi Gulf Islands, July 2008

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Worley, Charles R.

    2011-01-01

    In July of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, Mississippi, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. Funding was provided through the Geologic Framework and Holocene Coastal Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php); this project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  7. Cyber Fundamental Exercises

    DTIC Science & Technology

    2013-03-01

    the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ... 2.6.2 /etc/passwd and /etc/shadow: The /etc/shadow file didn't exist on early Linux distributions. Originally only root could access the /etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under

  8. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Li, X; Li, H

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and, for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report by including all required information for MR-IGRT and physics weekly review - the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustable. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and expedite physician daily IGRT review and physicist weekly chart review.
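
    A minimal sketch of the fluence check described above, assuming per-segment aperture masks and beam-on times have already been parsed from the plan and the delivery log; shapes and values are illustrative.

      import numpy as np

      def composite_fluence(segments):
          """Sum per-segment primary fluence: a 0/1 aperture mask weighted
          by that segment's beam-on time."""
          return sum(t * aperture for aperture, t in segments)

      def fluence_error_stats(planned, delivered):
          """Error statistics on the fluence difference map."""
          diff = delivered - planned
          return {"max_abs": float(np.max(np.abs(diff))),
                  "rms": float(np.sqrt(np.mean(diff ** 2)))}

      # Illustrative data: two segments on a 5x5 grid, beam-on times in s.
      a1 = np.zeros((5, 5)); a1[1:4, 1:4] = 1
      a2 = np.zeros((5, 5)); a2[2:4, 2:5] = 1
      plan_fl = composite_fluence([(a1, 2.00), (a2, 1.50)])
      log_fl  = composite_fluence([(a1, 2.02), (a2, 1.48)])  # from delivery log
      print(fluence_error_stats(plan_fl, log_fl))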

  9. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results is explained. There are two approaches to capture provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a lot of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information, then attempt to find relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services or logged to a file.
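
    The sketch below illustrates the disclosed-provenance idea in general terms. It is not the actual PAPI interface, just a minimal recorder that stamps every disclosed record with a globally unique ID so records from distributed processes can be joined later; a REST sink (e.g., POSTing each record to a triple store) could replace the file sink.

      import json
      import uuid
      from datetime import datetime, timezone

      class ProvenanceRecorder:
          """Minimal disclosed-provenance sketch (hypothetical API, not PAPI):
          each record gets a globally unique ID and a UTC timestamp."""

          def __init__(self, sink="provenance.log"):
              self.sink = sink

          def _emit(self, record):
              record["id"] = str(uuid.uuid4())
              record["time"] = datetime.now(timezone.utc).isoformat()
              with open(self.sink, "a") as f:
                  f.write(json.dumps(record) + "\n")
              return record["id"]

          def activity(self, name, used, generated):
              """Disclose one processing step: inputs used, outputs generated."""
              return self._emit({"type": "activity", "name": name,
                                 "used": used, "generated": generated})

      prov = ProvenanceRecorder()
      prov.activity("filter_step", used=["raw.nc"], generated=["filtered.nc"])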

  10. Tools for Administration of a UNIX-Based Network

    NASA Technical Reports Server (NTRS)

    LeClaire, Stephen; Farrar, Edward

    2004-01-01

    Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.

  11. 47 CFR 22.359 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... + 10 log (P) dB. (b) Measurement procedure. Compliance with these rules is based on the use of... contract in their station files and disclose it to prospective assignees or transferees and, upon request...

  12. 7 CFR 274.5 - Record retention and forms security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... control logs, or similar controls from the point of initial receipt through the issuance and.... (2) For notices of change which initiate, update or terminate the master issuance file, the State...

  13. Network Basics.

    ERIC Educational Resources Information Center

    Tennant, Roy

    1992-01-01

    Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)

  14. VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.

    2014-02-01

    The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y]=[log(ε(X))-log(ε(Y))]star - [log(ε(X))-log(ε(Y))]⊙ with log ε(X)=12+log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H]=log(N(Li))star-log(N(Li))⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li, for which we adopt our derived values: log(ε(Li))⊙=1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li, for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).

  15. Development of a four-dimensional Monte Carlo dose calculation system for real-time tumor-tracking irradiation with a gimbaled X-ray head.

    PubMed

    Ishihara, Yoshitomo; Nakamura, Mitsuhiro; Miyabe, Yuki; Mukumoto, Nobutaka; Matsuo, Yukinori; Sawada, Akira; Kokubo, Masaki; Mizowaki, Takashi; Hiraoka, Masahiro

    2017-03-01

    To develop a four-dimensional (4D) dose calculation system for real-time tumor tracking (RTTT) irradiation by the Vero4DRT. First, a 6-MV photon beam delivered by the Vero4DRT was simulated using EGSnrc. The moving phantom position was directly measured by a laser displacement gauge. The pan and tilt angles, monitor units, and the indexing time indicating the phantom position were also extracted from a log file. Next, phase space data at any angle were created from both the log file and particle data under the dynamic multileaf collimator. Irradiation both with and without RTTT, with the phantom moving, was simulated using several treatment field sizes. Each was compared with the corresponding measurement using films. Finally, dose calculation for each computed tomography dataset of 10 respiratory phases with the X-ray head rotated was performed to simulate the RTTT irradiation (4D plan) for lung, liver, and pancreatic cancer patients. Dose-volume histograms of the 4D plan were compared with those calculated on the single reference respiratory phase without the gimbal rotation [three-dimensional (3D) plan]. Differences between the simulated and measured doses were less than 3% for RTTT irradiation in most areas, except the high-dose gradient. For clinical cases, the target coverage in 4D plans was almost identical to that of the 3D plans. However, the doses to organs at risk in the 4D plans varied at intermediate- and low-dose levels. Our proposed system has acceptable accuracy for RTTT irradiation in the Vero4DRT and is capable of simulating clinical RTTT plans. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  16. Archive of digital boomer seismic reflection data collected during USGS field activities 95LCA03 and 96LCA02 in the Peace River of West-Central Florida, 1995 and 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.

    2006-01-01

    In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  17. Pizza.py Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
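
    Typical interactive use of the log-processing and plotting tools looks something like the lines below; this is illustrative only, so consult the Pizza.py documentation for the exact tool interfaces and column names.

      # Inside the Pizza.py interpreter:
      l = log("log.lammps")               # parse thermo output from a LAMMPS log file
      step, temp = l.get("Step", "Temp")  # extract two named columns as vectors
      g = gnu()                           # wrapper tool that drives GnuPlot
      g.plot(step, temp)                  # plot temperature vs. timestep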

  18. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  19. SU-F-T-230: A Simple Method to Assess Accuracy of Dynamic Wave Arc Irradiation Using An Electronic Portal Imaging Device and Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirashima, H; Miyabe, Y; Yokota, K

    2016-06-15

    Purpose: The Dynamic Wave Arc (DWA) technique, where the multi-leaf collimator (MLC) and gantry/ring move simultaneously in a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position by EPID images, MLC positions with intentional errors were assessed. Tests were designed considering three factors: (1) accuracy of the MLC position; (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s), and ring speed (0.5–2.5°/s); and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of the machine accuracy, the MLC position and gantry/ring angle position were assessed using log files. Results: An intentional error of 0.1 mm can be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs under different conditions of dose rate, gantry/ring speed, and MLC speed showed good agreement, with a root mean square (RMS) error of 0.76%. The RMS errors between the detected and recorded data were 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs in variable conditions during DWA irradiation can be easily verified using EPID measurements and log file analysis. The proposed method is useful for routine verification. This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.
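
    The RMS comparison between log-recorded and independently measured values reduces to a few lines; the sketch below assumes the matched position series have already been extracted, and the numbers are illustrative.

      import numpy as np

      def rms_error(recorded, detected):
          """RMS difference between log-recorded and measured values, as used
          for the MLC position and gantry/ring angle checks."""
          recorded, detected = np.asarray(recorded), np.asarray(detected)
          return float(np.sqrt(np.mean((recorded - detected) ** 2)))

      # Illustrative leaf positions (mm): delivery log vs. EPID measurement.
      log_mlc  = [10.0, 12.5, 15.1, 20.4]
      epid_mlc = [10.1, 12.4, 15.2, 20.3]
      print(rms_error(log_mlc, epid_mlc))  # ~0.1 mm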

  20. Poster — Thur Eve — 30: 4D VMAT dose calculation methodology to investigate the interplay effect: experimental validation using TrueBeam Developer Mode and Gafchromic film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teke, T; Milette, MP; Huang, V

    2014-08-15

    The interplay effect between the tumor motion and the radiation beam modulation during a VMAT treatment delivery alters the delivered dose distribution from the planned one. This work presents and validates a method to accurately calculate the dose distribution in 4D, taking into account the tumor motion, the field modulation, and the treatment starting phase. A QUASAR™ respiratory motion phantom was 4D scanned with a motion amplitude of 3 cm and a 3-second period. A static scan was also acquired with the lung insert, and the tumor contained in it, centered. A VMAT plan with a 6XFFF beam was created on the averaged CT and delivered on a Varian TrueBeam, and the trajectory log file was saved. From the trajectory log file, 10 VMAT plans (one for each breathing phase) and a Developer Mode XML file were created. For the 10 VMAT plans, the tumor motion was modeled by moving the isocentre on the static scan; the plans were re-calculated and summed in the treatment planning system. In Developer Mode, the tumor motion was simulated by moving the couch dynamically during the treatment. Gafchromic films were placed in the static QUASAR phantom and irradiated using Developer Mode. Different treatment starting phases were investigated (no phase shift, maximum inhalation, and maximum exhalation). Calculated and measured isodose lines and profiles are in very good agreement. For each starting phase, the dose distributions exhibit significant differences but are accurately calculated with the methodology presented in this work.

  1. VizieR Online Data Catalog: NGC 2264, NGC 2547 and NGC 2516 stellar radii (Jackson+, 2016)

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Randich, S.; Bragaglia, A.; Carraro, G.; Costado, M. T.; Flaccomio, E.; Lanzafame; Lardo, C.; Monaco, L.; Morbidelli, L.; Smiljanic, R.; Zaggia, S.

    2015-11-01

    File Table1.dat contains photometric and spectroscopic data for GES Survey targets in the clusters NGC 2264, NGC 2547, and NGC 2516, downloaded from the Edinburgh GES archive (http://ges.roe.ac.uk/). Photometric data comprise the (Cousins) I magnitude and 2MASS J, H, and K magnitudes. Spectroscopic data comprise the signal-to-noise ratio (S/N) of the target spectrum, the radial velocity RV (in km/s), the projected equatorial velocity vsini (in km/s), the number of separate observations co-added to produce the target spectrum, and the log of the effective temperature (logTeff) of the template spectrum fitted to measure RV and vsini. The absolute precision in RV, pRV (in km/s), and the relative precision in vsini (pvsini) were estimated, as functions of logTeff, vsini, and S/N, using the prescription described in Jackson et al. (2015A&A...580A..75J, Cat. J/A+A/580/A75). File Table3.dat contains measured and calculated properties of cluster targets with resolved vsini and a rotation period reported in the literature. The cluster name, right ascension RA (deg), and declination Dec (deg) are given for each such target. Dynamic properties comprise the radial velocity RV (in km/s), the absolute precision in RV, pRV (km/s), the projected equatorial velocity vsini (in km/s), the relative precision in vsini (pvsini), and the rotational period (in days). Also given are the absolute K magnitude MK, the log of the luminosity, log L (in solar units), and the probability of cluster membership estimated using cluster data given in the text. Estimated values of the projected radius Rsini (in Rsolar) and the uncertainty in the projected radius e_Rsini (in Rsolar) are given for targets where vsini>5 km/s and pvsini>0.2. The final column is a flag, set to 1 for targets in cluster NGC 2264 where a (H-K) versus (J-H) colour-colour plot indicates a possible infra-red excess (2 data files).
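
    For reference, the projected radii quoted above follow from the rotation period and the projected equatorial velocity under the assumption of rigid rotation; with P in days, vsini in km/s, and R in solar radii, the conversion constant is 86400/(2π R_⊙) ≈ 0.02:

      R\sin i = \frac{P \, v\sin i}{2\pi}
      \quad\Longrightarrow\quad
      \frac{R\sin i}{R_\odot} \approx 0.02\,\left(\frac{P}{\mathrm{day}}\right)\left(\frac{v\sin i}{\mathrm{km\,s^{-1}}}\right)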

  2. Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III

    2017-12-01

    Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on a single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may result in a less than optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. As users interact with search engines, a great deal of information accumulates in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process, and create training data from web logs in real time; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests in geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
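
    A toy sketch of the log-processor stage described in contribution 1): it groups raw clickstream events by session and query and emits implicit-feedback training rows (clicked results positive, skipped impressions negative). The event fields here are hypothetical, not those of the actual portal logs.

      from collections import defaultdict

      # Hypothetical clickstream events: one dict per logged interaction.
      events = [
          {"session": "s1", "query": "sea surface temperature", "dataset": "dsA",
           "action": "click", "rank": 2},
          {"session": "s1", "query": "sea surface temperature", "dataset": "dsB",
           "action": "impression", "rank": 1},
      ]

      def training_rows(events):
          """Group events per (session, query) and emit labeled rows."""
          groups = defaultdict(list)
          for e in events:
              groups[(e["session"], e["query"])].append(e)
          for (_, query), group in groups.items():
              for e in group:
                  label = 1 if e["action"] == "click" else 0
                  yield {"query": query, "dataset": e["dataset"],
                         "rank_shown": e["rank"], "label": label}

      for row in training_rows(events):
          print(row)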

  3. 43 CFR 2743.3 - Leased disposal sites.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... review of all records and inspection reports on file with the Bureau of Land Management, State, and local... landfill concerning site management and a review of all reports and logs pertaining to the type and amount...

  4. 25 CFR 214.13 - Diligence; annual expenditures; mining records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... within 90 days after an ore body of sufficient quantity is discovered, and shown by the logs or records.... Lessee shall, before commencing operations, file with the superintendent a plat and preliminary statement...

  5. 47 CFR 22.861 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... below the transmitting power (P) by a factor of at least 43 + 10 log (P) dB. (b) Measurement procedure... maintain a copy of the contract in their station files and disclose it to prospective assignees or...

  6. Use of treatment log files in spot scanning proton therapy as part of patient-specific quality assurance

    PubMed Central

    Li, Heng; Sahoo, Narayan; Poenisch, Falk; Suzuki, Kazumichi; Li, Yupeng; Li, Xiaoqiang; Zhang, Xiaodong; Lee, Andrew K.; Gillin, Michael T.; Zhu, X. Ronald

    2013-01-01

    Purpose: The purpose of this work was to assess the monitor unit (MU) values and position accuracy of spot scanning proton beams as recorded by the daily treatment logs of the treatment control system, and furthermore establish the feasibility of using the delivered spot positions and MU values to calculate and evaluate delivered doses to patients. Methods: To validate the accuracy of the recorded spot positions, the authors generated and executed a test treatment plan containing nine spot positions, to which the authors delivered ten MU each. The spot positions were measured with radiographic films and Matrixx 2D ion-chambers array placed at the isocenter plane and compared for displacements from the planned and recorded positions. Treatment logs for 14 patients were then used to determine the spot MU values and position accuracy of the scanning proton beam delivery system. Univariate analysis was used to detect any systematic error or large variation between patients, treatment dates, proton energies, gantry angles, and planned spot positions. The recorded patient spot positions and MU values were then used to replace the spot positions and MU values in the plan, and the treatment planning system was used to calculate the delivered doses to patients. The results were compared with the treatment plan. Results: Within a treatment session, spot positions were reproducible within ±0.2 mm. The spot positions measured by film agreed with the planned positions within ±1 mm and with the recorded positions within ±0.5 mm. The maximum day-to-day variation for any given spot position was within ±1 mm. For all 14 patients, with ∼1 500 000 spots recorded, the total MU accuracy was within 0.1% of the planned MU values, the mean (x, y) spot displacement from the planned value was (−0.03 mm, −0.01 mm), the maximum (x, y) displacement was (1.68 mm, 2.27 mm), and the (x, y) standard deviation was (0.26 mm, 0.42 mm). The maximum dose difference between calculated dose to the patient based on the plan and recorded data was within 2%. Conclusions: The authors have shown that the treatment log file in a spot scanning proton beam delivery system is precise enough to serve as a quality assurance tool to monitor variation in spot position and MU value, as well as the delivered dose uncertainty from the treatment delivery system. The analysis tool developed here could be useful for assessing spot position uncertainty and thus dose uncertainty for any patient receiving spot scanning proton beam therapy. PMID:23387726
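
    A minimal sketch of the kind of per-spot statistics described above, assuming planned and log-recorded spot positions and MU values have already been parsed into records; the field names and numbers are illustrative.

      import numpy as np

      # Hypothetical per-spot records: planned vs. log-recorded position (mm)
      # and MU, as parsed from the treatment control system's daily logs.
      spots = [
          {"plan_xy": (0.0, 0.0),  "log_xy": (-0.05, 0.02), "plan_mu": 0.012, "log_mu": 0.012},
          {"plan_xy": (5.0, 10.0), "log_xy": (4.97, 10.03), "plan_mu": 0.020, "log_mu": 0.020},
      ]

      dx = np.array([s["log_xy"][0] - s["plan_xy"][0] for s in spots])
      dy = np.array([s["log_xy"][1] - s["plan_xy"][1] for s in spots])
      mu_dev = sum(s["log_mu"] for s in spots) / sum(s["plan_mu"] for s in spots) - 1

      print(f"mean dx,dy: {dx.mean():.3f}, {dy.mean():.3f} mm")
      print(f"max |dx|,|dy|: {abs(dx).max():.3f}, {abs(dy).max():.3f} mm")
      print(f"total MU deviation: {mu_dev:+.4%}")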

  7. The Basis System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubois, P.F.

    1989-05-16

    This paper discusses the Basis system. Basis is a program development system for scientific programs. It has been developed over the last five years at Lawrence Livermore National Laboratory (LLNL), where it is now used in about twenty major programming efforts. The Basis System includes two major components, a program development system and a run-time package. The run-time package provides the Basis Language interpreter, through which the user does input, output, plotting, and control of the program's subroutines and functions. Variables in the scientific packages are known to this interpreter, so that the user may arbitrarily print, plot, and calculate with any major program variables. Also provided are facilities for dynamic memory management, terminal logs, error recovery, text-file I/O, and the attachment of non-Basis-developed packages.

  8. Remote Environmental Monitoring and Diagnostics in the Perishables Supply Chain - Phase 1

    DTIC Science & Technology

    2011-12-12

    The table below displays the raw data from the tests. Each cell contains a number between 0 and 5 corresponding to the number of successful ... along with the raw temperature data to the email addresses specified in the configuration file. As mentioned previously, for the CAEN ... the Intelleflex system. The user also has the option to save the data log, which contains the raw temperature data, to a file on the Windows

  9. D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, D.; /Fermilab

    1998-06-09

    This Dzero Engineering note describes the method by which the 2 Tesla Superconducting Solenoid fast dump and slow dump data are accumulated, tracked, and stored. The 2 Tesla solenoid has eleven data points that need to be tracked and then stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated values) file, which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.
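
    A Python analog of the rotating-buffer-plus-CSV-dump scheme described above; the real implementation is PLC logic on the TI555, so everything here apart from the eleven-channel idea is illustrative.

      import csv
      from collections import deque

      class QuenchLogger:
          """Continuously log N channels into a rotating buffer; on a dump
          event, write the buffered history plus the first-fault code to CSV."""

          def __init__(self, n_channels=11, depth=1000):
              self.buffer = deque(maxlen=depth)  # oldest samples roll off
              self.n = n_channels

          def sample(self, t, values):
              assert len(values) == self.n
              self.buffer.append((t, *values))

          def dump(self, path, first_fault):
              with open(path, "w", newline="") as f:
                  w = csv.writer(f)
                  w.writerow(["time"] + [f"V{i+1}" for i in range(self.n)] +
                             ["first_fault"])
                  for row in self.buffer:
                      w.writerow(list(row) + [first_fault])

      log = QuenchLogger()
      log.sample(0.0, [0.1] * 11)
      log.dump("fast_dump.csv", first_fault="dump switch open")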

  10. A Novel Network Attack Audit System based on Multi-Agent Technology

    NASA Astrophysics Data System (ADS)

    Jianping, Wang; Min, Chen; Xianwen, Wu

    A network attack audit system is proposed that comprises a network attack audit agent, a host audit agent, and a management control center audit agent. An improved multi-agent technique is employed in the network attack audit agent and has achieved satisfactory audit results. The auditing of network attacks is thorough, and as the functions of the network attack audit agent improve, different attacks can be analyzed and audited more effectively. In addition, the management control center agent manages and analyzes, in a timely manner, the audit results and audit data from the network attack and host audit agents. History files of network packets and host log data should also be audited to find deeper violations that cannot be found in real time.

  11. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three tools developed recently for contamination analysis: an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules, which uses residence time to determine whether molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  12. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activities 93LCA01 and 94LCA01 in Kingsley, Orange, and Lowry Lakes, Northeast Florida, 1993 and 1994

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2004-01-01

    In August and September of 1993 and January of 1994, the U.S. Geological Survey, under a cooperative agreement with the St. Johns River Water Management District (SJRWMD), conducted geophysical surveys of Kingsley Lake, Orange Lake, and Lowry Lake in northeast Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, observer's logbook, Field Activity Collection System (FACS) logs, and formal FGDC metadata. A filtered and gained GIF image of each seismic profile is also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. The data archived here were collected under a cooperative agreement with the St. Johns River Water Management District as part of the USGS Lakes and Coastal Aquifers (LCA) Project. For further information about this study, refer to http://coastal.er.usgs.gov/stjohns, Kindinger and others (1994), and Kindinger and others (2000). The USGS Florida Integrated Science Center (FISC) - Coastal and Watershed Studies in St. Petersburg, Florida, assigns a unique identifier to each cruise or field activity. For example, 93LCA01 tells us the data were collected in 1993 for the Lakes and Coastal Aquifers (LCA) Project and the data were collected during the first field activity for that project in that calendar year. For a detailed description of the method used to assign the field activity ID, see http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html. The boomer is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled at the sea surface and when discharged emits a short acoustic pulse, or shot, that propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (e.g., 0.5 s) and recorded for specific intervals of time (e.g., 100 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Acquisition geometry for 94LCA01 is recorded in the operations logbook. No logbook exists for 93LCA01. Table 1 displays acquisition parameters for both field activities. For more information about the acquisition equipment used, refer to the FACS equipment logs. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for more information about these files. Processed profiles can be viewed as GIF images from the Profiles page. Refer to the Software page for details about the processing and examples of the processing scripts. Detailed information about the navigation systems used for each field activity can be found in Table 1 and the FACS equipment logs.
To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. The original trace files were recorded in nonstandard ELICS format and later converted to standard SEG-Y format. The original trace files for 94LCA01 lines ORJ127_1, ORJ127_3, and ORJ131_1 were divided into two or more trace files (e.g., ORJ127_1 became ORJ127_1a and ORJ127_1b) because the original total number of traces exceeded the maximum allowed by the processing system. Digital data were not recoverable for 93LCA

  13. Archive of sediment data from vibracores collected in 2010 offshore of the Mississippi barrier islands

    USGS Publications Warehouse

    Kelso, Kyle W.; Flocks, James G.

    2015-01-01

    Selection of the core site locations was based on geophysical surveys conducted around the islands from 2008 to 2010. The surveys, using acoustic systems to image and interpret the near-surface stratigraphy, were conducted to investigate the geologic controls on island evolution. This data series serves as an archive of sediment data collected from August to September 2010, offshore of the Mississippi barrier islands. Data products, including descriptive core logs, core photographs, results of sediment grain-size analyses, sample location maps, and geographic information system (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata can be downloaded from the data products and downloads page.

  14. 47 CFR 22.917 - Emission limitations for cellular equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... frequency ranges must be attenuated below the transmitting power (P) by a factor of at least 43 + 10 log(P... such contract shall maintain a copy of the contract in their station files and disclose it to...

  15. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    DOEpatents

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
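
    A toy sketch of the pattern-extraction step described in the abstract: if the per-process write offsets form a fixed-stride pattern with a common length, the whole collective write compresses to one small descriptor. Field names are illustrative.

      def extract_pattern(offsets, lengths):
          """Return a compact (start, stride, count, length) descriptor when
          the per-process writes form a fixed-stride pattern; otherwise fall
          back to the explicit offset/length lists."""
          if len(set(lengths)) == 1 and len(offsets) > 1:
              stride = offsets[1] - offsets[0]
              if all(b - a == stride for a, b in zip(offsets, offsets[1:])):
                  return {"start": offsets[0], "stride": stride,
                          "count": len(offsets), "length": lengths[0]}
          return {"offsets": offsets, "lengths": lengths}

      # Four ranks each writing 1 MiB at 4 MiB intervals compress to one record.
      print(extract_pattern([0, 4 << 20, 8 << 20, 12 << 20], [1 << 20] * 4))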

  16. Summary of available hydrogeologic data for the northeast portion of the alluvial aquifer at Louisville, Kentucky

    USGS Publications Warehouse

    Unthank, Michael D.; Nelson, Hugh L.

    2006-01-01

    The hydrogeologic characteristics of the unconsolidated glacial outwash sand and gravel deposits that compose the northeast portion of the alluvial aquifer at Louisville, Kentucky, indicate a prolific water-bearing formation with approximately 7 billion gallons of ground-water storage and an estimated sustainable yield of over 280 million gallons per day. This abundance of ground water and the need to properly develop and manage this resource has prompted many past investigations (since 1956), which have produced reports, maps, and data files covering a variety of topics relative to the movement, availability, and use of ground water in this area. These data have been compiled into a single report to assist in future development and use of the ground-water resources. Available ground-water data for the alluvial aquifer at Louisville, Kentucky, from Beargrass Creek to Harrods Creek, were compiled from the U.S. Geological Survey National Water Information System and the Kentucky Groundwater Data Repository. Data contained in these databases include ground-water well-construction details and historical ground-water levels, drillers' logs, and water-quality information. Additional data and information were gathered from project files at the U.S. Geological Survey--Kentucky Water Science Center and files at the Louisville Water Company. Information contained in these files included data from area pumping tests describing aquifer characteristics and ground-water flow. Data describing current conditions of the ground-water system in the northeast portion of the alluvial aquifer also are included. Ground-water levels from a network of observation wells show recent trends in the flow system, and information from the Kentucky Division of Water-Groundwater Branch lists current permitted ground-water withdrawals in the area.

  17. Archive of digital chirp subbottom profile data collected during USGS cruises 13BIM02 and 13BIM07 offshore of the Chandeleur Islands, Louisiana, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Flocks, James G.; Bernier, Julie C.; Wiese, Dana S.

    2014-01-01

    On July 5–19 (cruise 13BIM02) and August 22–September 1 (cruise 13BIM07), 2013, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on barrier island evolution and medium-term and interannual sediment transport along the oil spill mitigation sand berm constructed at the north end and offshore of the Chandeleur Islands, Louisiana. This investigation is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided. Refer to the Abbreviations page for explanations of acronyms and abbreviations used in this report.

  18. Archive of digital Chirp subbottom profile data collected during USGS cruises 09CCT03 and 09CCT04, Mississippi and Alabama Gulf Islands, June and July 2009

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2011-01-01

    In June and July of 2009, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Cat Island, Mississippi, to Dauphin Island, Alabama, as part of a broader USGS study on Coastal Change and Transport (CCT). The surveys were funded through the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php). This report serves as an archive of unprocessed digital Chirp seismic profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Single-beam and swath bathymetry data were also collected during these cruises and will be published as a separate archive. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  19. Small Aircraft Data Distribution System

    NASA Technical Reports Server (NTRS)

    Chazanoff, Seth L.; Dinardo, Steven J.

    2012-01-01

    The CARVE Small Aircraft Data Distribution System acquires the aircraft location and attitude data that are required by the various programs running on a distributed network. This system distributes the data it acquires to the data acquisition programs for inclusion in their data files. It uses UDP (User Datagram Protocol) to broadcast data over a LAN (Local Area Network) to any programs that might have a use for the data. The program is easily adaptable to acquire additional data and log that data to disk. The current version also drives displays using precision pitch and roll information to aid the pilot in maintaining a level attitude (in both pitch and roll) for radar/radiometer mapping, beyond the accuracy achievable by flying visually or by using a standard gyro-driven attitude indicator. The software is designed to acquire an array of data to help the mission manager make real-time decisions about the effectiveness of the flight. These data are displayed for the mission manager and broadcast to the other experiments on the aircraft for inclusion in their data files. The program also drives real-time precision pitch and roll displays for the pilot and copilot to aid them in maintaining the desired attitude, when required, during data acquisition on mapping lines.
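    As a rough illustration of the broadcast-and-log design described above, the following Python sketch sends attitude records over UDP broadcast and appends the same records to a disk log. The port, payload layout, and field names are assumptions, not taken from the CARVE software:

```python
# Minimal sketch of UDP broadcast plus disk logging; the broadcast port
# and the JSON record fields below are hypothetical.
import json
import socket
import time

BROADCAST_ADDR = ("255.255.255.255", 5005)   # assumed LAN broadcast port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

with open("attitude.log", "a") as logfile:
    for _ in range(3):                        # a few sample packets
        record = {"t": time.time(), "lat": 64.84, "lon": -147.72,
                  "pitch": 0.4, "roll": -0.2}
        packet = json.dumps(record).encode()
        sock.sendto(packet, BROADCAST_ADDR)   # broadcast to all listeners
        logfile.write(json.dumps(record) + "\n")  # log the same data to disk
        time.sleep(0.1)
```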

  20. Paleomagnetic dating: Methods, MATLAB software, example

    NASA Astrophysics Data System (ADS)

    Hnatyshin, Danny; Kravchinsky, Vadim A.

    2014-09-01

    A MATLAB software tool has been developed to provide an easy-to-use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user-defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways: either by translating the data onto the reference curve, or by rotating it about a set location (e.g., the sampling location). The results are then compiled in data tables which can be exported as an Excel file. The data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding, and metamorphism, and lies in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed the timing of large-scale thermal or chemical events to be determined. Paleomagnetic analysis of gold-mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The paleomagnetic direction obtained from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major and semi-minor axes of the 95% confidence ellipse being 16.6° and 15.9°, respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 (+32.0/−31.0) Ma, the youngest well-defined age known for Sukhoi Log. We propose that this represents the last major stage of activity at Sukhoi Log, which likely had a role in determining the present-day state of mineralization seen at the deposit.
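    The "translation" dating step can be illustrated with a short Python sketch: find the apparent polar wander path (APWP) entry closest to the measured paleopole by great-circle angle and read off its age. The APWP table below is made up for illustration; only the paleopole comes from the abstract:

```python
# Sketch of nearest-point dating on an APWP; the reference-path entries
# here are hypothetical, not the Siberian APWP.
import math

def angular_distance(lat1, lon1, lat2, lon2):
    """Great-circle angle in degrees between two poles."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    cos_d = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_d))))

# (age in Ma, pole latitude, pole longitude) -- invented APWP entries
apwp = [(230, 55.0, 150.0), (250, 60.0, 157.0), (270, 65.0, 165.0)]
pole = (61.3, 155.9)   # measured paleopole from the abstract

age, _, _ = min(apwp, key=lambda row: angular_distance(pole[0], pole[1], row[1], row[2]))
print(f"nearest APWP age: {age} Ma")   # 250 for this toy table
```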

  1. Archive of digital Chirp sub-bottom profile data collected during USGS Cruise 07SCC01 offshore of the Chandeleur Islands, Louisiana, June 2007

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.

    2010-01-01

    In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 tells us the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study, during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high-resolution, shallow-penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov.
The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. (CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
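    Because the archive stores its trace data with an ASCII card image header (SEG-Y rev 1), the 3,200-byte textual header can be inspected with a few lines of Python. This is a generic sketch, not USGS software; the file name is hypothetical:

```python
# Minimal sketch for inspecting the 3,200-byte card image header of a
# SEG-Y file. Per the archive description, these files use ASCII (rev 1)
# rather than EBCDIC (rev 0), so the header can be read directly as text.
def print_segy_text_header(path):
    with open(path, "rb") as f:
        header = f.read(3200)            # 40 "card images" of 80 bytes each
    for i in range(40):
        card = header[i * 80:(i + 1) * 80]
        print(card.decode("ascii", errors="replace").rstrip())

# print_segy_text_header("example.sgy")   # hypothetical file name
```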

  2. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    DOE PAGES

    Chan, Anthony; Gropp, William; Lusk, Ewing

    2008-01-01

    A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the trace files at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
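    One ingredient of such a format can be sketched in Python: a sorted index of (timestamp, file offset) pairs lets a viewer seek directly to the events of a requested window instead of scanning the whole trace. This toy sketch omits the hierarchical annotations and the boundary-spanning states that an SLOG-2-style format handles:

```python
# Toy time-window index: binary-search a sorted (timestamp, offset) array
# to find where in the trace file to start reading a requested window.
import bisect

index = [(0.0, 0), (1.5, 4096), (3.2, 9120), (7.8, 20480)]  # toy index
times = [t for t, _ in index]

def seek_offset(window_start):
    """File offset of the last indexed event at or before window_start."""
    i = bisect.bisect_right(times, window_start) - 1
    return index[max(i, 0)][1]

print(seek_offset(3.5))   # 9120: start reading the trace file here
```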

  3. Archive of digital Boomer seismic reflection data collected during USGS Cruises 94CCT01 and 95CCT01, eastern Texas and western Louisiana, 1994 and 1995

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.

    2004-01-01

    In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  4. Archive of digital Boomer and Chirp seismic reflection data collected during USGS Cruises 01RCE05 and 02RCE01 in the Lower Atchafalaya River, Mississippi River Delta, and offshore southeastern Louisiana, October 23-30, 2001, and August 18-19, 2002

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.

    2004-01-01

    In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  5. Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies

    PubMed Central

    Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton

    2016-01-01

    Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store instant messaging applications, namely Facebook and Skype, on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files. PMID:26982207

  6. Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies.

    PubMed

    Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton

    2016-01-01

    Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store instant messaging applications, namely Facebook and Skype, on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files.

  7. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  8. Gamma-index method sensitivity for gauging plan delivery accuracy of volumetric modulated arc therapy.

    PubMed

    Park, Jong In; Park, Jong Min; Kim, Jung-In; Park, So-Yeon; Ye, Sung-Joon

    2015-12-01

    The aim of this study was to investigate the sensitivity of the gamma-index method under various gamma criteria for volumetric modulated arc therapy (VMAT). Twenty head and neck (HN) and twenty prostate VMAT plans were retrospectively selected for this study. Both global and local 2D gamma evaluations were performed with criteria of 3%/3 mm, 2%/2 mm, 1%/2 mm and 2%/1 mm. In this study, the global and local gamma-index calculated the differences in doses relative to the maximum dose and the dose at the current measurement point, respectively. Using log files acquired during delivery, the differences in parameters at every control point between the VMAT plans and the log files were acquired. The differences in dose-volumetric parameters between VMAT plans reconstructed from the log files and the original VMAT plans were calculated. Spearman's rank correlation coefficients (rs) were calculated between the passing rates and those differences. Considerable correlations with statistical significance were observed between global 1%/2 mm, local 1%/2 mm and local 2%/1 mm and the MLC position differences (rs = -0.712, -0.628 and -0.581). The numbers of rs values with statistical significance between the passing rates and the changes in dose-volumetric parameters were largest for the global 2%/2 mm (n = 16), global 2%/1 mm (n = 15) and local 2%/1 mm (n = 13) criteria. The local gamma-index method with 2%/1 mm generally showed higher sensitivity for detecting deviations between a VMAT plan and its delivery. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
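    For readers unfamiliar with the metric, the following Python sketch computes a brute-force global 2D gamma pass rate for a pair of dose grids. It assumes a 1 mm grid spacing and uses synthetic doses; clinical implementations interpolate more finely and apply dose thresholds:

```python
# Brute-force global gamma index on a 2D dose grid (toy data, 1 mm spacing).
import numpy as np

def gamma_pass_rate(ref, ev, dose_crit=0.03, dist_crit=3.0, spacing=1.0):
    """Fraction of points with gamma <= 1 (global normalization)."""
    dd_norm = dose_crit * ref.max()            # global: % of maximum dose
    ys, xs = np.indices(ref.shape)
    gammas = np.empty(ref.shape)
    for y in range(ref.shape[0]):
        for x in range(ref.shape[1]):
            dist2 = ((ys - y) ** 2 + (xs - x) ** 2) * spacing ** 2
            dose2 = (ev - ref[y, x]) ** 2
            gamma2 = dist2 / dist_crit ** 2 + dose2 / dd_norm ** 2
            gammas[y, x] = np.sqrt(gamma2.min())  # best match over all points
    return (gammas <= 1.0).mean()

ref = np.random.rand(20, 20)                      # toy "planned" dose
ev = ref + np.random.normal(0, 0.01, ref.shape)   # toy "delivered" dose
print(f"2%/2 mm global pass rate: {gamma_pass_rate(ref, ev, 0.02, 2.0):.3f}")
```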

  9. Neuropsychological constraints to human data production on a global scale

    NASA Astrophysics Data System (ADS)

    Gros, C.; Kaczor, G.; Marković, D.

    2012-01-01

    Which factors underlie human information production on a global level? In order to gain insight into this question we study a corpus of 252-633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284-675 terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining quality.
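    The two candidate distributions can be fit with a few lines of Python: a log-normal via the moments of the log-sizes, and a power-law tail exponent via the Hill (maximum-likelihood) estimator. The data below are synthetic, for illustration only:

```python
# Fit a log-normal and a power-law tail to a toy file-size sample.
import numpy as np

rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=12.0, sigma=2.0, size=100_000)  # synthetic sizes

# Log-normal fit: moments of log(size)
mu, sigma = np.log(sizes).mean(), np.log(sizes).std()

# Power-law tail fit above a threshold x_min (Hill estimator)
x_min = np.quantile(sizes, 0.95)
tail = sizes[sizes >= x_min]
alpha = 1.0 + len(tail) / np.sum(np.log(tail / x_min))

print(f"log-normal: mu={mu:.2f}, sigma={sigma:.2f}; tail exponent alpha={alpha:.2f}")
```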

  10. Chirp subbottom profile data collected in 2015 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Forde, Arnell S.; DeWitt, Nancy T.; Fredericks, Jake J.; Miselis, Jennifer L.

    2018-01-30

    As part of the Barrier Island Evolution Research project, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore geophysical survey around the northern Chandeleur Islands, Louisiana, in September 2015. The objective of the project is to improve the understanding of barrier island geomorphic evolution, particularly storm-related depositional and erosional processes that shape the islands over annual to interannual time scales (1–5 years). Collecting geophysical data can help researchers identify relations between the geologic history of the islands and their present day morphology and sediment distribution. High-resolution geophysical data collected along this rapidly changing barrier island system can provide a unique time-series dataset to further the analyses and geomorphological interpretations of this and other coastal systems, improving our understanding of coastal response and evolution over medium-term time scales (months to years). Subbottom profile data were collected in September 2015 offshore of the northern Chandeleur Islands, during USGS Field Activity Number 2015-331-FA. Data products, including raw digital chirp subbottom data, processed subbottom profile images, survey trackline map, navigation files, geographic information system data files and formal Federal Geographic Data Committee metadata, and Field Activity Collection System and operation logs are available for download.

  11. A teledentistry system for the second opinion.

    PubMed

    Gambino, Orazio; Lima, Fausto; Pirrone, Roberto; Ardizzone, Edoardo; Campisi, Giuseppina; di Fede, Olga

    2014-01-01

    In this paper we present a teledentistry system aimed at the second-opinion task. It makes use of a particular camera, called an intra-oral (or dental) camera, to capture still photographs and real-time video of the inner part of the mouth. The pictures acquired by the Operator with such a device are sent to the Oral Medicine Expert (OME) by means of a standard File Transfer Protocol (FTP) service, and the real-time video is channeled into a video stream using the VideoLAN client/server (VLC) application. The system is composed of HTML5 web pages generated by PHP and allows the second opinion to be performed both when the Operator and the OME are logged in and when one of them is offline.

  12. 20 CFR 655.201 - Temporary labor certification applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...

  13. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the process of probing huge amounts of information in an attempt to uncover hidden patterns. Through big data analytics applications, public- and private-sector organizations have made a strategic decision to turn big data into competitive advantage. Extracting value from big data requires a process that pulls information from multiple different sources; this process is known as extract, transform, and load (ETL). The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and for summarizing documents from several sources. The work helps the reader better understand basic Hadoop concepts and improves the user experience for research. In this paper, we propose an approach for analyzing log files with Hadoop to find concise information that is useful and saves time. The proposed approach is applied to research papers in a specific domain to obtain summarized content for further improvement and the creation of new content.
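    The log-analysis step maps naturally onto Hadoop Streaming, where the mapper and reducer are ordinary scripts reading stdin. The sketch below is not the paper's code; the log layout and field position are assumptions about a common web-server log format. It counts HTTP status codes:

```python
# Hypothetical Hadoop Streaming job: the mapper emits one (status_code, 1)
# pair per log line; the reducer sums counts for each sorted key.
import sys

def mapper():
    for line in sys.stdin:
        fields = line.split()
        if len(fields) > 8:
            status = fields[8]          # HTTP status code position (assumed)
            print(f"{status}\t1")

def reducer():
    current, total = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

    With Hadoop Streaming the same script can serve as both stages, e.g. -mapper "python logcount.py map" -reducer "python logcount.py".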

  14. DIRECT secure messaging as a common transport layer for reporting structured and unstructured lab results to outpatient providers.

    PubMed

    Sujansky, Walter; Wilson, Tom

    2015-04-01

    This report describes a grant-funded project to explore the use of DIRECT secure messaging for the electronic delivery of laboratory test results to outpatient physicians and electronic health record systems. The project seeks to leverage the inherent attributes of DIRECT secure messaging and electronic provider directories to overcome certain barriers to the delivery of lab test results in the outpatient setting. The described system enables laboratories that generate test results as HL7 messages to deliver these results as structured or unstructured documents attached to DIRECT secure messages. The system automatically analyzes generated HL7 messages and consults an electronic provider directory to determine the appropriate DIRECT address and delivery format for each indicated recipient. The system also enables lab results delivered to providers as structured attachments to be consumed by HL7 interface engines and incorporated into electronic health record systems. Lab results delivered as unstructured attachments may be printed or incorporated into patient records as PDF files. The system receives and logs acknowledgement messages to document the status of each transmitted lab result, and a graphical interface allows searching and review of this logged information. The described system is a fully implemented prototype that has been tested in a laboratory setting. Although this approach is promising, further work is required to pilot test the system in production settings with clinical laboratories and outpatient provider organizations. Copyright © 2015 Elsevier Inc. All rights reserved.
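    The routing decision described above can be pictured as a directory lookup. The sketch below is purely illustrative (the directory contents, field names, and NPI keys are invented) and stands in for the system's actual provider-directory query:

```python
# Illustrative sketch, not the project's code: consult a provider directory
# to pick the DIRECT address and delivery format for a lab result.
PROVIDER_DIRECTORY = {
    "1234567890": {"direct": "dr.smith@direct.example.org", "format": "structured"},
    "0987654321": {"direct": "clinic@direct.example.net", "format": "pdf"},
}

def route_result(provider_npi, hl7_message):
    """Return (address, attachment_kind) for the indicated recipient."""
    entry = PROVIDER_DIRECTORY.get(provider_npi)
    if entry is None:
        raise LookupError(f"no DIRECT address on file for NPI {provider_npi}")
    kind = "hl7_attachment" if entry["format"] == "structured" else "pdf_attachment"
    return entry["direct"], kind

print(route_result("1234567890", "<HL7 ORU^R01 message>"))
```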

  15. P1198: software for tracing decision behavior in lending to small businesses.

    PubMed

    Andersson, P

    2001-05-01

    This paper describes a process-tracing software program specially designed to capture decision behavior in lending to small businesses. The source code was written in Lotus Notes. The software runs in a Web browser and consists of two interacting systems: a database and a user interface. The database includes three realistic loan applications. The user interface consists of different but interacting screens that enable the participant to operate the software. Log files register the decision behavior of the participant. An empirical example is presented in order to show the software's potential in providing insights into judgment and decision making. The implications of the software are discussed.

  16. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  17. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  18. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  19. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  20. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  1. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    Natural fracture data from wells 33-7, 33A-7, 52A-7, 52B-7, and 83-11 at West Flank. Fracture orientations were determined from image logs of these wells (see accompanying submissions). Data files contain depth, apparent (in wellbore reference frame) and true (in geographic reference frame) azimuth and dip, respectively.

  3. Assessment of feasibility of running RSNA's MIRC on a Raspberry Pi: a cost-effective solution for teaching files in radiology.

    PubMed

    Pereira, Andre; Atri, Mostafa; Rogalla, Patrik; Huynh, Thien; O'Malley, Martin E

    2015-11-01

    The value of a teaching case repository in radiology training programs is immense. The allocation of resources for putting one together is a complex issue, given the factors that have to be coordinated: hardware, software, infrastructure, administration, and ethics. Costs may be significant and cost-effective solutions are desirable. We chose Medical Imaging Resource Center (MIRC) to build our teaching file. It is offered by RSNA for free. For the hardware, we chose the Raspberry Pi, developed by the Raspberry Pi Foundation: a small control board developed as a low-cost computer for schools and also used in alternative projects such as robotics and environmental data collection. Its performance and reliability as a file server were unknown to us. For the operating system, we chose Raspbian, a variant of Debian Linux, along with Apache (web server), MySQL (database server) and PHP, which enhance the functionality of the server. A USB hub and an external hard drive completed the setup. Installation of software was smooth. The Raspberry Pi handled the task of hosting the teaching file repository for our division very well. Uptime was logged at 100%, and loading times were similar to other MIRC sites available online. We set up two servers (one for backup), each costing just below $200.00 including external storage and USB hub. It is feasible to run RSNA's MIRC off a low-cost control board (Raspberry Pi). Performance and reliability are comparable to full-size servers for the intended purpose of hosting a teaching file within an intranet environment.

  4. VizieR Online Data Catalog: New atmospheric parameters of MILES cool stars (Sharma+, 2016)

    NASA Astrophysics Data System (ADS)

    Sharma, K.; Prugniel, P.; Singh, H. P.

    2015-11-01

    MILES V2 spectral interpolator. The FITS file is an improved version of the MILES interpolator previously presented in PVK. It contains the coefficients of the interpolator, which allow one to compute an interpolated spectrum given an effective temperature, surface gravity, and metallicity (Teff, logg, and [Fe/H]). The file consists of three extensions containing the three temperature regimes described in the paper: extension 0, warm (Teff 4000-9000 K); extension 1, hot (Teff >7000 K); and extension 2, cold (Teff <4550 K). The three functions are linearly interpolated in the overlapping Teff regions. Each extension contains a 2D image-type array whose first axis is the wavelength, described by a WCS (air wavelength, starting at 3536 Å, step = 0.9 Å). This FITS file can be used by ULySS v1.3 or higher. (5 data files).
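    Reading such a file follows directly from the description above. Here is a hedged Python/astropy sketch (hypothetical file name; the linear blending between overlapping regimes is omitted for brevity) that selects an extension by Teff and rebuilds the wavelength axis from standard linear WCS keywords:

```python
# Sketch of loading one temperature regime from the interpolator file;
# assumes standard CRVAL1/CDELT1 keywords for the linear wavelength WCS.
import numpy as np
from astropy.io import fits

def load_regime(path, teff):
    # 0 = warm, 1 = hot, 2 = cold; overlap blending is ignored here
    ext = 0 if 4550 <= teff <= 7000 else (1 if teff > 7000 else 2)
    with fits.open(path) as hdul:
        coeffs = hdul[ext].data
        hdr = hdul[ext].header
        # Linear WCS along axis 1: wavelength = CRVAL1 + CDELT1 * pixel
        wave = hdr["CRVAL1"] + hdr["CDELT1"] * np.arange(coeffs.shape[-1])
    return wave, coeffs

# wave, coeffs = load_regime("miles_v2_interpolator.fits", teff=5800)
```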

  5. General Chemistry Division. Quarterly report, July--September 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrar, J.E.

    1978-11-17

    Status of the following studies is given: nonaqueous titrimetry; molar absorbance of 1,3,5-triamino-2,4,6-trinitrobenzene in dimethylsulfoxide; potentiometric microdetermination of pentaerythritol tetranitrate (PETN) in PETN-containing composites; potentiometric semimicrodetermination of some tetrazoles with silver nitrate; applications of a mode-locked krypton ion laser; time-resolved spectroscopy; photoelectrochemistry; evaluation of a prototype atomic emission source system; laser spectroscopy of neptunium; high-performance liquid chromatography of polyphenyl ether; acquisition of a portable, computerized mass spectrometer; improved inlet for quantitative mass spectrometry; a computer data system for the UTI gas analyzers; analysis of perfluorobutene-2; examination of iridium coatings; source of high-intensity, polarized x rays for fluorescence analysis; mass spectrometer for the coal gasification field test; materials protection measurement guides; the LOG system of sample file control; and methylation of platinum compounds by methylcobalamin. (LK)

  6. Archive of digital chirp subbottom profile data collected during USGS cruise 12BIM03 offshore of the Chandeleur Islands, Louisiana, July 2012

    USGS Publications Warehouse

    Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.

    2014-01-01

    From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. 
The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.

  7. VizieR Online Data Catalog: Distances to RRab stars from WISE and Gaia (Sesar+, 2017)

    NASA Astrophysics Data System (ADS)

    Sesar, B.; Fouesneau, M.; Price-Whelan, A. M.; Bailer-Jones, C. A. L.; Gould, A.; Rix, H.-W.

    2017-10-01

    To constrain the period-luminosity-metallicity (PLZ) relations for RR Lyrae stars in the WISE W1 and W2 bands, we use TGAS trigonometric parallaxes (ϖ), spectroscopic metallicities ([Fe/H]; Fernley+ 1998, J/A+A/330/515), log-periods (logP, base 10), and apparent magnitudes (m; Klein+ 2014, J/MNRAS/440/L96) for 102 RRab stars within ~2.5 kpc of the Sun. The E(B-V) reddening at a star's position is obtained from the Schlegel+ (1998ApJ...500..525S) dust map. (1 data file).
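    The parallax-to-absolute-magnitude step behind a PLZ fit can be sketched in Python. With the parallax ϖ in milliarcseconds, M = m + 5 log10(ϖ) − 10; the sketch below fits M = a·logP + b·[Fe/H] + c by least squares on synthetic values (the paper itself uses a proper probabilistic treatment of the parallax errors, which this omits):

```python
# Toy PLZ fit on synthetic stars; the "true" coefficients are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 102
logP = rng.uniform(-0.35, -0.15, n)            # log10 period in days
feh = rng.uniform(-2.5, 0.0, n)                # [Fe/H]
plx = rng.uniform(0.4, 2.0, n)                 # parallax in mas
M_true = -2.4 * logP + 0.15 * feh - 0.8        # hypothetical PLZ relation
m = M_true + 10 - 5 * np.log10(plx) + rng.normal(0, 0.05, n)  # apparent mag

M = m + 5 * np.log10(plx) - 10                 # absolute magnitude from parallax
A = np.column_stack([logP, feh, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, M, rcond=None)   # least-squares (a, b, c)
print("fitted (a, b, c):", np.round(coef, 3))
```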

  8. User-composable Electronic Health Record Improves Efficiency of Clinician Data Viewing for Patient Case Appraisal: A Mixed-Methods Study.

    PubMed

    Senathirajah, Yalini; Kaufman, David; Bakken, Suzanne

    2016-01-01

    Challenges in the design of electronic health records (EHRs) include designing usable systems that must meet the complex, rapidly changing, and high-stakes information needs of clinicians. The ability to move and assemble elements together on the same page has significant human-computer interaction (HCI) and efficiency advantages, and can mitigate the problems of negotiating multiple fixed screens and the associated cognitive burdens. We compare MedWISE, a novel EHR that supports user-composable displays, with a conventional EHR in terms of the number of repeat views of data elements for patient case appraisal. The study used mixed methods to examine clinical data viewing in four patient cases, comparing an experimental user-composable EHR with a conventional EHR for case appraisal. Eleven clinicians used the user-composable EHR in a case appraisal task in the laboratory setting. This was compared with log file analysis of the same patient cases in the conventional EHR. We investigated the number of repeat views of the same clinical information during a session and across these two contexts, and compared them using Fisher's exact test. There was a significant difference (p<.0001) in the proportion of cases with repeat data element viewing between the user-composable EHR (14.6 percent) and the conventional EHR (72.6 percent). Users of conventional EHRs repeatedly viewed the same information elements in the same session, as revealed by log files. Our findings are consistent with the hypothesis that conventional systems require the user to view many screens and remember information between screens, causing the user to forget information and to have to access the information a second time. Other mechanisms (such as reduction in navigation over a population of users due to interface sharing, and information selection) may also contribute to increased efficiency in the experimental system. Systems that allow a composable approach, enabling the user to gather any desired information elements together on the same screen, may confer cognitive support benefits that increase productive use of systems by reducing fragmented information. By reducing cognitive overload, such an approach can also enhance the user experience.
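    The reported comparison can be reproduced in form with scipy's Fisher exact test. The 2x2 counts below are illustrative values chosen only to match the reported proportions, not the study's actual data:

```python
# Fisher's exact test on an illustrative 2x2 table of case counts.
from scipy.stats import fisher_exact

#                  repeat views   no repeat views
table = [[ 6, 35],    # user-composable EHR  (~14.6% with repeats)
         [45, 17]]    # conventional EHR     (~72.6% with repeats)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.3f}, p = {p_value:.2e}")
```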

  9. Robo-line storage: Low latency, high capacity storage systems over geographically distributed networks

    NASA Technical Reports Server (NTRS)

    Katz, Randy H.; Anderson, Thomas E.; Ousterhout, John K.; Patterson, David A.

    1991-01-01

    Rapid advances in high performance computing are making possible more complete and accurate computer-based modeling of complex physical phenomena, such as weather front interactions, dynamics of chemical reactions, numerical aerodynamic analysis of airframes, and ocean-land-atmosphere interactions. Many of these 'grand challenge' applications are as demanding of the underlying storage system, in terms of their capacity and bandwidth requirements, as they are of the computational power of the processor. A global view of the Earth's ocean chlorophyll and land vegetation requires over 2 terabytes of raw satellite image data. In this paper, we describe our planned research program in high capacity, high bandwidth storage systems. The project has four overall goals. First, we will examine new methods for high capacity storage systems, made possible by low cost, small form factor magnetic and optical tape systems. Second, access to the storage system will be low latency and high bandwidth. To achieve this, we must interleave data transfer at all levels of the storage system, including devices, controllers, servers, and communications links. Latency will be reduced by extensive caching throughout the storage hierarchy. Third, we will provide effective management of a storage hierarchy, extending the techniques already developed for the Log Structured File System. Finally, we will construct a prototype high capacity file server, suitable for use on the National Research and Education Network (NREN). Such research must be a cornerstone of any coherent program in high performance computing and communications.

  10. Log on to the Future: One School's Success Story.

    ERIC Educational Resources Information Center

    Hovenic, Ginger

    This paper describes Clear View Elementary School's (California) successful experience with integrating technology into the curriculum. Since its inception seven years ago, the school has acquired 250 computers, networked them all on two central file servers, and computerized the library and trained all staff members to be proficient facilitators…

  11. 40 CFR 146.14 - Information to be considered by the Director.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., logging procedures, deviation checks, and a drilling, testing, and coring program; and (16) A certificate... information listed below which are current and accurate in the file. For a newly drilled Class I well, the..., construction, date drilled, location, depth, record of plugging and/or completion, and any additional...

  12. All Aboard the Internet.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)

  13. A Query Analysis of Consumer Health Information Retrieval

    PubMed Central

    Hong, Yi; de la Cruz, Norberto; Barnas, Gary; Early, Eileen; Gillis, Rick

    2002-01-01

    The log files of the MCW HealthLink web site were analyzed to study users' needs for consumer health information and to gain a better understanding of the health topics users search for, the paths users typically take to find consumer health information, and ways to improve search effectiveness.

  14. The Internet and Technical Services: A Point Break Approach.

    ERIC Educational Resources Information Center

    McCombs, Gillian M.

    1994-01-01

    Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…

  15. 77 FR 35956 - Appalachian Power Company; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ...) screened intake structures; (3) a concrete powerhouse containing three turbine-generator units with a total... structures; (3) a concrete powerhouse containing three turbine-generator units with a total installed... by a log boom; (2) screened intake structures; (3) a concrete powerhouse containing three turbine...

  16. Library Web Proxy Use Survey Results.

    ERIC Educational Resources Information Center

    Murray, Peter E.

    2001-01-01

    Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)

  17. ILRS Station Reporting

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Pearlman, Michael Reisman; Torrence, Mark H.

    2013-01-01

    Network stations provide system configuration documentation upon joining the ILRS. This information, found in the various site and system log files available on the ILRS website, is essential to the ILRS analysis centers, combination centers, and general user community. Therefore, it is imperative that station personnel inform the ILRS community in a timely fashion when changes to a system occur. This poster provides information about the various documentation that must be maintained. The ILRS network consists of over fifty global sites actively ranging to over sixty satellites as well as five lunar reflectors. Information about these stations is available on the ILRS website (http://ilrs.gsfc.nasa.gov/network/stations/index.html). The ILRS Analysis Centers must have current information about the stations and their system configuration in order to use their data in the generation of derived products. However, not all information available on the ILRS website is as up-to-date as necessary for correct analysis of the data.

  18. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  19. Streamlining CASTOR to manage the LHC data torrent

    NASA Astrophysics Data System (ADS)

    Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.

    2014-06-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.

  20. PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.

    PubMed

    Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza

    2014-12-01

    The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions, such as nucleic acids, water, ions, and drug molecules, which can therefore also be described in the PDB format and have been deposited in the PDB database. A PDB file is machine-generated and not in a human-readable format; a computational tool is needed to read and understand it. The objective of our present study is to develop free online software for the retrieval, visualization, and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., the information in the PDB file is converted into readable sentences. It displays all possible information from a PDB file, including the 3D structure of that file. Programming languages and scripting languages like Perl, CSS, JavaScript, Ajax, and HTML have been used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no log-in required.
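    The parsing step PDB Explorer performs can be illustrated with a minimal Python sketch that extracts atom names, residues, and coordinates from the fixed-column ATOM/HETATM records of a PDB file:

```python
# Minimal PDB parsing sketch using the format's fixed column positions.
def parse_pdb_atoms(lines):
    atoms = []
    for line in lines:
        if line.startswith(("ATOM", "HETATM")):
            atoms.append({
                "name": line[12:16].strip(),     # atom name, columns 13-16
                "resname": line[17:20].strip(),  # residue name, columns 18-20
                "x": float(line[30:38]),         # orthogonal coordinates (Å)
                "y": float(line[38:46]),
                "z": float(line[46:54]),
            })
    return atoms

sample = ["ATOM      1  N   MET A   1      38.428  13.104  -1.939  1.00 54.69           N"]
print(parse_pdb_atoms(sample))
```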

  1. Production data in media systems and press front ends: capture, formats and database methods

    NASA Astrophysics Data System (ADS)

    Karttunen, Simo

    1997-02-01

    The nature, purpose and data presentation features of media jobs are analyzed in relation to the content, document, process and resource management in media production. Formats are the natural way of presenting, collecting and storing information, contents, document components and final documents. The state of the art and the trends in media formats and production data are reviewed. The types and the amount of production data are listed, e.g. events, schedules, product descriptions, reports, visual support, quality, process states and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated and their potential roles as future solutions are anticipated. The press front end is the part of print media production where large files dominate. The new output alternatives, i.e. film recorders, direct plate output (CTP and CTP-on-press) and digital, plateless printing lines, need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting and storing of job files and respective production data, such as the event logs of the processes. Intranets, browsers, Java applets and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design and use documents and printed products. The user aspects of installing intranets are stressed, since there are numerous more traditional and more dedicated networking solutions on the market.

  2. Building Specialized Multilingual Lexical Graphs Using Community Resources

    NASA Astrophysics Data System (ADS)

    Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud

    We describe methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as a main source; therefore, our approach depends on acquiring contributions from volunteers (the explicit approach) and on analyzing users' behaviors to extract interesting patterns and facts (the implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs), and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a specific domain. We call it preterminological because it is a raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it needs huge efforts by domain specialists and terminologists. In our approach, we build such a graph by analyzing the access log files of the community's website, finding the important terms that have been used in searches on that website, and recording their associations with each other. We aim to make this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach on the Digital Silk Road Project. We have used its access log files since its beginning in 2003, and obtained an initial graph of around 116,000 terms. As an application, we used this graph to obtain a preterminological multilingual database that is serving a CLIR system for the DSR project.
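    The implicit approach can be sketched in Python: mine search terms from web access logs and link terms that co-occur in the same visitor session. The log layout and query-string pattern below are assumptions about a typical search URL, not the DSR site's actual format:

```python
# Toy term-graph builder from access logs; regex and log layout are assumed.
import re
from collections import defaultdict
from itertools import combinations

QUERY_RE = re.compile(r"GET /search\?q=([^ &\"]+)")

def build_term_graph(log_lines):
    sessions = defaultdict(list)           # IP address -> queried terms
    for line in log_lines:
        m = QUERY_RE.search(line)
        if m:
            ip = line.split()[0]
            sessions[ip].append(m.group(1).lower())
    edges = defaultdict(int)
    for terms in sessions.values():
        for a, b in combinations(sorted(set(terms)), 2):
            edges[(a, b)] += 1             # edge weight = co-occurrence count
    return edges

logs = ['1.2.3.4 - - [01/Jan/2003] "GET /search?q=silk HTTP/1.0" 200 512',
        '1.2.3.4 - - [01/Jan/2003] "GET /search?q=caravan HTTP/1.0" 200 318']
print(build_term_graph(logs))   # {('caravan', 'silk'): 1}
```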

  3. Stratigraphic framework of Cambrian and Ordovician rocks in the central Appalachian basin from Medina County, Ohio, through southwestern and south-central Pennsylvania to Hampshire County, West Virginia: Chapter E.2.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.

  4. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow from 1.0 to 7.0 and >7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method fits existing data significantly better than other methods.
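
    The branching scheme described above lends itself to a compact implementation. The sketch below mirrors its structure (ionic versus nonionic classification, separate equations by log Kow range, additive structural correction factors), but every numeric coefficient in it is an illustrative placeholder, not a fitted value from this paper.

      # All numeric coefficients below are illustrative placeholders,
      # NOT the fitted values from Meylan et al. (1999).

      def estimate_log_bcf(log_kow, ionic=False, corrections=()):
          """Estimate log BCF following the two-branch scheme sketched above."""
          if ionic:
              # Ionics: categorized by log Kow, assigned a fixed log BCF in
              # the 0.5-1.75 range; the cut points here are made up.
              if log_kow < 5.0:
                  return 0.5
              elif log_kow < 7.0:
                  return 1.0
              return 1.75
          # Nonionics: different equations for log Kow in [1, 7] and > 7.
          if log_kow <= 7.0:
              log_bcf = 0.8 * log_kow - 0.5    # placeholder slope/intercept
          else:
              log_bcf = -1.4 * log_kow + 14.0  # placeholder fall-off above 7
          return log_bcf + sum(corrections)

      print(estimate_log_bcf(4.2))                       # nonionic
      print(estimate_log_bcf(6.1, ionic=True))           # ionic category
      print(estimate_log_bcf(5.0, corrections=(-0.6,)))  # with a correction factor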

  5. Forensic Investigation of Cooperative Storage Cloud Service: Symform as a Case Study.

    PubMed

    Teing, Yee-Yang; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Dargahi, Tooska; Conti, Mauro

    2017-05-01

    Researchers envisioned Storage as a Service (StaaS) as an effective solution to the distributed management of digital data. Cooperative storage cloud forensics is a relatively new and under-explored area of research. Using Symform as a case study, we seek to determine the data remnants from the use of cooperative cloud storage services. In particular, we consider both mobile devices and personal computers running various popular operating systems, namely Windows 8.1, Mac OS X Mavericks 10.9.5, Ubuntu 14.04.1 LTS, iOS 7.1.2, and Android KitKat 4.4.4. Potential artifacts recovered during the research include data relating to the installation and uninstallation of the cloud applications, log-in to and log-out from the Symform account using the client application, and file synchronization, as well as their time stamp information. This research contributes to an in-depth understanding of the types of terrestrial artifacts that are likely to remain after the use of cooperative storage cloud on client devices. © 2016 American Academy of Forensic Sciences.

  6. Production, prices, employment, and trade in Northwest forest industries, third quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  7. Production, prices, employment, and trade in Northwest forest industries, all quarters 2000.

    Treesearch

    Debra D. Warren

    2002-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  8. Production, prices, employment, and trade in Northwest forest industries, all quarters 2002.

    Treesearch

    Debra D. Warren

    2004-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  9. Production, prices, employment, and trade in Northwest forest industries, all quarters 2005.

    Treesearch

    Debra D. Warren

    2007-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  10. Production, prices, employment, and trade in Northwest forest industries, all quarters 2006.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  11. Production, prices, employment, and trade in Northwest forest industries, all quarters 2004.

    Treesearch

    Debra D. Warren

    2006-01-01

    Provides current information on lumber and plywood production and prices; employment in forest industries; international trade in logs, lumber, and plywood; volumes and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  12. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  13. Using Learning Styles and Viewing Styles in Streaming Video

    ERIC Educational Resources Information Center

    de Boer, Jelle; Kommers, Piet A. M.; de Brock, Bert

    2011-01-01

    Improving the effectiveness of learning when students observe video lectures becomes urgent with the rising advent of (web-based) video materials. Vital questions are how students differ in their learning preferences and what patterns in viewing video can be detected in log files. Our experiments inventory students' viewing patterns while watching…

  14. Recommendations for Benchmarking Web Site Usage among Academic Libraries.

    ERIC Educational Resources Information Center

    Hightower, Christy; Sih, Julie; Tilghman, Adam

    1998-01-01

    To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…

  15. Motivational Aspects of Learning Genetics with Interactive Multimedia

    ERIC Educational Resources Information Center

    Tsui, Chi-Yan; Treagust, David F.

    2004-01-01

    A BioLogica trial in six U.S. schools, conducted by the Concord Consortium using an interpretive approach, examined student motivation in learning genetics. Multiple data sources, such as online tests, computer data log files, and classroom observations, were used, with results framed in terms of interviewees' perceptions, class-wide online…

  16. 16. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-5/8 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, NORTHEAST CORNER, INTERPRETIVE LOG TO LEFT. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  17. Production, prices, employment, and trade in Northwest forest industries, all quarters 1998.

    Treesearch

    Debra D. Warren

    2000-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  18. Production, prices, employment, and trade in Northwest forest industries, fourth quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  19. Production, prices, employment, and trade in Northwest forest industries, all quarters of 2007.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  20. Production, prices, employment, and trade in Northwest forest industries, all quarters 2003.

    Treesearch

    Debra D. Warren

    2005-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  1. Production, prices, employment, and trade in Northwest forest industries, all quarters 2008

    Treesearch

    Debra Warren

    2009-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  2. Data Retention Policy | High-Performance Computing | NREL

    Science.gov Websites

    HPC Data Retention Policy. File storage areas on Peregrine and Gyrfalcon are either user-centric … to reclaim storage. We can make special arrangements for permanent storage, if needed. The user-centric retention period is 3 months after the last project ends. During this retention period, the user may log in to

  3. Elementary School Students' Strategic Learning: Does Task-Type Matter?

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.

    2014-01-01

    This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it investigated which learning patterns actually emerge, and when, with respect to students' task solutions. The present study uses computer log file traces to investigate how…

  4. Patterns in Elementary School Students' Strategic Actions in Varying Learning Situations

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvenoja, Hanna; Järvelä, Sanna

    2013-01-01

    This study uses log file traces to examine differences between high-and low-achieving students' strategic actions in varying learning situations. In addition, this study illustrates, in detail, what strategic and self-regulated learning constitutes in practice. The study investigates the learning patterns that emerge in learning situations…

  5. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  6. 78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...

  7. Archive of side scan sonar and swath bathymetry data collected during USGS cruise 10CCT01 offshore of Cat Island, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Wiese, Dana S.

    2010-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys east of Cat Island, Mississippi (fig. 1). The efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geological stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and provide protection for the historic Fort Massachusetts. For more information refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar (SSS) data. Data products herein include gridded and interpolated surfaces, surface images, and x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report, or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 10CCT01 tells us the data were collected in 2010 for the Coastal Change and Transport (CCT) study and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay Catamaran. Side scan sonar and interferometric swath bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the port side just slightly behind the vessel, close to the seafloor. The interferometric swath transducer was sled-mounted on a rail attached between the catamaran hulls. During the survey the sled was secured into position. Navigation was acquired with a CodaOctopus Octopus F190 Precision Attitude and Positioning System and differentially corrected with OmniSTAR. See the digital FACS equipment log for details about the acquisition equipment used. Both raw datasets were stored digitally and processed using CARIS HIPS and SIPS software at the USGS St. Petersburg Coastal and Marine Science Center. For more information on processing refer to the Equipment and Processing page. Post-processing of the swath dataset revealed a motion artifact that is attributed to movement of the pole that the swath transducers are attached to in relation to the boat. The survey took place in the winter months, in which strong winds and rough waves contributed to a reduction in data quality. The rough seas contributed to both the movement of the pole and the very high noise base seen in the raw amplitude data of the side scan sonar.
Chirp data were also collected during this survey and are archived separately.

  8. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence an EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides an auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, its RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database while maintaining the same interface to existing applications. The logging capabilities are also beneficial to operators who are trying to recall how they solved a similar problem many days earlier: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. First, the sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF; the SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a RESTful web interface. This interface enables command-line tools as well as large, sophisticated programs to download SCMFs and RMLs on demand from the database, enabling a vast array of tools to be built on top of it. One such command-line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline, where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
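
    The core bookkeeping behind this auditing pattern is small. The sketch below stores each compiled binary product together with its inputs, keyed by a digest that could later be recovered from telemetry and used for lookup. The table layout, the use of SQLite, and the choice of SHA-256 are assumptions for illustration; the MSL server's actual schema, database, and hash algorithm are not reproduced here.

      import hashlib
      import sqlite3

      # Illustrative schema, not the MSL sequencing server's.
      db = sqlite3.connect(":memory:")
      db.execute("""CREATE TABLE scmf_log (
          hash TEXT PRIMARY KEY, rml_source TEXT, dictionary TEXT, scmf BLOB)""")

      def log_scmf(rml_source: str, dictionary: str, scmf: bytes) -> str:
          """Store a compiled SCMF and its inputs, keyed by the SCMF's digest."""
          digest = hashlib.sha256(scmf).hexdigest()
          db.execute("INSERT OR REPLACE INTO scmf_log VALUES (?, ?, ?, ?)",
                     (digest, rml_source, dictionary, scmf))
          return digest

      def lookup(digest: str):
          """Recover the original inputs and binary product from a hash."""
          return db.execute("SELECT rml_source, dictionary, scmf FROM scmf_log "
                            "WHERE hash = ?", (digest,)).fetchone()

      h = log_scmf("<sequence>...</sequence>", "cmd_dict_v3", b"\x01\x02\x03")
      print(lookup(h)[:2])  # the RML and dictionary, found from the hash alone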

  9. VizieR Online Data Catalog: Bessel (1825) calculation for geodesic measurements (Karney+, 2010)

    NASA Astrophysics Data System (ADS)

    Karney, C. F. F.; Deakin, R. E.

    2010-06-01

    The solution of the geodesic problem for an oblate ellipsoid is developed in terms of series. Tables are provided to simplify the computation. Included here are the tables that accompanied Bessel's paper (with corrections). The tables were crafted by Bessel to minimize the labor of hand calculations. To this end, he adjusted the intervals in the tables, the number of terms included in the series, and the number of significant digits given so that the final results are accurate to about 8 places. For that reason, the most useful form of the tables is the PDF file, which provides the tables in a layout close to the original. Also provided is the LaTeX source file for the PDF file. Finally, the data has been put into a format that can be read easily by computer programs. All the logarithms are in base 10 (common logarithms). The characteristic and the mantissa should be read separately (indicated as x.c and x.m in the file description). Thus the first entry in the table, -4.4, should be parsed as "-4" (the characteristic) and ".4" (the mantissa); the anti-log for this entry is 10^(-4+0.4) = 2.5e-4. The "Delta" columns give the first difference of the preceding column, i.e., the difference between the preceding column in the next row and the preceding column in the current row. In the printed tables these are expressed as "units in the last place", and the differences are of the rounded representations in the preceding columns (to minimize interpolation errors). In table1.dat these are given scaled to match the format used for the preceding column, as indicated by the units given for these columns. The unit log(") (in the description, within square brackets [arcsec]) means the logarithm of a quantity expressed in arcseconds. (3 data files).
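
    The parsing rule is easy to get wrong, so a two-line check helps. The snippet below applies the stated convention (characteristic and mantissa read separately, then recombined as a power of ten) and reproduces the worked example from the description.

      def antilog(characteristic: int, mantissa: float) -> float:
          """Anti-log of a table entry stored as separate characteristic/mantissa."""
          return 10.0 ** (characteristic + mantissa)

      # The table entry -4.4 means characteristic -4, mantissa +0.4.
      print(antilog(-4, 0.4))  # ~2.51e-4, matching the worked example above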

  10. Measurement, Modeling, and Analysis of a Large-scale Blog Server Workload

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Hwang, Jeaho; Kim, Youngjae

    2010-01-01

    Despite the growing popularity of Online Social Networks (OSNs), the workload characteristics of OSN servers, such as those hosting blog services, are not well understood. Understanding workload characteristics is important for optimizing and improving the performance of current systems and software based on observed trends. Thus, in this paper, we characterize the system workload of the largest blog hosting service in South Korea, Tistory. In addition to understanding the system workload of the blog hosting server, we have developed synthesized workloads and obtained the following major findings: (i) the transfer size of non-multimedia files and blog articles can be modeled by a truncated Pareto distribution and a log-normal distribution, respectively, and (ii) users' accesses to blog articles do not show temporal locality, but they are strongly biased toward articles posted along with images or audio.
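
    Finding (i) is directly usable for workload synthesis. The sketch below draws synthetic transfer sizes from the two named models, a truncated Pareto via inverse-CDF sampling and a log-normal; all parameter values in it are illustrative placeholders rather than the values fitted to the Tistory traces.

      import numpy as np

      rng = np.random.default_rng(0)

      def truncated_pareto(shape, lo, hi, n):
          """Inverse-CDF sampling of a Pareto(shape, xm=lo) truncated at hi."""
          u = rng.random(n)
          return lo / (1.0 - u * (1.0 - (lo / hi) ** shape)) ** (1.0 / shape)

      # Placeholder parameters, not the fitted Tistory values.
      nonmedia_bytes = truncated_pareto(shape=1.2, lo=1e3, hi=1e7, n=100_000)
      article_bytes = rng.lognormal(mean=8.5, sigma=1.2, size=100_000)

      print(f"non-multimedia: median={np.median(nonmedia_bytes):.0f} B, "
            f"max={nonmedia_bytes.max():.0f} B")
      print(f"articles:       median={np.median(article_bytes):.0f} B")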

  11. Creating an EPICS Based Test Stand Development System for a BPM Digitizer of the Linac Coherent Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-06-22

    The Linac Coherent Light Source (LCLS) is required to deliver a high quality electron beam for producing coherent X-rays. As a result, high resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use the Experimental Physics and Industrial Control System (EPICS). This paper discusses the transition to providing similar as well as enhanced functionalities, beyond those offered by Matlab, to test the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.

  12. Murmer, a message generator and reporter for Unix, VMS, and VxWorks

    NASA Astrophysics Data System (ADS)

    Oleynik, G.; Appleton, B.; Moore, C.; Sergey, G.; Udumula, L.

    1994-02-01

    Murmer is a Unix-based message generation, reporting, display, and logging system that we have developed for use in data acquisition systems at Fermilab. Murmer is a tool for the production and management of message reporting. Its usefulness ranges from software product development and maintenance to system-level shakedown and diagnostics. Murmer provides a VMS MESSAGE-like function code generation utility, a client routine package for sending these codes over the network to a central server, and a server which translates the codes into meaningful visual information, writes the information to a logfile, and displays it on B&W or color X windows. Because Murmer stores message information in keyed access files, it can provide advanced features such as popping up help when a displayed message is clicked on with the mouse and executing 'action' shell scripts when selected messages are received by the server.
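
    A toy version of this reporting pattern is sketched below: a client sends a small numeric message code over the network, and a central server translates it through a keyed lookup table, then logs the readable text. The code table, the UDP wire format, and the severity labels are all invented for illustration; Murmer's actual facility codes and protocol are not reproduced here.

      import logging
      import socket
      import struct

      MESSAGE_TABLE = {  # stand-in for keyed access files mapping codes to text
          1001: ("INFO", "run started"),
          2002: ("WARNING", "buffer nearly full"),
          3003: ("ERROR", "readout link down"),
      }

      logging.basicConfig(filename="murmer_demo.log", level=logging.INFO,
                          format="%(asctime)s %(message)s")

      def client_send(sock, addr, code):
          sock.sendto(struct.pack("!I", code), addr)   # 4-byte big-endian code

      def server_handle(sock):
          data, peer = sock.recvfrom(4)
          (code,) = struct.unpack("!I", data)
          severity, text = MESSAGE_TABLE.get(code, ("UNKNOWN", f"code {code}"))
          logging.info("%s from %s: %s", severity, peer[0], text)

      server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      server.bind(("127.0.0.1", 0))                    # ephemeral port for the demo
      client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      client_send(client, server.getsockname(), 2002)
      server_handle(server)                            # one message, then done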

  13. Grid-wide neuroimaging data federation in the context of the NeuroLOG project

    PubMed Central

    Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem

    2010-01-01

    Grid technologies are appealing for dealing with the challenges raised by computational neurosciences and for supporting multi-centric brain studies. However, core grid middleware hardly copes with the complex neuroimaging data representation and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot simply be superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata, and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them, and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The HyperTerminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the HyperTerminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square (RMS) errors and cumulative MLC travel distance per motor. An in-house MATLAB code was used to analyze all dynalog files, record RMS errors, and calculate the distance each MLC leaf traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 included an MLC motor leaf change, 39 T-nut replacements, and 84 MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 recorded 73% of all interventions, while the same leaves in the B bank accounted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500 m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: An MLC dynalog file analysis software was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC leaf travels, the RMS error, and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and, with proper and timely intervention, reduce the downtime of the linac during clinical hours.
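
    The two quantities tracked by the in-house analysis, per-leaf RMS error and cumulative travel, reduce to a few array operations once the dynalog positions are in hand. The sketch below assumes planned and actual leaf positions as (samples x leaves) arrays in centimeters; the actual Varian dynalog file parsing is omitted.

      import numpy as np

      def leaf_stats(planned: np.ndarray, actual: np.ndarray):
          """RMS positioning error and cumulative travel distance per leaf."""
          errors = actual - planned
          rms_per_leaf = np.sqrt(np.mean(errors ** 2, axis=0))
          travel_per_leaf = np.sum(np.abs(np.diff(actual, axis=0)), axis=0)
          return rms_per_leaf, travel_per_leaf

      # Synthetic stand-in for parsed dynalog data: 2000 samples x 60 leaves.
      rng = np.random.default_rng(1)
      planned = np.cumsum(rng.normal(0, 0.05, size=(2000, 60)), axis=0)
      actual = planned + rng.normal(0, 0.01, size=planned.shape)  # ~0.1 mm jitter

      rms, travel = leaf_stats(planned, actual)
      print(f"worst RMS error: {rms.max():.4f} cm on leaf {rms.argmax() + 1}")
      print(f"busiest motor:   {travel.max():.1f} cm on leaf {travel.argmax() + 1}")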

  15. Overview of GSE as a multifunctional GUI

    NASA Astrophysics Data System (ADS)

    Kurtovich, Boyan; Malangone, Fabio; Voss, David L.; Carssow, Douglas B.; Fritz, Theodore A.; Mavretic, Anton

    2009-08-01

    Ground Support Equipment (GSE) [1] is a versatile and multifunctional graphical user interface (GUI) and a software/hardware platform. It is a custom-designed system executed in the LabVIEW programming language to serve as an instrument health monitor for the Loss Cone Imager (LCI) satellite project. GSE mimics the behavior of the onboard Experiment Computer System (ECS). Its functions comprise the measurement of voltage, current, and power, as well as acting as a safety mechanism in case of any anomalous condition (e.g., over-current and/or over-voltage situation). Individual log files record the sessions during which data is gathered and analyzed. Safety/warning alarm flags shall be 'visible' from any individual window/tab. Analog-to-Digital Conversion (ADC) particle group measurements will be displayed on six individual panels. GSE will be supplemented with a comprehensive user's manual for added clarity.

  16. Titian: Data Provenance Support in Spark

    PubMed Central

    Interlandi, Matteo; Shah, Kshitij; Tetali, Sai Deep; Gulzar, Muhammad Ali; Yoo, Seunghyun; Kim, Miryung; Millstein, Todd; Condie, Tyson

    2015-01-01

    Debugging data processing logic in Data-Intensive Scalable Computing (DISC) systems is a difficult and time consuming effort. Today’s DISC systems offer very little tooling for debugging programs, and as a result programmers spend countless hours collecting evidence (e.g., from log files) and performing trial and error debugging. To aid this effort, we built Titian, a library that enables data provenance—tracking data through transformations—in Apache Spark. Data scientists using the Titian Spark extension will be able to quickly identify the input data at the root cause of a potential bug or outlier result. Titian is built directly into the Spark platform and offers data provenance support at interactive speeds—orders-of-magnitude faster than alternative solutions—while minimally impacting Spark job performance; observed overheads for capturing data lineage rarely exceed 30% above the baseline job execution time. PMID:26726305

  17. Analysis of Student Activity in Web-Supported Courses as a Tool for Predicting Dropout

    ERIC Educational Resources Information Center

    Cohen, Anat

    2017-01-01

    Persistence in learning processes is perceived as a central value; therefore, dropouts from studies are a prime concern for educators. This study focuses on the quantitative analysis of data accumulated on 362 students in three academic course website log files in the disciplines of mathematics and statistics, in order to examine whether student…

  18. Self-Regulation during E-Learning: Using Behavioural Evidence from Navigation Log Files

    ERIC Educational Resources Information Center

    Jeske, D.; Backhaus, J.; Stamov Roßnagel, C.

    2014-01-01

    The current paper examined the relationship between perceived characteristics of the learning environment in an e-module in relation to test performance among a group of e-learners. Using structural equation modelling, the relationship between these variables is further explored in terms of the proposed double mediation as outlined by Ning and…

  19. Microanalytic Case studies of Individual Participation Patterns in an Asynchronous Online Discussion in an Undergraduate Blended Course

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Perera, Nishan; Hsiao, Ying-Ting; Speer, Jennifer; Marbouti, Farshid

    2012-01-01

    This study presents three case studies of students' participation patterns in an online discussion to address the gap in our current understanding of how "individuals" experience asynchronous learning environments. Cases were constructed via microanalysis of log-file data, post contents, and the evolving discussion structure. The first student was…

  20. Query Classification and Study of University Students' Search Trends

    ERIC Educational Resources Information Center

    Maabreh, Majdi A.; Al-Kabi, Mohammed N.; Alsmadi, Izzat M.

    2012-01-01

    Purpose: This study is an attempt to develop an automatic identification method for Arabic web queries and divide them into several query types using data mining. In addition, it seeks to evaluate the impact of the academic environment on using the internet. Design/methodology/approach: The web log files were collected from one of the higher…

  1. VizieR Online Data Catalog: GAMA. Stellar mass budget (Moffett+, 2016)

    NASA Astrophysics Data System (ADS)

    Moffett, A. J.; Lange, R.; Driver, S. P.; Robotham, A. S. G.; Kelvin, L. S.; Alpaslan, M.; Andrews, S. K.; Bland-Hawthorn, J.; Brough, S.; Cluver, M. E.; Colless, M.; Davies, L. J. M.; Holwerda, B. W.; Hopkins, A. M.; Kafle, P. R.; Liske, J.; Meyer, M.

    2018-04-01

    Using the recently expanded Galaxy and Mass Assembly (GAMA) survey phase II visual morphology sample and the large-scale bulge and disc decomposition analysis of Lange et al. (2016MNRAS.462.1470L), we derive new stellar mass function fits to galaxy spheroid and disc populations down to log(M*/M☉) = 8. (1 data file).

  2. 76 FR 54835 - Child Labor Regulations, Orders and Statements of Interpretation; Child Labor Violations-Civil...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ....m. in your local time zone, or log onto the Wage and Hour Division's Web site for a nationwide... INFORMATION: I. Electronic Access and Filing Comments Public Participation: This notice of proposed rulemaking is available through the Federal Register and the http://www.regulations.gov Web site. You may also...

  3. 15. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-1/2 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, LOOKING SOUTHWEST, SHOWING INTERPRETIVE LOG AND PROTECTION ASSISTANT'S HOUSE IN BACKGROUND. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  4. Negotiating the Context of Online In-Service Training: "Expert" and "Non-Expert" Footings

    ERIC Educational Resources Information Center

    Nilsen, Mona

    2010-01-01

    This paper focuses on how people working in the Swedish food production industry engage in in-service training by means of computer-mediated communication. The empirical material consists of archived chat log files from a course concerning quality assurance and food safety hazards control in the preparation and handling of foodstuff. Drawing on…

  5. Learner Characteristics Predict Performance and Confidence in E-Learning: An Analysis of User Behavior and Self-Evaluation

    ERIC Educational Resources Information Center

    Jeske, Debora; Roßnagell, Christian Stamov; Backhaus, Joy

    2014-01-01

    We examined the role of learner characteristics as predictors of four aspects of e-learning performance, including knowledge test performance, learning confidence, learning efficiency, and navigational effectiveness. We used both self reports and log file records to compute the relevant statistics. Regression analyses showed that both need for…

  6. Digging Deeper into Learners' Experiences in MOOCs: Participation in Social Networks outside of MOOCs, Notetaking and Contexts Surrounding Content Consumption

    ERIC Educational Resources Information Center

    Veletsianos, George; Collier, Amy; Schneider, Emily

    2015-01-01

    Researchers describe with increasing confidence "what" they observe participants doing in massive open online courses (MOOCs). However, our understanding of learner activities in open courses is limited by researchers' extensive dependence on log file analyses and clickstream data to make inferences about learner behaviors. Further, the…

  7. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  8. Making Sense of Students' Actions in an Open-Ended Virtual Laboratory Environment

    ERIC Educational Resources Information Center

    Gal, Ya'akov; Uzan, Oriel; Belford, Robert; Karabinos, Michael; Yaron, David

    2015-01-01

    A process for analyzing log files collected from open-ended learning environments is developed and tested on a virtual lab problem involving reaction stoichiometry. The process utilizes a set of visualization tools that, by grouping student actions in a hierarchical manner, helps experts make sense of the linear list of student actions recorded in…

  9. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python, and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, the effective precipitation, runoff, lithology, and land use. All these parameters can be served to the client from other WFS services or by uploading and processing the data on the server. The user can choose to create the first and second derivatives from the DEM automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using the NetCDF file format. The model is run in a monthly time step, and for each time step all the parameter values and the a priori, conditional, and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server with the log file.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meneses, Esteban; Ni, Xiang; Jones, Terry R

    The unprecedented computational power of current supercomputers now makes possible the exploration of complex problems in many scientific fields, from genomic analysis to computational fluid dynamics. Modern machines are powerful because they are massive: they assemble millions of cores and a huge quantity of disks, cards, routers, and other components. But it is precisely the size of these machines that clouds the future of supercomputing. A system that comprises many components has a high chance of failing, and failing often. In order to make the next generation of supercomputers usable, it is imperative to use some type of fault tolerance platform to run applications on large machines. Most fault tolerance strategies can be optimized for the peculiarities of each system and boost efficacy by keeping the system productive. In this paper, we aim to understand how failure characterization can improve resilience in several layers of the software stack: applications, runtime systems, and job schedulers. We examine the Titan supercomputer, one of the fastest systems in the world. We analyze a full year of Titan in production and distill the failure patterns of the machine. By looking into Titan's log files and using the criteria of experts, we provide a detailed description of the types of failures. In addition, we inspect the job submission files and describe how the system is used. Using those two sources, we cross-correlate failures in the machine to executing jobs and provide a picture of how failures affect the user experience. We believe such characterization is fundamental in developing appropriate fault tolerance solutions for Cray systems similar to Titan.
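
    The cross-correlation step can be pictured as an interval join between failure timestamps and job lifetimes. The sketch below shows that join in miniature, attributing each failure to the jobs running when it occurred; plain numeric timestamps stand in for the far richer Titan log and job records.

      from bisect import bisect_right

      def jobs_hit_by_failures(jobs, failures):
          """jobs: list of (job_id, start, end); failures: sorted list of times."""
          hits = {}
          for job_id, start, end in jobs:
              # count failures whose timestamp falls inside [start, end]
              n = bisect_right(failures, end) - bisect_right(failures, start - 1e-9)
              if n:
                  hits[job_id] = n
          return hits

      jobs = [("job-17", 0.0, 50.0), ("job-18", 40.0, 120.0), ("job-19", 130.0, 200.0)]
      failures = sorted([45.0, 110.0, 111.0])
      print(jobs_hit_by_failures(jobs, failures))  # {'job-17': 1, 'job-18': 3}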

  11. Coastal single-beam bathymetry data collected in 2015 from Raccoon Point to Point Au Fer Island, Louisiana

    USGS Publications Warehouse

    Stalk, Chelsea A.; DeWitt, Nancy T.; Kindinger, Jack L.; Flocks, James G.; Reynolds, Billy J.; Kelso, Kyle W.; Fredericks, Joseph J.; Tuten, Thomas M.

    2017-03-10

    As part of the Barrier Island Comprehensive Monitoring Program (BICM), scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore single-beam bathymetry survey along the south-central coast of Louisiana, from Raccoon Point to Point Au Fer Island, in July 2015. The goal of the BICM program is to provide long-term data on Louisiana’s coastline and use this data to plan, design, evaluate, and maintain current and future barrier island restoration projects. The data described in this report will provide baseline bathymetric information for future research investigating island evolution, sediment transport, and recent and long-term geomorphic change, and will support modeling of future changes in response to restoration and storm impacts. The survey area encompasses more than 300 square kilometers of nearshore environment from Raccoon Point to Point Au Fer Island. This data series serves as an archive of processed single-beam bathymetry data, collected from July 22–29, 2015, under USGS Field Activity Number 2015-320-FA. Geographic information system data products include a 200-meter-cell-size interpolated bathymetry grid, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  12. A compiler and validator for flight operations on NASA space missions

    NASA Astrophysics Data System (ADS)

    Fonte, Sergio; Politi, Romolo; Capria, Maria Teresa; Giardino, Marco; De Sanctis, Maria Cristina

    2016-07-01

    In NASA missions, the management and programming of the flight systems is performed with a specific scripting language, the SASF (Spacecraft Activity Sequence File). In order to check the syntax and grammar, a compiler is needed that flags any errors found in the sequence file produced for an instrument on board the flight system. From our experience on the Dawn mission, we developed VIRV (VIR Validator), a tool that performs checks on the syntax and grammar of SASF, runs simulations of VIR acquisitions, and detects violations of the flight rules in the sequences produced. The project of a SASF compiler (SSC - Spacecraft Sequence Compiler) is ready for a new implementation: generalization to different NASA missions. In fact, VIRV is a compiler for a dialect of SASF; it includes VIR commands as part of the SASF language. Our goal is to produce a general compiler for SASF, in which every instrument has a library to be introduced into the compiler. The SSC can analyze a SASF, produce a log of events, perform a simulation of the instrument acquisition, and check the flight rules for the selected instrument. The output of the program can be produced in GRASS GIS format and may help the operator analyze the geometry of the acquisition.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X; Yang, F

    Purpose: Knowing MLC leaf positioning error over the course of treatment would be valuable for treatment planning, QA design, and patient safety. The objective of the current study was to quantify the MLC positioning accuracy for VMAT delivery of head and neck treatment plans. Methods: A total of 837 MLC log files were collected from 14 head and neck cancer patients undergoing full-arc VMAT treatment on one Varian Trilogy machine. The actual and planned leaf gaps were extracted from the retrieved MLC log files. For a given patient, the leaf gap error percentage (LGEP), defined as the ratio of the actual leaf gap over the planned, was evaluated for each leaf pair at all the gantry angles recorded over the course of the treatment. Statistics describing the distribution of the largest LGEP (LLGEP) of the 60 leaf pairs, including the maximum, minimum, mean, kurtosis, and skewness, were evaluated. Results: For the 14 studied patients, the PTVs were located at the tonsil, base of tongue, larynx, supraglottis, nasal cavity, and thyroid gland, with volumes ranging from 72.0 cm^3 to 602.0 cm^3. The identified LLGEP differed between patients. It ranged from 183.9% to 457.7% with a mean of 368.6%. For the majority of the patients, the LLGEP distributions peaked at non-zero positions and showed no obvious dependence on gantry rotation. Kurtosis and skewness, with minimum/maximum of 66.6/217.9 and 6.5/12.6, respectively, suggested a relatively peaked and right-skewed leaf error distribution pattern. Conclusion: The results indicate that the pattern of MLC leaf gap error differs between patients with lesions located at similar anatomic sites. Understanding the systemic mechanisms underlying these observed error patterns necessitates examining more patient-specific plan parameters in a large patient cohort setting.
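
    The LGEP metric and its summary statistics are straightforward to compute once the gaps are extracted. The sketch below evaluates LGEP per leaf pair and sample, takes the largest value per pair (LLGEP), and summarizes the distribution; the synthetic gap arrays merely stand in for values parsed from real MLC log files.

      import numpy as np
      from scipy.stats import kurtosis, skew

      def llgep_summary(planned_gap: np.ndarray, actual_gap: np.ndarray):
          """Gaps are (samples x leaf_pairs) arrays in mm."""
          lgep = 100.0 * actual_gap / planned_gap  # percent, per sample and pair
          llgep = lgep.max(axis=0)                 # largest LGEP per leaf pair
          return {"max": llgep.max(), "min": llgep.min(), "mean": llgep.mean(),
                  "kurtosis": kurtosis(llgep), "skewness": skew(llgep)}

      # Synthetic gaps standing in for values extracted from MLC log files.
      rng = np.random.default_rng(2)
      planned = rng.uniform(5.0, 40.0, size=(5000, 60))
      actual = planned * rng.lognormal(0.0, 0.05, size=planned.shape)
      print(llgep_summary(planned, actual))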

  14. A dosimetric evaluation of the Eclipse AAA algorithm and Millennium 120 MLC for cranial intensity-modulated radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo Ortega, Juan Francisco, E-mail: jfcdrr@yahoo.es; Moragues, Sandra; Pozo, Miquel

    2014-07-01

    The aim of this study is to assess the accuracy of a convolution-based algorithm (the anisotropic analytical algorithm [AAA]) implemented in the Eclipse planning system for intensity-modulated radiosurgery (IMRS) planning of small cranial targets using a 5-mm leaf-width multileaf collimator (MLC). Overall, 24 patient-based IMRS plans for cranial lesions of variable size (0.3 to 15.1 cc) were planned (Eclipse, AAA, version 10.0.28) using fixed field-based IMRS produced by a Varian linear accelerator equipped with a 120 MLC (5-mm width on central leaves). Plan accuracy was evaluated with phantom-based measurements performed with radiochromic film (EBT2, ISP, Wayne, NJ). Film 2D dose distributions were analyzed with the FilmQA Pro software (version 2011, Ashland, OH) using the triple-channel dosimetry method. Comparison between computed and measured 2D dose distributions was performed using the gamma method (3%/1 mm). Performance of the MLC was checked by inspection of the DynaLog files created by the linear accelerator during the delivery of each dynamic field. The absolute difference between the calculated and measured isocenter doses for all the IMRS plans was 2.5% ± 2.1%. The gamma evaluation method resulted in high average passing rates of 98.9% ± 1.4% (red channel) and 98.9% ± 1.5% (blue and green channels). DynaLog file analysis revealed a maximum root mean square error of 0.46 mm. According to our results, we conclude that the Eclipse/AAA algorithm provides accurate cranial IMRS dose distributions that can be accurately delivered by a Varian linac equipped with a Millennium 120 MLC.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Ayan, A; Woollard, J

    Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is increased by a lack of in-house clinical engineering support. Preventative maintenance attempts to assuage downtime but is often ineffective at preemptively preventing many failure modes, such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor, among other things. To attempt to alleviate downtime, software was developed in-house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, this data is stored in a SQL database. Microsoft Power BI is used to plot the average maximum error of each day for each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs were written into the database and plotted in Power BI. Throughout the course of analysis, MLC motors were replaced on three machines due to the early warning of the trajectory log analysis. The service engineers were also alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing the trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to determine a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.
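
    The reduction from a trajectory log to a trendable table is simple in outline. The sketch below condenses each delivery into the maximum absolute error per axis and accumulates daily aggregates in a SQL table, mirroring the pipeline described; the axis names, table layout, and synthetic data are assumptions, and real TrueBeam trajectory-log parsing is omitted.

      import sqlite3
      import numpy as np

      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE axis_error (day TEXT, machine TEXT, axis TEXT, "
                 "max_abs_error REAL)")

      def record_log(day, machine, expected, actual, axes):
          """expected/actual: (samples x axes) arrays from one delivery."""
          max_err = np.abs(actual - expected).max(axis=0)
          db.executemany("INSERT INTO axis_error VALUES (?, ?, ?, ?)",
                         [(day, machine, ax, float(e)) for ax, e in zip(axes, max_err)])

      rng = np.random.default_rng(3)
      axes = ["gantry", "coll", "jaw_x1"]
      for day in ("2016-01-04", "2016-01-05"):
          expected = rng.normal(size=(1000, 3))
          record_log(day, "TB-1", expected,
                     expected + rng.normal(0, 0.01, (1000, 3)), axes)

      # Daily mean of the per-delivery maxima: the quantity trended in Power BI.
      for row in db.execute("SELECT day, axis, AVG(max_abs_error) FROM axis_error "
                            "GROUP BY day, axis"):
          print(row)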

  16. lsjk—a C++ library for arbitrary-precision numeric evaluation of the generalized log-sine functions

    NASA Astrophysics Data System (ADS)

    Kalmykov, M. Yu.; Sheplyakov, A.

    2005-10-01

    Generalized log-sine functions Ls_j^(k)(θ) appear in the higher-order ɛ-expansion of different Feynman diagrams. We present an algorithm for the numerical evaluation of these functions for real arguments. This algorithm is implemented as a C++ library with arbitrary-precision arithmetic for integer 0 ⩽ k ⩽ 9 and j ⩾ 2. Some new relations and representations of the generalized log-sine functions are given. Program summary: Title of program: lsjk. Catalogue number: ADVS. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVS. Program obtained from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing terms: GNU General Public License. Computers: all. Operating systems: POSIX. Programming language: C++. Memory required to execute: depending on the complexity of the problem, at least 32 MB RAM recommended. No. of lines in distributed program, including testing data, etc.: 41,975. No. of bytes in distributed program, including testing data, etc.: 309,156. Distribution format: tar.gz. Other programs called: the CLN library for arbitrary-precision arithmetic is required, at version 1.1.5 or greater. External files needed: none. Nature of the physical problem: numerical evaluation of the generalized log-sine functions for real argument in the region 0 < θ < π; these functions appear in Feynman integrals. Method of solution: series representation for real argument in the region 0 < θ < π. Restriction on the complexity of the problem: limited up to Ls_j^(9)(θ), where j is an arbitrary integer; thus, all functions up to weight 12 in the region 0 < θ < π can be evaluated. The algorithm can be extended to higher values of k (k > 9) without modification. Typical running time: depending on the complexity of the problem; see text below.
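
    For low weights, the values lsjk produces can be cross-checked against the standard integral definition Ls_j^(k)(θ) = -∫_0^θ x^k ln^(j-k-1)|2 sin(x/2)| dx using ordinary double-precision quadrature, as sketched below. This is only a sanity check: the library exists precisely because high weights demand arbitrary precision and series representations.

      from math import log, pi, sin
      from scipy.integrate import quad

      def ls(j: int, k: int, theta: float) -> float:
          """Ls_j^(k)(theta) by direct quadrature; the log singularity at 0
          is integrable, so adaptive quadrature converges for 0 < theta < pi."""
          integrand = lambda x: x**k * log(abs(2.0 * sin(x / 2.0))) ** (j - k - 1)
          val, _ = quad(integrand, 0.0, theta, limit=200)
          return -val

      # Ls_2^(0) is the Clausen function Cl_2; Cl_2(pi/2) is Catalan's constant.
      print(ls(2, 0, pi / 2))  # ~0.9159655942
      print(ls(3, 1, pi / 3))  # a weight-3 value one could compare against lsjk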

  17. Archive of Digitized Analog Boomer and Minisparker Seismic Reflection Data Collected from the Alabama-Mississippi-Louisiana Shelf During Cruises Onboard the R/V Carancahua and R/V Gyre, April and July, 1981

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In April and July of 1981, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Alabama-Mississippi-Louisiana Shelf in the northern Gulf of Mexico. Work was conducted onboard the Texas A&M University R/V Carancahua and the R/V Gyre to develop a geologic understanding of the study area and to locate potential hazards related to offshore oil and gas production. While the R/V Carancahua only collected boomer data, the R/V Gyre used a 400-Joule minisparker, 3.5-kilohertz (kHz) subbottom profiler, 12-kHz precision depth recorder, and two air guns. The authors selected the minisparker data set because, unlike with the boomer data, it provided the most complete record. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer and minisparker paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  18. Introduction to Data Acquisition 3. Let’s Acquire Data!

    NASA Astrophysics Data System (ADS)

    Nakanishi, Hideya; Okumura, Haruhiko

    In fusion experiments, diagnostic control and logging devices are usually connected through a field bus, e.g., GP-IB. Internet technologies are often applied for their remote operation. All equipment and digitizers are driven by pre-programmed sequences, in which clocks and triggers give the essential timing for data acquisition. The data production rate and volume must be checked against the transfer and storage rates. To store binary raw data safely, journaling file systems are preferably used with redundant disks (RAID) or a mirroring mechanism such as “rsync”. A proper choice of data compression method not only reduces the storage size but also improves I/O throughput. A DBMS is even applicable for quick searches or for security around the table data.
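
    The compression trade-off mentioned above is easy to demonstrate. The sketch below round-trips a synthetic digitizer waveform through zlib, showing that the scheme is lossless and that the size reduction, on slow disks or links, translates into higher effective I/O throughput; the waveform and compression level are arbitrary choices for illustration.

      import zlib
      import numpy as np

      # A smooth synthetic waveform stands in for digitizer output.
      samples = (1000.0 * np.sin(np.linspace(0, 100, 1_000_000))).astype(np.int16)
      raw = samples.tobytes()

      packed = zlib.compress(raw, level=6)
      print(f"raw: {len(raw)} B, compressed: {len(packed)} B "
            f"({len(raw) / len(packed):.1f}x smaller)")

      restored = np.frombuffer(zlib.decompress(packed), dtype=np.int16)
      assert np.array_equal(samples, restored)  # lossless round trip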

  19. VizieR Online Data Catalog: AO imaging of KOIs with gas giant planets (Wang+, 2015)

    NASA Astrophysics Data System (ADS)

    Wang, J.; Fischer, D. A.; Horch, E. P.; Xie, J.-W.

    2017-09-01

    From the NASA Exoplanet Archive (http://exoplanetarchive.ipac.caltech.edu), we select Kepler Objects of Interest (KOIs) that satisfy the following criteria: (1) disposition of either Candidate or Confirmed, (2) stellar effective temperature (Teff) lower than 6500 K, (3) stellar surface gravity (log g) higher than 4.0, (4) Kepler magnitude (KP) brighter than 14th mag, (5) with at least one gas giant planet (3.8 R{earth}=

  20. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position, and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC, and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
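
    The comparison step in records like this one is a gamma analysis. Below is a minimal brute-force sketch of a global 2D gamma pass-rate computation (dose difference plus distance-to-agreement); it is not the authors' software, and the grid spacing, criteria, and low-dose cutoff are illustrative assumptions:

      import numpy as np

      def gamma_pass_rate(ref, ev, spacing_mm, dose_pct=3.0, dta_mm=3.0, cutoff=0.10):
          """Percent of reference points with gamma <= 1 (global normalization)."""
          ny, nx = ref.shape
          dose_tol = dose_pct / 100.0 * ref.max()        # global dose criterion
          y, x = np.mgrid[0:ny, 0:nx].astype(float) * spacing_mm
          search = int(np.ceil(3 * dta_mm / spacing_mm)) # limit the search window
          gammas = []
          for iy in range(ny):
              for ix in range(nx):
                  if ref[iy, ix] < cutoff * ref.max():   # skip the low-dose region
                      continue
                  y0, y1 = max(0, iy - search), min(ny, iy + search + 1)
                  x0, x1 = max(0, ix - search), min(nx, ix + search + 1)
                  dd = (ev[y0:y1, x0:x1] - ref[iy, ix]) / dose_tol
                  dist = np.hypot(y[y0:y1, x0:x1] - y[iy, ix],
                                  x[y0:y1, x0:x1] - x[iy, ix]) / dta_mm
                  gammas.append(np.min(np.hypot(dd, dist)))
          return 100.0 * np.mean(np.array(gammas) <= 1.0)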

  1. SU-F-T-308: Mobius FX Evaluation and Comparison Against a Commercial 4D Detector Array for VMAT Plan QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vazquez Quino, L; Huerta Hernandez, C; Morrow, A

    2016-06-15

    Purpose: To evaluate the use of MobiusFX as a pre-treatment verification IMRT QA tool and compare it with a commercial 4D detector array for VMAT plan QA. Methods: 15 VMAT plan QAs for different treatment sites were delivered and measured by traditional means with the 4D detector array ArcCheck (Sun Nuclear Corporation); at the same time, the linac treatment logs (Varian Dynalog files) from the same deliveries were analyzed with the MobiusFX software (Mobius Medical Systems). VMAT plan QAs created in the Eclipse treatment planning system (Varian) for a TrueBeam linac (Varian) were delivered and analyzed with the gamma analysis routine of the SNPA software (Sun Nuclear Corporation). Results: Comparable results are observed in the comparison among MobiusFX, the ArcCheck measurements, and the dose calculated by the treatment planning system, with an average gamma passing rate of 99.06% at the 3%/3mm criterion. At a stricter criterion (1%/1mm), larger discrepancies are observed in different regions of the measurements, with an average gamma of 66.24% between MobiusFX and ArcCheck. Conclusion: This work indicates the potential for using MobiusFX as a routine pre-treatment patient-specific IMRT quality assurance method, and highlights its advantages as a phantom-less method, which reduces the time required for IMRT QA measurement. MobiusFX is capable of producing results similar to those of the traditional methods used for patient-specific pre-treatment VMAT QA verification. Even though the gamma results compared with the TPS are similar, the analysis of both methods shows that the errors identified by each method are found in different regions. Traditional methods like ArcCheck are sensitive to setup errors and dose difference errors coming from the linac output. On the other hand, linac log file analysis records errors in the VMAT QA associated with the MLCs and gantry motion that cannot be detected by traditional methods.

  2. On the Automation of the MarkIII Data Analysis System.

    NASA Astrophysics Data System (ADS)

    Schwegmann, W.; Schuh, H.

    1999-03-01

    A faster and semiautomatic data analysis is an important contribution to the acceleration of the VLBI procedure. A concept for the automation of one of the most widely used VLBI software packages, the MarkIII Data Analysis System, was developed. Then the program PWXCB, which extracts weather and cable calibration data from the station log files, was automated by supplementing the existing Fortran77 program code. The new program XLOG and its results will be presented. Most of the tasks in VLBI data analysis are very complex, and their automation requires typical knowledge-based techniques. Thus, a knowledge-based system (KBS) for support and guidance of the analyst is being developed using the AI workbench BABYLON, which is based on methods of artificial intelligence (AI). The advantages of a KBS for the MarkIII Data Analysis System and the steps required to build a KBS will be demonstrated. Examples of the current status of the project will be given, too.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, Garth

    Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz. Because the Institute focuses on low-level file systems and storage systems, its role in improving SciDAC systems was one of supporting application middleware such as data management and system-level performance tuning. In retrospect, the Petascale Data Storage Institute's most innovative and impactful contribution is the Parallel Log-structured File System (PLFS). Published in SC09, PLFS is middleware that operates in MPI-IO or embedded in FUSE for non-MPI applications. Its function is to decouple concurrently written files into a per-process log file, whose impact (the contents of the single file that the parallel application was concurrently writing) is determined on later reading, rather than during its writing. PLFS is transparent to the parallel application, offering a POSIX or MPI-IO interface, and it shows an order of magnitude speedup on the Chombo benchmark and two orders of magnitude on the FLASH benchmark. Moreover, LANL production applications see speedups of 5X to 28X, so PLFS has been put into production at LANL. Originally conceived and prototyped in a PDSI collaboration between LANL and CMU, it has grown to engage many other PDSI institutes and international partners like AWE, and has a large team at EMC supporting and enhancing it. PLFS is open sourced with a BSD license on SourceForge. Post-PDSI funding comes from NNSA and industry sources. Moreover, PLFS has spun off half a dozen or more papers, partnered on research with multiple schools and vendors, and has projects to transparently 1) distribute metadata over independent metadata servers, 2) exploit drastically non-POSIX Hadoop storage for HPC POSIX applications, 3) compress checkpoints on the fly, 4) batch delayed writes for write speed, 5) compress read-back indexes and parallelize their redistribution, 6) double-buffer writes in NAND Flash storage to decouple host blocking during checkpoint from disk write time in the storage system, and 7) pack small files into a smaller number of bigger containers. There are two large-scale open source Linux software projects that PDSI significantly incubated, though neither was initiated in PDSI. These are 1) Ceph, a UCSC parallel object storage research project that has continued to be a vehicle for research and has become a released part of Linux, and 2) Parallel NFS (pNFS), a portion of the IETF's NFSv4.1 that brings the core data parallelism found in Lustre, PanFS, PVFS, and Ceph to the industry-standard NFS, with released code in Linux 3.0 and its vendor offerings, with products from NetApp, EMC, BlueArc, and RedHat. Both are fundamentally supported and advanced by vendor companies now, but were critically transferred from research demonstration to viable product with funding, in part, from PDSI. At this point Lustre remains the primary path to scalable IO in Exascale systems, but both Ceph and pNFS are viable alternatives with different fundamental advantages. Finally, research community building was a big success for PDSI. Through the HECFSIO workshops and the HECURA project with NSF, PDSI stimulated and helped to steer leveraged funding of over $25M. Through the Petascale (now Parallel) Data Storage Workshop series, www.pdsw.org, colocated with SCxy each year, PDSI created and incubated five offerings of this high-attendance workshop. The workshop has gone on without PDSI support with two more highly successful workshops, rewriting its organizational structure to be community managed. More than 70 peer-reviewed papers have been presented at PDSW workshops.
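
    The decoupling idea behind PLFS can be illustrated with a toy sketch: each writer appends to a private log and records an index entry mapping logical offsets to log positions, so the logical file is materialized only on read. This is illustrative only; real PLFS also resolves overlapping writes (e.g. by timestamps) and sits below MPI-IO or FUSE:

      import json, os

      def plfs_write(container, rank, logical_offset, data):
          """Append data to this rank's private log and record an index entry."""
          os.makedirs(container, exist_ok=True)
          with open(os.path.join(container, f"data.{rank}"), "ab") as log:
              pos = log.seek(0, 2)                 # physical offset in the log
              log.write(data)
          with open(os.path.join(container, f"index.{rank}"), "a") as idx:
              idx.write(json.dumps([logical_offset, len(data), pos]) + "\n")

      def plfs_read(container, logical_size):
          """Materialize the logical file by replaying every rank's index."""
          out = bytearray(logical_size)
          for name in os.listdir(container):
              if not name.startswith("index."):
                  continue
              rank = name.split(".", 1)[1]
              with open(os.path.join(container, name)) as idx, \
                   open(os.path.join(container, f"data.{rank}"), "rb") as log:
                  for line in idx:
                      off, length, pos = json.loads(line)
                      log.seek(pos)
                      out[off:off + length] = log.read(length)
          return bytes(out)

    Writers never contend on a shared file, which is where the benchmark speedups quoted above come from.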

  4. User-composable Electronic Health Record Improves Efficiency of Clinician Data Viewing for Patient Case Appraisal: A Mixed-Methods Study

    PubMed Central

    Senathirajah, Yalini; Kaufman, David; Bakken, Suzanne

    2016-01-01

    Background: Challenges in the design of electronic health records (EHRs) include designing usable systems that must meet the complex, rapidly changing, and high-stakes information needs of clinicians. The ability to move and assemble elements together on the same page has significant human-computer interaction (HCI) and efficiency advantages, and can mitigate the problems of negotiating multiple fixed screens and the associated cognitive burdens. Objective: We compare MedWISE—a novel EHR that supports user-composable displays—with a conventional EHR in terms of the number of repeat views of data elements for patient case appraisal. Design and Methods: The study used mixed methods to examine clinical data viewing in four patient cases, comparing an experimental user-composable EHR with a conventional EHR for case appraisal. Eleven clinicians used the user-composable EHR in a case appraisal task in the laboratory setting. This was compared with log file analysis of the same patient cases in the conventional EHR. We investigated the number of repeat views of the same clinical information during a session and across these two contexts, and compared them using Fisher's exact test. Results: There was a significant difference (p<.0001) in the proportion of cases with repeat data element viewing between the user-composable EHR (14.6 percent) and the conventional EHR (72.6 percent). Discussion and Conclusion: Users of conventional EHRs repeatedly viewed the same information elements in the same session, as revealed by log files. Our findings are consistent with the hypothesis that conventional systems require the user to view many screens and remember information between screens, causing the user to forget information and to have to access it a second time. Other mechanisms (such as reduction in navigation over a population of users due to interface sharing, and information selection) may also contribute to increased efficiency in the experimental system. Systems that take a composable approach, enabling the user to gather any desired information elements together on the same screen, may confer cognitive support benefits that increase productive use of systems by reducing fragmented information. By reducing cognitive overload, they can also enhance the user experience. PMID:27195306

  5. Efficacy of 3D conforming nickel titanium rotary instruments in eliminating canal wall bacteria from oval-shaped root canals.

    PubMed

    Bortoluzzi, Eduardo A; Carlon, Daniel; Meghil, Mohamed M; El-Awady, Ahmed R; Niu, Lina; Bergeron, Brian E; Susin, Lisiane; Cutler, Christopher W; Pashley, David H; Tay, Franklin R

    2015-05-01

    To evaluate the effectiveness of TRUShape® 3D Conforming Files, compared with Twisted Files, in reducing bacterial load from root canal walls, in the presence or absence of irrigant agitation. Extracted human premolars with single oval-shaped canals were infected with Enterococcus faecalis. Teeth in Group I (N=10; NaOCl and QMix® 2in1 as respective initial and final irrigants) were subdivided into 4 subgroups: (A) TRUShape® instrumentation without irrigant agitation; (B) TRUShape® instrumentation with sonic irrigant agitation; (C) Twisted Files without irrigant agitation; (D) Twisted Files with sonic irrigant agitation. To remove a confounding factor (antimicrobial irrigants), teeth in Group II (N=10) were irrigated with sterile saline, using the same subgroup designations. Specimens before and after chemomechanical débridement were cultured for quantification of colony-forming units (CFUs). Data from each group were analyzed separately using two-factor ANOVA and Holm-Sidak multiple comparison (α=0.05). Canal wall bacteria were qualitatively examined using scanning electron microscopy (SEM) and light microscopy of Taylor-modified Brown and Brenn-stained demineralised sections. CFUs from subgroups in Group I were not significantly different (P=0.935). For Group II, both file type (P<0.001) and irrigant agitation (P<0.001) significantly affected the log-reduction in CFU concentrations. The interaction of these two factors was not significant (P=0.601). Although SEM showed reduced canal wall bacteria, bacteria were present within dentinal tubules after rotary instrumentation, as revealed by light microscopy of longitudinal root sections. TRUShape® files removed significantly more canal wall bacteria than Twisted Files when used without an antibacterial irrigant; the latter is required to decontaminate dentinal tubules. Root canal disinfection should not be focused only on a mechanistic approach. Rather, the rational choice of a rotary instrumentation system should be combined with the use of well-tested antimicrobial irrigants and delivery/agitation techniques to establish a clinically realistic chemomechanical débridement protocol. Published by Elsevier Ltd.

  6. Toward a Real-Time (Day) Dreamcatcher: Sensor-Free Detection of Mind Wandering during Online Reading

    ERIC Educational Resources Information Center

    Mills, Caitlin; D'Mello, Sidney

    2015-01-01

    This paper reports the results from a sensor-free detector of mind wandering during an online reading task. Features consisted of reading behaviors (e.g., reading time) and textual features (e.g., level of difficulty) extracted from self-paced reading log files. Supervised machine learning was applied to two datasets in order to predict if…

  7. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    ERIC Educational Resources Information Center

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  8. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  9. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms a crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card, as text files, are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
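
    The off-line processing step described above can be sketched as follows; the CSV column layout, sampling rate, and cutoff frequency are assumptions, and the zero-phase Butterworth stands in for the channel-frequency-class filters that the impact-test standards specify:

      import numpy as np
      from scipy.signal import butter, filtfilt

      def load_crash_pulse(path):
          """Load the logged text file: two comma-separated columns, time [s] and accel [g]."""
          t, a = np.loadtxt(path, delimiter=",", unpack=True)
          return t, a

      def cfc_like_filter(accel, fs_hz, cutoff_hz=100.0, order=2):
          """Phaseless low-pass: Butterworth applied forward and backward."""
          b, a = butter(order, cutoff_hz / (fs_hz / 2.0))
          return filtfilt(b, a, accel)     # zero phase distortion of the crash pulse

    Zero-phase filtering matters here because a phase lag would shift the crash pulse in time and distort the comparison with simulation.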

  10. XRootD popularity on Hadoop clusters

    NASA Astrophysics Data System (ADS)

    Meoni, Marco; Boccali, Tommaso; Magini, Nicolò; Menichetti, Luca; Giordano, Domenico; CMS Collaboration

    2017-10-01

    Performance data and metadata of the computing operations at the CMS experiment are collected through a distributed monitoring infrastructure, currently relying on a traditional Oracle database system. This paper shows how to harness Big Data architectures in order to improve the throughput and the efficiency of such monitoring. A large set of operational data - user activities, job submissions, resources, file transfers, site efficiencies, software releases, network traffic, machine logs - is being injected into a readily available Hadoop cluster via several data streamers. The collected metadata is further organized by running fast arbitrary queries; this offers the ability to test several MapReduce-based frameworks and measure the system speed-up when compared to the original database infrastructure. By leveraging a quality Hadoop data store and enabling an analytics framework on top, it is possible to design a mining platform to predict dataset popularity and discover patterns and correlations.
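
    A popularity query of the kind described might look like the following PySpark sketch; the HDFS path and the record fields ('dataset', 'user', 'read_bytes') are hypothetical placeholders for the actual monitoring schema:

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("xrootd-popularity").getOrCreate()
      accesses = spark.read.json("hdfs:///monitoring/xrootd/2017/*.json")

      # Aggregate per-dataset access counts, distinct readers, and bytes read
      popularity = (accesses
                    .groupBy("dataset")
                    .agg(F.count("*").alias("n_accesses"),
                         F.countDistinct("user").alias("n_users"),
                         F.sum("read_bytes").alias("bytes_read"))
                    .orderBy(F.desc("n_accesses")))
      popularity.show(20)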

  11. Application of aerial photography to water-related programs in Michigan

    NASA Technical Reports Server (NTRS)

    Enslin, W. R.; Hill-Rowley, R.; Tilmann, S. E.

    1977-01-01

    Aerial photography and information system technology were used to generate information required for the effective operation of three water-related programs in Michigan. Potential mosquito breeding sites were identified from specially acquired low altitude 70 mm color photography for the city of Lansing; the inventory identified 35% more surface water areas than indicated on existing field maps. A comprehensive inventory of surface water sources and potential access sites was prepared to assist fire departments in Antrim County with fire truck water-recharge operations. Remotely-sensed land cover/use data for Windsor Township, Eaton County, were integrated with other resource data into a computer-based information system for regional water quality studies. Eleven thematic maps focusing on landscape features affecting non-point water pollution and waste disposal were generated from analyses of a four-hectare grid-based data file containing land cover/use, soils, topographic and geologic (well-log) data.

  12. Application of aerial photography to water-related programs in Michigan

    NASA Technical Reports Server (NTRS)

    Enslin, W. R.; Hill-Rowley, R.; Tilmann, S. E.

    1977-01-01

    The paper describes the use of aerial photography and information system technology in the provision of information required for the effective operation of three water-related programs in Michigan. Potential mosquito breeding sites were identified from specially acquired low altitude 70 mm color photography for the City of Lansing Vector Control Area. A comprehensive inventory of surface water sources and potential access sites was prepared to assist fire departments in Antrim County with fire truck water-recharge operations. Remotely-sensed land cover/use data for Windsor Township, Eaton County, were integrated with other resource data into a computer-based information system for regional water quality studies. Eleven thematic maps specifically focused on landscape features affecting non-point water pollution and waste disposal were generated from analyses of a four-hectare grid-based data file containing land cover/use, soils, topographic and geologic (well-log) data.

  13. Major technology issues in surgical data collection.

    PubMed

    Kirschenbaum, I H

    1995-10-01

    Surgical scheduling and data collection is a field that has a long history as well as a bright future. Historically, surgical cases have always involved some amount of data collection. Surgical cases are scheduled and then reviewed. The classic method, that large black surgical log, actually still exists in many hospitals. In fact, there is nothing new about the recording or reporting of surgical cases. If we only needed to record the information and produce a variety of reports on the data, then modern electronic technology would function as a glorified fast index card box--or, in computer database terms, a simple flat file database. But, this is not the future of technology in surgical case management. This article makes the general case for integrating surgical data systems. Instead of reviewing specific software, it essentially addresses the issues of strategic planning related to this important aspect of medical information systems.

  14. Improved grading system for structural logs for log homes

    Treesearch

    D.W. Green; T.M. Gorman; J.W. Evans; J.F. Murphy

    2004-01-01

    Current grading standards for logs used in log home construction use visual criteria to sort logs into either “wall logs” or structural logs (round and sawn round timbers). The conservative nature of this grading system, and the grouping of stronger and weaker species for marketing purposes, probably results in the specification of logs with larger diameter than would...

  15. Archive of digital boomer subbottom profile data collected in the Atlantic Ocean offshore northeast Florida during USGS cruises 03FGS01 and 03FGS02 in September and October of 2003

    USGS Publications Warehouse

    Calderon, Karynna; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2012-01-01

    In September and October of 2003, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey, conducted geophysical surveys of the Atlantic Ocean offshore northeast Florida from St. Augustine, Florida, to the Florida-Georgia border. This report serves as an archive of unprocessed digital boomer subbottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03FGS01 tells us the data were collected in 2003 as part of cooperative work with the Florida Geological Survey (FGS) and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by hydrophone receivers, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 seconds) and recorded for specific intervals of time (for example, 100 milliseconds). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Refer to the handwritten FACS operation log (PDF, 442 KB) for diagrams and descriptions of acquisition geometry, which varied throughout the cruises. Table 1 displays a summary of acquisition parameters. See the digital FACS equipment logs (PDF, 9-13 KB each) for details about the acquisition equipment used. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y (Barry and others, 1975) format (rev. 0), except for the first 3,200 bytes of the card image header, which are stored in ASCII format instead of the standard EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are Graphics Interchange Format (GIF) images that were filtered and gained using SU software. 
Refer to the Software page for details about the processing and links to example SU processing scripts and USGS software for viewing the SEG Y files (Zihlman, 1992).
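
    The line-naming convention quoted above ('yye##a') is regular enough to parse mechanically when batch-processing the archived SEG Y files; a small sketch (the century pivot for the two-digit year is an assumption):

      import re

      LINE_NAME = re.compile(
          r"^(?P<yy>\d{2})(?P<equip>[a-z])(?P<track>\d{2})(?P<section>[a-z]?)$")

      def parse_line_name(name):
          m = LINE_NAME.match(name)
          if m is None:
              raise ValueError(f"not a valid line name: {name!r}")
          yy = int(m["yy"])
          return {"year": 2000 + yy if yy < 50 else 1900 + yy,  # pivot assumed
                  "equipment": m["equip"],      # e.g. 'b' for boomer
                  "track": int(m["track"]),
                  "section": m["section"] or None}

      # e.g. {'year': 2003, 'equipment': 'b', 'track': 7, 'section': 'a'}
      print(parse_line_name("03b07a"))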

  16. LAS - LAND ANALYSIS SYSTEM, VERSION 5.0

    NASA Technical Reports Server (NTRS)

    Pease, P. B.

    1994-01-01

    The Land Analysis System (LAS) is an image analysis system designed to manipulate and analyze digital data in raster format and provide the user with a wide spectrum of functions and statistical tools for analysis. LAS offers these features under VMS with optional image display capabilities for IVAS and other display devices as well as the X-Windows environment. LAS provides a flexible framework for algorithm development as well as for the processing and analysis of image data. Users may choose between mouse-driven commands or the traditional command line input mode. LAS functions include supervised and unsupervised image classification, film product generation, geometric registration, image repair, radiometric correction and image statistical analysis. Data files accepted by LAS include formats such as Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Advanced Very High Resolution Radiometer (AVHRR). The enhanced geometric registration package now includes both image-to-image and map-to-map transformations. The over 200 LAS functions fall into image processing scenario categories which include: arithmetic and logical functions, data transformations, Fourier transforms, geometric registration, hard copy output, image restoration, intensity transformation, multispectral and statistical analysis, file transfer, tape profiling and file management among others. Internal improvements to the LAS code have eliminated the VAX VMS dependencies and improved overall system performance. The maximum LAS image size has been increased to 20,000 lines by 20,000 samples with a maximum of 256 bands per image. The catalog management system used in earlier versions of LAS has been replaced by a more streamlined and maintenance-free method of file management. This system is not dependent on VAX/VMS and relies on file naming conventions alone to allow the use of identical LAS file names on different operating systems. While the LAS code has been improved, the original capabilities of the system have been preserved. These include maintaining associated image history, session logging, and batch, asynchronous and interactive modes of operation. The LAS application programs are integrated under version 4.1 of an interface called the Transportable Applications Executive (TAE). TAE 4.1 has four modes of user interaction: menu, direct command, tutor (or help), and dynamic tutor. In addition TAE 4.1 allows the operation of LAS functions using mouse-driven commands under the TAE-Facelift environment provided with TAE 4.1. These modes of operation allow users, from the beginner to the expert, to exercise specific application options. LAS is written in C-language and FORTRAN 77 for use with DEC VAX computers running VMS with approximately 16Mb of physical memory. This program runs under TAE 4.1. Since TAE 4.1 is not a current version of TAE, TAE 4.1 is included within the LAS distribution. Approximately 130,000 blocks (65Mb) of disk storage space are necessary to store the source code and files generated by the installation procedure for LAS and 44,000 blocks (22Mb) of disk storage space are necessary for TAE 4.1 installation. The only other dependencies for LAS are the subroutine libraries for the specific display device(s) that will be used with LAS/DMS (e.g. X-Windows and/or IVAS). The standard distribution medium for LAS is a set of two 9-track 6250 BPI magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format.
This program was developed in 1986 and last updated in 1992.

  17. Scalable Trust of Next-Generation Management (STRONGMAN)

    DTIC Science & Technology

    2004-10-01

    remote logins might be policy controlled to allow only strongly encrypted IPSec tunnels to log in remotely, to access selected files, etc. The...and Angelos D. Keromytis. Drop-in Security for Distributed and Portable Computing Elements. Emerald Journal of Internet Research. Electronic...Security and Privacy, pp. 17-31, May 1999. [2] S. M. Bellovin. Distributed Firewalls. ;login: magazine, special issue on security, November 1999. [3] M

  18. 77 FR 66608 - New England Hydropower Company, LLC; Notice of Preliminary Permit Application Accepted for Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Spillway Dike with an 8-foot-long stop-log slot; (2) an existing 31-foot-long, 42-inch-diameter low level penstock; (3) an existing 0.13 acre impoundment with a normal maximum water surface elevation of 66.3 feet... transmission line connected to the NSTAR regional grid. The project would have an estimated average annual...

  19. 76 FR 7838 - Claverack Creek, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...

  20. Effective HTCondor-based monitoring system for CMS

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B. P.; Da Silva, J. M.; Hernandez, J.; Khan, F. A.; Letts, J.; Mascheroni, M.; Mason, D. A.; Perez-Calero Yzquierdo, A.; Vlimant, J.-R.; for the CMS Consortium

    2017-10-01

    The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch and pilot-based Grid provisioning systems, respectively. Given the scale of the global queue in CMS, the operators found it increasingly difficult to monitor the pool to find problems and fix them. The operators had to rely on several different web pages, with several different levels of information, and sift tirelessly through log files in order to monitor the pool completely. Therefore, coming up with a suitable monitoring system was one of the crucial items before the beginning of the LHC Run 2 in order to ensure early detection of issues and to give a good overview of the whole pool. Our new monitoring page utilizes the HTCondor ClassAd information to provide a complete picture of the whole submission infrastructure in CMS. The monitoring page includes useful information from HTCondor schedulers, central managers, the glideinWMS frontend, and factories. It also incorporates information about users and tasks making it easy for operators to provide support and debug issues.
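
    The ClassAd information such a monitoring page is built on can be pulled with the HTCondor Python bindings; a minimal sketch (the collector host is hypothetical, and the projected attribute names are common schedd attributes that may vary per pool configuration):

      import htcondor

      # Query the pool collector for the schedd ads the monitoring is based on
      coll = htcondor.Collector("cms-collector.example.org")   # hypothetical host
      schedd_ads = coll.query(htcondor.AdTypes.Schedd,
                              projection=["Name", "TotalRunningJobs", "TotalIdleJobs"])
      for ad in schedd_ads:
          print(ad.get("Name"),
                ad.get("TotalRunningJobs", 0),
                ad.get("TotalIdleJobs", 0))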

  1. SU-E-T-406: Use of TrueBeam Developer Mode and API to Increase the Efficiency and Accuracy of Commissioning Measurements for the Varian EDGE Stereotactic Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, S; Gulam, M; Song, K

    2014-06-01

    Purpose: The Varian EDGE machine is a new stereotactic platform, combining Calypso and VisionRT localization systems with a stereotactic linac. The system includes TrueBeam DeveloperMode, making possible the use of XML scripting for automation of linac-related tasks. This study details the use of DeveloperMode to automate commissioning tasks for the Varian EDGE, thereby improving efficiency and measurement consistency. Methods: XML scripting was used for various commissioning tasks, including couch model verification, beam scanning, and isocenter verification. For couch measurements, point measurements were acquired for several field sizes (2×2, 4×4, 10×10 cm²) at 42 gantry angles for two couch models. Measurements were acquired with variations in couch position (rails in/out, couch shifted in each of the motion axes) and compared to treatment planning system (TPS)-calculated values, which were logged automatically through the advanced planning interface (API) scripting functionality. For beam scanning, XML scripts were used to create custom MLC apertures. For isocenter verification, XML scripts were used to automate various Winston-Lutz-type tests. Results: For couch measurements, the time required for each set of angles was approximately 9 minutes. Without scripting, each set required approximately 12 minutes. Automated measurements required only one physicist, while manual measurements required at least two physicists to handle linac positions/beams and data recording. MLC apertures were generated outside of the TPS, and with the .xml file format, double-checking without use of the TPS/operator console was possible. Similar time efficiency gains were found for isocenter verification measurements. Conclusion: The use of XML scripting in TrueBeam DeveloperMode allows for efficient and accurate data acquisition during commissioning. The efficiency improvement is most pronounced for iterative measurements, exemplified by the time savings for couch modeling measurements (approximately 10 hours). The scripting also allowed for creation of the files in advance without requiring access to the TPS. The API scripting functionality enabled efficient creation/mining of TPS data. Finally, automation reduces the potential for human error in entering linac values at the machine console, and the script provides a log of measurements acquired for each session. This research was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.

  2. DMFS: A Data Migration File System for NetBSD

    NASA Technical Reports Server (NTRS)

    Studenmund, William

    1999-01-01

    I have recently developed dmfs, a Data Migration File System, for NetBSD. This file system is based on the overlay file system, which is discussed in a separate paper, and provides kernel support for the data migration system being developed by my research group here at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. Our data migration system provides archiving and file migration services. System utilities scan the dmfs file system for recently modified files and archive them to two separate tape stores. Once a file has been doubly archived, files larger than a specified size will be truncated to that size, potentially freeing up large amounts of the underlying file store. Some sites will choose to retain none of the file (deleting its contents entirely from the file system) while others may choose to retain a portion, for instance a preamble describing the remainder of the file. The dmfs layer coordinates access to the file, retaining user-perceived access and modification times and file size, and restricting access to partially migrated files to the portion actually resident. When a user process attempts to read from the non-resident portion of a file, it is blocked and the dmfs layer sends a request to a system daemon to restore the file. As more of the file becomes resident, the user process is permitted to begin accessing the now-resident portions of the file. For simplicity, our data migration system divides a file into two portions, a resident portion followed by an optional non-resident portion. Also, a file is in one of three states: fully resident, fully resident and archived, and (partially) non-resident and archived. For a file which is only partially resident, any attempt to write or truncate the file, or to read a non-resident portion, will trigger a file restoration. Truncations and writes are blocked until the file is fully restored so that a restoration which only partially succeeds does not leave the file in an indeterminate state with portions existing only on tape and other portions only in the disk file system. We chose layered file system technology as it permits us to focus on the data migration functionality, and permits end system administrators to choose the underlying file store technology. We chose the overlay layered file system instead of the null layer for two reasons: first, to permit our layer to better preserve metadata integrity, and second, to prevent even root processes from accessing migrated files. This is achieved as the underlying file store becomes inaccessible once the dmfs layer is mounted. We are quite pleased with how the layered file system has turned out. Of the 45 vnode operations in NetBSD, 20 (forty-four percent) required no intervention by our file layer - they are passed directly to the underlying file store. Of the twenty-five we do intercept, nine (such as vop_create()) are intercepted only to ensure metadata integrity. Most of the functionality was concentrated in five operations: vop_read, vop_write, vop_getattr, vop_setattr, and vop_fcntl. The first four are the core operations for controlling access to migrated files and preserving the user experience. vop_fcntl, a call generated for a certain class of fcntl codes, provides the command channel used by privileged user programs to communicate with the dmfs layer.
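
    The residency logic described above can be summarized as a small state model; this is a toy illustration of the behavior, not the NetBSD kernel implementation:

      from enum import Enum, auto

      class State(Enum):
          FULLY_RESIDENT = auto()           # nothing archived yet
          RESIDENT_AND_ARCHIVED = auto()    # doubly archived, still fully on disk
          PARTIALLY_NONRESIDENT = auto()    # archived and truncated to a preamble

      class MigratedFile:
          def __init__(self, size, resident_bytes, restore):
              self.size = size
              self.resident = resident_bytes    # length of the resident prefix
              self.restore = restore            # callable: blocks, returns new resident length
              self.state = (State.PARTIALLY_NONRESIDENT if resident_bytes < size
                            else State.RESIDENT_AND_ARCHIVED)

          def read(self, offset, length):
              # reads into the non-resident tail block until enough is restored
              while (self.state is State.PARTIALLY_NONRESIDENT
                     and offset + length > self.resident):
                  self.resident = self.restore(self)
                  if self.resident >= self.size:
                      self.state = State.RESIDENT_AND_ARCHIVED
              return (offset, min(length, self.size - offset))

          def write(self, offset, data):
              # writes and truncations block until fully restored, so a partial
              # restore never leaves the file split between tape and disk
              if self.state is State.PARTIALLY_NONRESIDENT:
                  self.read(0, self.size)
              self.state = State.FULLY_RESIDENT   # modified: must be re-archived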

  3. Cloud Based Drive Forensic and DDoS Analysis on Seafile as Case Study

    NASA Astrophysics Data System (ADS)

    Bahaweres, R. B.; Santo, N. B.; Ningsih, A. S.

    2017-01-01

    The rapid development of the Internet, due to increasing data rates in both broadband cable networks and 4G wireless mobile, makes it easy for everyone to connect to the internet. Storage as a Service (StaaS) is increasingly popular, and many users want to store their data in one place so that whenever they need it they can easily access it anywhere, any place, and anytime in the cloud. This makes the service vulnerable to being used to commit a crime or to Denial of Service (DoS) attacks on cloud storage services. Criminals can use cloud storage services to store, upload, and download illegal files or documents. In this study, we implement a private cloud storage using Seafile on a Raspberry Pi and perform simulations in Local Area Network and Wi-Fi environments, analyzing them forensically to show that a criminal act can be traced and proved forensically. We also identify, collect, and analyze the artifacts of the server and client, such as the registry of the desktop client, the file system, the Seafile logs, the browser cache, and the database, for forensic purposes.

  4. Accessing files in an Internet: The Jade file system

    NASA Technical Reports Server (NTRS)

    Peterson, Larry L.; Rao, Herman C.

    1991-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.

  5. Accessing files in an internet - The Jade file system

    NASA Technical Reports Server (NTRS)

    Rao, Herman C.; Peterson, Larry L.

    1993-01-01

    Jade is a new distributed file system that provides a uniform way to name and access files in an internet environment. It makes two important contributions. First, Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Jade is designed under the restriction that the underlying file system may not be modified. Second, rather than providing a global name space, Jade permits each user to define a private name space. These private name spaces support two novel features: they allow multiple file systems to be mounted under one directory, and they allow one logical name space to mount other logical name spaces. A prototype of the Jade File System was implemented on Sun Workstations running Unix. It consists of interfaces to the Unix file system, the Sun Network File System, the Andrew File System, and FTP. This paper motivates Jade's design, highlights several aspects of its implementation, and illustrates applications that can take advantage of its features.
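
    The private name space described in both Jade records can be sketched as a per-user mount table resolved by longest-prefix match, which is what lets several file systems hang under one directory; the backend names below are placeholders for the NFS/AFS/FTP-style access protocols:

      class NameSpace:
          def __init__(self):
              self.mounts = {}            # logical prefix -> (backend, backend root)

          def mount(self, prefix, backend, backend_root):
              self.mounts[prefix.rstrip("/")] = (backend, backend_root)

          def resolve(self, logical_path):
              # longest matching prefix wins, so nested mounts shadow their parent
              best = max((p for p in self.mounts if logical_path.startswith(p)),
                         key=len, default=None)
              if best is None:
                  raise FileNotFoundError(logical_path)
              backend, root = self.mounts[best]
              return backend, root + logical_path[len(best):]

      ns = NameSpace()
      ns.mount("/jade/papers", "ftp", "ftp://archive.example.org/pub")
      ns.mount("/jade/papers/drafts", "nfs", "/export/drafts")  # second FS, same directory
      print(ns.resolve("/jade/papers/drafts/jade.tex"))  # ('nfs', '/export/drafts/jade.tex')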

  6. No3CoGP: non-conserved and conserved coexpressed gene pairs.

    PubMed

    Mal, Chittabrata; Aftabuddin, Md; Kundu, Sudip

    2014-12-08

    Analyzing the microarray data of different conditions, one can identify the conserved and condition-specific genes and gene modules, and thus can infer the underlying cellular activities. All the available tools based on Bioconductor and R packages differ in how they extract differential coexpression and at what level they operate. There is a need for a user-friendly, flexible tool which can start analysis using raw or preprocessed microarray data and can report different levels of useful information. We present a GUI software, No3CoGP: Non-Conserved and Conserved Coexpressed Gene Pairs, which takes Affymetrix microarray data (.CEL files or log2-normalized .txt files) along with an annotation file (.csv file), Chip Definition File (CDF file), and probe file as inputs, and utilizes the concept of a network density cut-off and Fisher's z-test to extract biologically relevant information. It can identify four possible types of gene pairs based on their coexpression relationships: (i) gene pairs showing coexpression in one condition but not in the other, (ii) gene pairs which are positively coexpressed in one condition but negatively coexpressed in the other condition, and gene pairs which are (iii) positively or (iv) negatively coexpressed in both conditions. Further, it can generate modules of coexpressed genes. The easy-to-use GUI enables researchers without knowledge of the R language to use No3CoGP. Utilization of one or more CPU cores, depending on availability, speeds up the program. The output files stored in the respective directories under the user-defined project offer the researchers a way to unravel condition-specific functionalities of genes, gene sets, or modules.
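
    The statistical core named above, Fisher's z-test for whether a gene pair's correlation differs between two conditions, is compact enough to sketch; the sample correlations, sample sizes, and any significance threshold are the analyst's inputs:

      import numpy as np
      from scipy.stats import norm

      def fisher_z_test(r1, n1, r2, n2):
          """Two-sided test for a difference between two Pearson correlations."""
          z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher transform
          se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
          z = (z1 - z2) / se
          p = 2.0 * norm.sf(abs(z))                      # two-sided p-value
          return z, p

      # Example: strongly coexpressed in condition A, absent in condition B
      print(fisher_z_test(0.9, 30, 0.1, 30))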

  7. Technoeconomic analysis of conventional logging systems operating from stump to landing

    Treesearch

    Raymond L. Sarles; William G. Luppold

    1986-01-01

    Analyzes technical and economic factors for six conventional logging systems suitable for operation in eastern forests. Discusses financial risks and business implications for loggers investing in high-production, state-of-the-art logging systems. Provides logging contractors with information useful as a preliminary guide for selection of equipment and systems....

  8. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
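
    The container structure described above can be sketched with nothing more than a ZIP writer; the manifest layout and format URIs below follow the OMEX specification as we understand it and should be checked against the current spec before use:

      import zipfile

      MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
      <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
        <content location="./manifest.xml"
                 format="http://identifiers.org/combine.specifications/omex-manifest"/>
        <content location="./model.xml"
                 format="http://identifiers.org/combine.specifications/sbml"/>
      </omexManifest>
      """

      # Write a minimal archive: the manifest plus one SBML model assumed to
      # exist as model.xml in the working directory
      with zipfile.ZipFile("experiment.omex", "w") as omex:
          omex.writestr("manifest.xml", MANIFEST)
          omex.write("model.xml")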

  9. The Jade File System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Rao, Herman Chung-Hwa

    1991-01-01

    File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems, such as Unix, support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. Four characteristics of scalability were recognized: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy because an internet is made up of a collection of independent organizations. Finally, heterogeneity is the nature of an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. The Jade File System, which provides a uniform way to name and access files in the internet environment, is presented. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: (1) it allows multiple file systems to be mounted under one directory; and (2) it permits one logical name space to mount other logical name spaces. A prototype of Jade was implemented to examine and validate its design. The prototype consists of interfaces to the Unix File System, the Sun Network File System, and the File Transfer Protocol.

  10. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.

  11. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
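
    The CSV protocol idea described in these two records can be sketched as a tiny scheduler; the column names ('time_min', 'parameter', 'setpoint') and the send_setpoint callable are hypothetical, since the actual file layout is defined by the Bioflo control software:

      import csv, time

      def run_protocol(path, send_setpoint):
          """Replay a CSV protocol: each row schedules one setpoint change."""
          with open(path, newline="") as f:
              steps = sorted(csv.DictReader(f), key=lambda r: float(r["time_min"]))
          start = time.monotonic()
          for step in steps:
              wait = float(step["time_min"]) * 60 - (time.monotonic() - start)
              if wait > 0:
                  time.sleep(wait)        # hold until the step's scheduled time
              send_setpoint(step["parameter"], float(step["setpoint"]))

      # e.g. a row "30,agitation,200" would set agitation to 200 rpm at t = 30 min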

  12. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and any other reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  13. Please Move Inactive Files Off the /projects File System | High-Performance Computing | NREL

    Science.gov Websites

    January 11, 2018. The /projects file system is a shared resource. This has created a space crunch: the file system is now about 90% full and we need your help

  14. Geonucleus, the freeware application for managing geological mapping data in GIS

    NASA Astrophysics Data System (ADS)

    Albert, Gáspár

    2016-04-01

    Geological mapping is the most traditional way of collecting information on deposits and rocks. The traditional documentation technique has been refined by generations of geologists. These traditions were implemented in Geonucleus to create a tool for precise data recording after fieldwork, while also giving the freedom to ponder the details of the observation. In 2012, a general XML-based data structure was worked out for storing field observations for the Geological Institute of Hungary (Albert et al. 2012). This structure was implemented in the desktop version of Geonucleus, which creates a database of the recorded data on the client computer. The application saves the complete database in one file, which can be loaded into a GIS. The observations can also be saved in simple text format, but primarily KML (Keyhole Markup Language) is supported. This way, the observations are visualized in comprehensible forms (e.g., on a 3D surface model with satellite photos in Google Earth). If the KML is visualized directly in Google Earth, an info-bubble appears when clicking on a pinpoint. It displays all the metadata (e.g., index, coordinates, date, logger name, etc.), the descriptions, and the photos of the observed site. If a more general GIS application is the aim (e.g., Global Mapper or QGIS), the file can be saved in a different format, but still in a KML structure. The simple text format is recommended if the observations are to be imported into a user-defined relational database system (RDB). A report text type is also available if a detailed description of one or more observed sites is needed. Importing waypoint GPX files can speed up the logging. The code was written in VisualBasic.Net. The app is freely accessible from the geonucleus.elte.hu site and can be installed on any system that has the .Net Framework 4.0 or higher. The software is bilingual (English and Hungarian), and the app is designed for general geological mapping purposes (e.g., quick logging of field trips). The layout of the GUI has three components: 1) a metadata area, 2) a general description area with unlimited storage capacity, and 3) switchable panels for observations, measurements, photos, and notes. The latter includes panels for stratigraphy, structures, fossils, samples, photo uploads, and general notes. Details like the sequence and contact type of layers, the parameters of structures and slickensides, the name and condition of fossils, and the purpose of sampling can also be logged (but are not compulsory). It is also a tool for teaching geological mapping, since the available parameters listed in the app draw attention to the details that are to be observed in the field. Reference: Albert G., Csillag G., Fodor L., Zentai L. 2012: Visualisation of Geological Observations on Web 2.0 Based Maps, in: Zentai, L. and Reyes-Nunez, J. (eds.): Maps for the Future - Children, Education and Internet, Lecture Notes in Geoinformation and Cartography, vol. 5, Springer, pp. 165-178.
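
    KML itself is a documented open format, so the info-bubble behavior described above can be sketched directly. The following minimal Python example builds a placemark whose description block carries the site metadata; the field names and values are hypothetical, and this is not Geonucleus code.

      def placemark(name, lon, lat, metadata):
          """Build a minimal KML Placemark whose info-bubble lists site metadata."""
          desc = "<br/>".join(f"{k}: {v}" for k, v in metadata.items())
          return (
              "<Placemark>"
              f"<name>{name}</name>"
              f"<description><![CDATA[{desc}]]></description>"
              # KML coordinates are longitude,latitude,altitude
              f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
              "</Placemark>"
          )

      doc = ('<?xml version="1.0" encoding="UTF-8"?>'
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
             + placemark("Site 12", 19.06, 47.47,
                         {"logger": "G. Albert", "date": "2016-04-01"})
             + "</Document></kml>")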

  15. An integrated 3D log processing optimization system for small sawmills in central Appalachia

    Treesearch

    Wenshu Lin; Jingxin Wang

    2013-01-01

    An integrated 3D log processing optimization system was developed to perform 3D log generation, opening face determination, headrig log sawing simulation, flitch edging and trimming simulation, cant resawing, and lumber grading. A circular cross-section model, together with 3D modeling techniques, was used to reconstruct 3D virtual logs. Internal log defects (knots)...

  16. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  17. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  18. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  19. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  20. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  1. 105-KE Isolation Barrier Leak Rate Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCracken, K.J.

    1995-06-14

    This Acceptance Test Report (ATR) contains the completed and signed Acceptance Test Procedure (ATP) for the 105-KE Isolation Barrier Leak Rate Test. The Test Engineer's log, the completed sections of the ATP in the Appendix for Repeat Testing (Appendix K), the approved WHC J-7s (Appendix H), the data logger files (Appendices T and U), and the post-test calibration checks (Appendix V) are included.

  2. Gigabit Network Communications Research

    DTIC Science & Technology

    1992-12-31

    additional BPF channels, raw bytesync support for video codecs, and others. All source file modifications were logged with RCS. Source and object trees were... "Requests for Comments" (RFCs). 20 RFCs were published this quarter: RFC 1366: Gerich, E., "Guidelines for Management of IP Address Space", Merit, October 1992. RFC 1367: Topolcic, C., "Schedule for IP Address Space Management Guidelines", CNRI, October 1992. RFC 1368: McMaster, D. (Synoptics Communications, Inc.), K

  3. A rule-based approach for the correlation of alarms to support Disaster and Emergency Management

    NASA Astrophysics Data System (ADS)

    Gloria, M.; Minei, G.; Lersi, V.; Pasquariello, D.; Monti, C.; Saitto, A.

    2009-04-01

    Key words: Simple Event Correlator, Agent Platform, Ontology, Semantic Web, Distributed Systems, Emergency Management. The importance of recognizing the type of an emergency in order to control critical situations for the security of citizens has long been recognized, and this aspect is very important for the proper management of a hazardous event. In this work we present a solution for recognizing emergency types adopted by an Italian research project called CI6 (Centro Integrato per Servizi di Emergenza Innovativi). In our approach, CI6 receives alarms from citizens or from people involved in the response (for example, police or operators of the 112 service). CI6 represents each alarm by a set of information, including a text that describes it, entered when the user reports the danger, and a pair of coordinates for its location. The system analyzes the text and automatically infers the type of emergency by means of a set of parsing and inference rules applied by an independent module: an event correlator, driven by event logs, called the Simple Event Correlator (SEC). SEC, integrated in the CI6 platform, is an open-source, platform-independent event correlation tool. SEC accepts input both from files and from standard input, which makes it flexible: it can be attached to any application that is able to write its output to a file stream. The SEC configuration is stored in text files as rules, each rule specifying an event-matching condition, an action list, and optionally a Boolean expression whose truth value decides whether the rule can be applied at a given moment. SEC can produce output events by executing user-specified shell scripts or programs, by writing messages to files, and by various other means. SEC has been successfully applied in domains such as network management, system monitoring, data security, intrusion detection, and log file monitoring and analysis, and it has been used with or integrated into many applications, including CiscoWorks, HP OpenView NNM and Operations, and BMC Patrol. Analysis of the text of an alarm can detect keywords that allow the system to classify the particular event. The inference rules were developed through an analysis of news reports about real emergencies found through web searches. We observed that a kind of emergency is often characterized by several keywords. Keywords are not uniquely associated with a specific emergency; they can be shared by different types of emergencies (for example, the keyword "landslide" can be associated with both the "landslide" and the "flood" emergencies). However, the identification of two or more keywords associated with a particular type of emergency identifies, in most cases, the correct type. So, for example, if the text contains words such as "water", "flood", "overflowing", or "landslide", or other words belonging to the set of defined keywords (or sharing a keyword's root), the system decides that the alarm belongs to a specific typology, in this case the "flood" typology. The system keeps a memory of this information, so if a new alarm is reported and belongs to a typology already identified, it proceeds to compare coordinates. The comparison between the centers of the alarms shows whether they describe an area inscribed in an ideal circle centered on the first alarm, with a radius defined by the typology mentioned above. If this happens, the CI6 system creates an emergency centered on the center of that area, with a typology equal to that of the alarms.
    It follows that an emergency is represented by at least two alarms. Thus, the system suggests to the manager (the CI6 user) the possibility that several alarms concern the same event and makes a classification of this event. It is important to stress that CI6 is a decision-support system; hence this service, too, is limited to providing advice to the user to facilitate his task, leaving him the decision to accept it or not. REFERENCES: SEC (Simple Event Correlator), http://kodu.neti.ee/~risto/sec/. M. Gloria, V. Lersi, G. Minei, D. Pasquariello, C. Monti, A. Saitto, "A Semantic Web Services Platform to support Disaster and Emergency Management", 4th biennial Meeting of the International Environmental Modelling and Software Society (iEMSs), 2008.
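
    A minimal sketch of the two-or-more-keywords classification rule described above, written in Python rather than as an actual SEC rule file; the keyword sets are hypothetical examples, not the CI6 rule base.

      KEYWORDS = {  # hypothetical keyword sets; shared keywords are allowed
          "flood": {"water", "flood", "overflowing", "landslide"},
          "landslide": {"landslide", "mud", "collapse"},
      }

      def classify(alarm_text, threshold=2):
          """Return emergency typologies matching at least `threshold` keywords."""
          words = set(alarm_text.lower().split())
          scores = {t: len(words & kw) for t, kw in KEYWORDS.items()}
          return [t for t, s in scores.items() if s >= threshold]

      print(classify("river water overflowing near the bridge, flood expected"))
      # -> ['flood']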

  4. Highway Safety Information System guidebook for the Minnesota state data files. Volume 1 : SAS file formats

    DOT National Transportation Integrated Search

    2001-02-01

    The Minnesota data system includes the following basic files: Accident data (Accident File, Vehicle File, Occupant File); Roadlog File; Reference Post File; Traffic File; Intersection File; Bridge (Structures) File; and RR Grade Crossing File. For ea...

  5. Long-Term file activity patterns in a UNIX workstation environment

    NASA Technical Reports Server (NTRS)

    Gibson, Timothy J.; Miller, Ethan L.

    1998-01-01

    As mass storage technology becomes more affordable for sites smaller than supercomputer centers, understanding their file access patterns becomes crucial for developing systems to store rarely used data on tertiary storage devices such as tapes and optical disks. This paper presents a new way to collect and analyze file system statistics for UNIX-based file systems. The collection system runs in user-space and requires no modification of the operating system kernel. The statistics package provides details about file system operations at the file level: creations, deletions, modifications, etc. The paper analyzes four months of file system activity on a university file system. The results confirm previously published results gathered from supercomputer file systems, but differ in several important areas. Files in this study were considerably smaller than those at supercomputer centers, and they were accessed less frequently. Additionally, the long-term creation rate on workstation file systems is sufficiently low so that all data more than a day old could be cheaply saved on a mass storage device, allowing the integration of time travel into every file system.

  6. SU-F-T-306: Validation of Mobius 3D and FX for Elekta Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Garcia, M; Calderon, E

    2016-06-15

    Purpose: Log-file-based IMRT and VMAT QA is a system that analyzes treatment log files and uses delivery parameters to compute the dose to the patient/phantom. This system was previously commissioned for Varian machines; the purpose of this work is to describe the process for commissioning Mobius for use with Elekta machines. Methods: Twelve IMRT and VMAT plans (6X) were planned and delivered, and the dose was measured using MapCheck; the results were compared to those computed by Mobius. For 10X and 18X, plans were generated, copied to a phantom, and delivered, and the dose was measured using a single ion chamber. The difference between the measured dose and the computed (Mobius) dose was used to adjust the dynamic leaf gap (DLG) in Mobius to achieve optimal agreement among measurements, Mobius, and treatment plans. Results: For the measured dose comparison, an average of 97.1% of pixels passed the 3%/3 mm gamma criteria using MapCheck, while Mobius computed 96.9% of voxels passing. For 10X, a DLG of -5.5 was determined to achieve optimal results for TPS and measured ion chamber data, with average differences of 0.1% and -1.7%, respectively. For 18X, a DLG of -3 was determined to achieve optimal results from the TPS and measured data, with average differences of -0.7% and -1.4% over a set of IMRT and VMAT plans. The 6X data needed no DLG correction to reach agreement with the TPS and the MapCheck measurements. Conclusion: We have validated, with measurements for IMRT and VMAT cases, the use of Mobius FX with Elekta treatment machines for IMRT and VMAT QA. For 6X, no adjustments to the DLG were required to obtain good results with Mobius, whereas for 10X and 18X the DLG had to be adjusted to obtain optimal agreement with the measured data and our TPS.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blackwell, David D.; Chickering Pace, Cathy; Richards, Maria C.

    The National Geothermal Data System (NGDS) is a Department of Energy funded effort to create a single cataloged source for a variety of geothermal information through a distributed network of databases made available via web services. The NGDS will help identify regions suitable for potential development and further scientific data collection and analysis of geothermal resources as a source for clean, renewable energy. A key NGDS repository or 'node' is located at Southern Methodist University, developed by a consortium made up of:
    • SMU Geothermal Laboratory
    • Siemens Corporate Technology, a division of Siemens Corporation
    • Bureau of Economic Geology at the University of Texas at Austin
    • Cornell Energy Institute, Cornell University
    • Geothermal Resources Council
    • MLKay Technologies
    • Texas Tech University
    • University of North Dakota
    The focus of resources and research encompasses the United States, with particular emphasis on the Gulf Coast (on and off shore), the Great Plains, and the Eastern U.S. The data collection includes the thermal, geological, and geophysical characteristics of these area resources. Types of data include, but are not limited to, temperature, heat flow, thermal conductivity, radiogenic heat production, porosity, permeability, geological structure, core geophysical logs, well tests, estimated reservoir volume, in situ stress, oil and gas well fluid chemistry, oil and gas well information, and conventional and enhanced geothermal system related resources. Libraries of publications and reports are combined into a unified, accessible catalog with links for downloading non-copyrighted items. Field notes, individual temperature logs, site maps, and related resources are included to broaden the data collection. Additional research based on legacy data, undertaken to improve quality, increases our understanding of the local and regional geology and geothermal characteristics. The software to enable the integration, analysis, and dissemination of this team's NGDS contributions was developed by Siemens Corporate Technology. The SMU node interactive application is accessible at http://geothermal.smu.edu. Additionally, files may be downloaded from either http://geothermal.smu.edu:9000/geoserver/web/ or through http://geothermal.smu.edu/static/DownloadFilesButtonPage.htm. The Geothermal Resources Council Library is available at https://www.geothermal-library.org/.

  8. Historical files from Federal government mineral exploration-assistance programs, 1950 to 1974

    USGS Publications Warehouse

    Frank, David G.

    2010-01-01

    Congress enacted the Defense Production Act in 1950 to provide funding and support for the exploration and development of critical mineral resources. From 1950 to 1974, three Department of the Interior agencies carried out this mission. Contracts with mine owners provided financial assistance for mineral exploration on a joint-participation basis. These contracts are documented in more than 5,000 'dockets' now archived online by the U.S. Geological Survey. This archive provides access to unique and difficult to recreate information, such as drill logs, assay results, and underground geologic maps, that is invaluable to land and resource management organizations and the minerals industry. An effort to preserve the data began in 2009, and the entire collection of dockets was electronically scanned. The scanning process used optical character recognition (OCR) when possible, and files were converted into Portable Document Format (.pdf) files, which require Adobe Reader or similar software for viewing. In 2010, the scans were placed online (http://minerals.usgs.gov/dockets/) and are available to download free of charge.

  9. VizieR Online Data Catalog: X-ray sources in Hickson Compact Groups (Tzanavaris+, 2014)

    NASA Astrophysics Data System (ADS)

    Tzanavaris, P.; Gallagher, S. C.; Hornschemeier, A. E.; Fedotov, K.; Eracleous, M.; Brandt, W. N.; Desjardins, T. D.; Charlton, J. C.; Gronwall, C.

    2014-06-01

    By virtue of their selection criteria, Hickson Compact Groups (HCGs) constitute a distinct class among small galaxy agglomerations. The Hickson catalog (Hickson et al. 1992, Cat. VII/213) comprises 92 spectroscopically confirmed nearby compact groups with three or more members with accordant redshifts (i.e., within 1000km/s of the group mean). In this paper we present nine of these groups, for which both archival Chandra X-ray and Swift UVOT ultraviolet data are available. An observation log for the Chandra data is presented in Table 1. An observation log for the Swift UVOT data is presented in Tzanavaris et al. (2010ApJ...716..556T). In addition, note that in the present work we have included UVOT data for HCGs 90 and 92. (3 data files).

  10. 10 CFR 13.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... identity when filing documents and serving participants electronically through the E-Filing system, and... transmitted electronically from the E-Filing system to the submitter confirming receipt of electronic filing... presentation of the docket and a link to its files. E-Filing System means an electronic system that receives...

  11. Self-optimizing Monte Carlo method for nuclear well logging simulation

    NASA Astrophysics Data System (ADS)

    Liu, Lianyan

    1997-09-01

    In order to increase the efficiency of Monte Carlo simulation for nuclear well logging problems, a new method has been developed for variance reduction. With this method, an importance map is generated as a by-product of the regular Monte Carlo calculation, and the importance map is later used to conduct the splitting and Russian roulette for particle population control. By adopting a spatial mesh system that is independent of the physical geometrical configuration, the method allows superior user-friendliness. This new method is incorporated into the general-purpose Monte Carlo code MCNP4A through a patch file. Two nuclear well logging problems, a neutron porosity tool and a gamma-ray lithology density tool, are used to test the performance of this new method. The calculations are sped up over analog simulation by factors of 120 and 2600 for the neutron porosity tool and the gamma-ray lithology density log, respectively. The new method outperforms MCNP's cell-based weight window by a factor of 4-6, as measured by converged figures of merit. An indirect comparison indicates that the new method also outperforms the AVATAR process for gamma-ray density tool problems. Even though it takes quite some time to generate a reasonable importance map from an analog run, a good initial map can create significant CPU time savings. This makes the method especially suitable for nuclear well logging problems, since one or several reference importance maps are usually available for a given tool. The study shows that the spatial mesh sizes should be chosen according to the mean free path. The overhead of the importance map generator is 6% and 14% for the neutron and gamma-ray cases, respectively. The ability to learn toward a correct importance map is also demonstrated. Although false learning may happen, physical judgment aided by contributon maps can help diagnose it. Calibration and analysis are performed for the neutron tool and the gamma-ray tool. Because a very good initial importance map is always available after the first point has been calculated, high computing efficiency is maintained. The availability of contributon maps provides an easy way of understanding the logging measurement and analyzing the depth of investigation.
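
    The splitting/Russian-roulette step driven by the importance map can be sketched in a few lines. This is a generic, weight-conserving population-control routine in Python, not the MCNP4A patch itself.

      import random

      def population_control(weight, ratio):
          """Splitting / Russian roulette for one particle.

          `ratio` is the importance of the entered mesh cell over the old
          one, read from the importance map.  Returns the list of surviving
          statistical weights; the expected total weight is preserved.
          """
          if ratio >= 1.0:                       # split into ~ratio copies
              n = int(ratio)
              if random.random() < ratio - n:    # probabilistic extra copy
                  n += 1
              return [weight / ratio] * n
          # Russian roulette: survive with probability `ratio`
          if random.random() < ratio:
              return [weight / ratio]
          return []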

  12. Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-05

    Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and to provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table, for either later submission or sending via email to a collaborator who has access to the desired database.
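
    A minimal sketch of the automatic capture step, assuming a hypothetical table layout and using SQLite as a stand-in for the local or remote database server that Co-PylotDB actually targets:

      import getpass
      import os
      import socket
      import sqlite3
      from datetime import datetime

      def capture_run_record(comment=""):
          """Collect the fields captured automatically (illustrative only)."""
          return {
              "user": getpass.getuser(),
              "directory": os.getcwd(),
              "hostname": socket.gethostname(),
              "sent_at": datetime.now().isoformat(timespec="seconds"),
              "comment": comment,
          }

      # sqlite3 stands in here for the real database server connection.
      con = sqlite3.connect("runs.db")
      con.execute("CREATE TABLE IF NOT EXISTS runs "
                  "(user, directory, hostname, sent_at, comment)")
      rec = capture_run_record("baseline build")
      con.execute("INSERT INTO runs VALUES "
                  "(:user, :directory, :hostname, :sent_at, :comment)", rec)
      con.commit()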

  13. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files, and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers, and modelers, the data need to be stored in a format that is easily identifiable, accessible, and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry-standard relational database comprises spatial and temporal data tables, shape files, and supporting metadata accessible over the network, through a menu-driven web-based portal, or spatially through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  14. Measuring driver satisfaction with an urban arterial before and after deployment of an adaptive timing signal system

    DOT National Transportation Integrated Search

    2001-02-01

  15. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.
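
    The calibration loop can be sketched generically: distribute hydraulic conductivities by depth, run a forward model, and minimize squared differences between simulated and measured flows. In the toy sketch below, a trivial forward model (layer flow = thickness * K) stands in for MODFLOW, and scipy's least_squares stands in for PEST; all numbers are made up for illustration.

      import numpy as np
      from scipy.optimize import least_squares

      # Toy forward model: flow contribution of each layer is thickness * K.
      thickness = np.array([5.0, 10.0, 5.0])    # layer thicknesses (m)
      measured = np.array([4.0, 24.0, 30.0])    # cumulative flow up the hole

      def residuals(log10_K):
          K = 10.0 ** log10_K                   # estimate K in log space
          simulated = np.cumsum(thickness * K)
          return simulated - measured

      fit = least_squares(residuals, x0=np.zeros(3))
      print("estimated K per layer:", 10.0 ** fit.x)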

  16. Midwest Consortium for Wind Turbine Reliability and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott R. Dana; Douglas E. Adams; Noah J. Myrent

    2012-05-11

    This report provides an overview of efforts to establish a student-focused laboratory apparatus that will enhance Purdue's ability to recruit and train students in topics related to the dynamics, operations, and economics of wind turbines. The project also aims to facilitate outreach to students at Purdue and in grades K-12 in the State of Indiana by sharing wind turbine operational data. For this project, a portable wind turbine test apparatus was developed and fabricated utilizing an AirX 400W wind energy converter. This turbine and test apparatus were outfitted with an array of sensors used to monitor wind speed, turbine rotor speed, power output, and the tower structural dynamics. A major portion of this project included the development of a data logging program used to display real-time sensor data and to record and create output files for data post-processing. The apparatus was tested in an open field to subject the turbine to typical operating conditions, and the data acquisition system was adjusted to obtain the desired functionality for student projects in existing courses offered at Purdue University and Indiana University. Data collected using the data logging program are analyzed and presented to demonstrate the usefulness of the test apparatus for wind turbine dynamics and operations.

  17. DMFS: A Data Migration File System for NetBSD

    NASA Technical Reports Server (NTRS)

    Studenmund, William

    2000-01-01

    I have recently developed DMFS, a Data Migration File System, for NetBSD. This file system provides kernel support for the data migration system being developed by my research group at NASA/Ames. The file system utilizes an underlying file store to provide the file backing, and coordinates user and system access to the files. It stores its internal metadata in a flat file, which resides on a separate file system. This paper will first describe our data migration system to provide a context for DMFS, then it will describe DMFS. It also will describe the changes to NetBSD needed to make DMFS work. Then it will give an overview of the file archival and restoration procedures, and describe how some typical user actions are modified by DMFS. Lastly, the paper will present simple performance measurements which indicate that there is little performance loss due to the use of the DMFS layer.

  18. Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse

    NASA Technical Reports Server (NTRS)

    Gannod, Gerald C.

    2002-01-01

    This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.

  19. Dose calculation and verification of the Vero gimbal tracking treatment delivery

    NASA Astrophysics Data System (ADS)

    Prasetio, H.; Wölfelschneider, J.; Ziegler, M.; Serpa, M.; Witulla, B.; Bert, C.

    2018-02-01

    The Vero linear accelerator delivers dynamic tumor tracking (DTT) treatment using a gimbal motion. However, the availability of treatment planning systems (TPS) to simulate DTT is limited. This study aims to implement and verify the gimbal tracking beam geometry in the dose calculation. Gimbal tracking was implemented by rotating the reference CT outside the TPS according to the ring, gantry, and gimbal tracking position obtained from the tracking log file. The dose was calculated using these rotated CTs. The geometric accuracy was verified by comparing calculated and measured film response using a ball bearing phantom. The dose was verified by comparing calculated 2D dose distributions and film measurements in a ball bearing and a homogeneous phantom using a gamma criterion of 2%/2 mm. The effect of implementing the gimbal tracking beam geometry in a 3D patient data dose calculation was evaluated using dose volume histograms (DVH). Geometrically, the gimbal tracking implementation accuracy was <0.94 mm. The isodose lines agreed with the film measurement. The largest dose difference of 9.4% was observed at maximum tilt positions with an isocenter and target separation of 17.51 mm. Dosimetrically, gamma passing rates were >98.4%. The introduction of the gimbal tracking beam geometry in the dose calculation shifted the DVH curves by 0.05%-1.26% for the phantom geometry and by 5.59% for the patient CT dataset. This study successfully demonstrates a method to incorporate the gimbal tracking beam geometry into dose calculations. By combining CT rotation and MU distribution according to the log file, the TPS was able to simulate the Vero tracking treatment dose delivery. The DVH analysis from the gimbal tracking dose calculation revealed changes in the dose distribution during gimbal DTT that are not visible with static dose calculations.
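
    The core of the method, rotating the reference CT by the ring, gantry, and gimbal angles read from the tracking log, amounts to composing rotation matrices per log sample. The Python sketch below illustrates that composition; the axis assignments and sign conventions are assumptions for illustration and would have to be replaced by the vendor's coordinate-system definition.

      import numpy as np

      def rot(axis, deg):
          """Rotation matrix about a principal axis (0=x, 1=y, 2=z).
          Sign conventions vary with the chosen axis definitions."""
          c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
          m = np.eye(3)
          i, j = [(1, 2), (0, 2), (0, 1)][axis]
          m[i, i], m[i, j], m[j, i], m[j, j] = c, -s, s, c
          return m

      def ct_rotation(ring_deg, gantry_deg, pan_deg, tilt_deg):
          """Compose one beam-geometry rotation from a tracking-log sample."""
          return (rot(2, ring_deg) @ rot(1, gantry_deg)
                  @ rot(2, pan_deg) @ rot(0, tilt_deg))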

  20. New method for calculating a mathematical expression for streamflow recession

    USGS Publications Warehouse

    Rutledge, Albert T.

    1991-01-01

    An empirical method has been devised to calculate the master recession curve, which is a mathematical expression for streamflow recession during times of negligible direct runoff. The method is based on the assumption that the storage-delay factor, which is the time per log cycle of streamflow recession, varies linearly with the logarithm of streamflow. The resulting master recession curve can be nonlinear. The method can be executed by a computer program that reads a data file of daily mean streamflow, then allows the user to select several near-linear segments of streamflow recession. The storage-delay factor for each segment is one of the coefficients of the equation that results from linear least-squares regression. Using results for each recession segment, a mathematical expression of the storage-delay factor as a function of the log of streamflow is determined by linear least-squares regression. The master recession curve, which is a second-order polynomial expression for time as a function of log of streamflow, is then derived using the coefficients of this function.
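
    The two regression steps can be sketched directly: fit a storage-delay factor K (days per log cycle) to each near-linear recession segment, then fit K linearly against log Q, and integrate to obtain the second-order master curve. The segment data below are synthetic, and numpy's polyfit stands in for the least-squares regressions described.

      import numpy as np

      # Each recession segment: (days, flow) arrays of daily mean streamflow.
      segments = [
          (np.arange(10.0), 100.0 * 10 ** (-np.arange(10.0) / 40.0)),
          (np.arange(10.0), 10.0 * 10 ** (-np.arange(10.0) / 70.0)),
      ]

      K, logQ = [], []
      for t, q in segments:
          # Storage-delay factor: days per log cycle (slope of t vs log Q,
          # sign dropped, from linear least-squares regression).
          slope = np.polyfit(np.log10(q), t, 1)[0]
          K.append(-slope)
          logQ.append(np.log10(q).mean())

      # K varies linearly with log Q:  K = a*logQ + b.
      a, b = np.polyfit(logQ, K, 1)

      # Since dt/d(logQ) = -K(logQ), integrating gives time as a
      # second-order polynomial in log Q (the master recession curve).
      c2, c1 = -a / 2.0, -b
      print(f"t(logQ) = {c2:.2f}*logQ^2 + {c1:.2f}*logQ + const")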

  1. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs are kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment is correct. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected of other equipment used in the diagnostic process.

  2. Balloon logging with the inverted skyline

    NASA Technical Reports Server (NTRS)

    Mosher, C. F.

    1975-01-01

    There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system needs to be developed before aerial logging will become effective and accepted in the logging industry. This paper presents such a system, designed on simple principles, with realistic cost and ecological benefits.

  3. Logging Student Learning via a Puerto Rico-based Geologic Mapping Game on the Google Earth Virtual Globe

    NASA Astrophysics Data System (ADS)

    Gobert, J.; Toto, E.; Wild, S. C.; Dordevic, M. M.; De Paor, D. G.

    2013-12-01

    A hindrance to migrating undergraduate geoscience courses online is the challenge of giving students a quasi-authentic field experience. As part of an NSF TUES Type 2 project (NSF-DUE 1022755), we addressed this challenge by designing a Google Earth (GE) mapping game centered on Puerto Rico, a place chosen both to connect with underrepresented minorities and because its simple geologic divisions minimize map complexity. The game invites student groups to explore the island and draw a geological map with these divisions: Rugged Volcanic Terrain, Limestone Karst Topography, and Surficial Sands & Gravels. Students, represented as avatars via COLLADA models and the GE browser plugin, can move about, text fellow students, and click a 'drill here' button that tells them what lies underground. They need to learn to read the topography because the number of holes they can drill is limited to 30. Then, using the GE Polygon tool, they create a map, aided by a custom 'snapping' algorithm that stitches adjacent contacts, preventing gaps and overlaps, and they submit this map for evaluation by their instructor, an evaluation we purposefully did not automate. Initially we assigned students to groups of four and gave each group a field vehicle avatar with a designated driver; however, students hated the experience unless they were the designated driver, so we revised the game to allow all students to roam independently, while retaining the mutual texting feature among students in groups. We implemented the activity with undergraduates from a university in the southeastern USA. All student movements and actions on the GE terrain were logged. We wrote algorithms to evaluate student learning processes via log files, including, but not limited to, the number of places drilled and their locations. Pre/post gains were examined, as well as correlations between data from log files and pre/post data. There was a small but statistically significant pre-to-post gain, including a positive correlation between diagram-based post-test questions and: 1) the total number of drills; 2) the number of correct within-polygon identifications (evidently, those who did more drilling inside polygons and drew boundaries accordingly learned more; drills mistakenly plotted outside formation polygons were negatively correlated with extra post-test questions, but this was not statistically significant, likely due to low statistical power because few students did this); and 3) the average distance between drills (students whose drill holes were farther apart learned more; this makes sense, since more information can be gleaned this way, and it may also indicate a skilled learning strategy, because there is little point in doing close or overlapping drills when the permitted number is small and the region is large). No significant correlation between pre-test score and diagram-based post-test questions was found; this suggests that prior knowledge does not account for the above correlations. Data will be discussed with respect to GE's utility for conveying geoscience principles to geology undergraduates, as well as the affordances of analyzing students' log files in order to better understand their learning processes.
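
    The log-file measures described (drill counts, within-polygon hits, average spacing between drills) are simple to compute once actions are logged. Here is a generic Python sketch, assuming a hypothetical JSON-lines log format rather than the project's actual one:

      import json
      from itertools import combinations
      from math import dist

      def drill_stats(log_path):
          """Summarize drill actions from a JSON-lines log in which each
          record has action, x, y, and inside_polygon fields (hypothetical)."""
          drills = []
          with open(log_path) as f:
              for line in f:
                  event = json.loads(line)
                  if event["action"] == "drill":
                      drills.append(event)
          pts = [(d["x"], d["y"]) for d in drills]
          pairs = list(combinations(pts, 2))
          return {
              "total_drills": len(drills),
              "correct_in_polygon": sum(d["inside_polygon"] for d in drills),
              "mean_pair_distance": (sum(dist(p, q) for p, q in pairs)
                                     / len(pairs)) if pairs else 0.0,
          }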

  4. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
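
    The patent's core idea, packing small files into one aggregated file and keeping (offset, length) metadata for unpacking, can be illustrated with a short generic Python sketch; a JSON side file stands in here for whatever metadata store the real system uses.

      import json
      import os

      def aggregate(paths, blob_path):
          """Pack many small files into one file plus (offset, length) metadata."""
          index = {}
          with open(blob_path, "wb") as out:
              for path in paths:
                  with open(path, "rb") as f:
                      data = f.read()
                  index[os.path.basename(path)] = (out.tell(), len(data))
                  out.write(data)
          with open(blob_path + ".idx", "w") as meta:
              json.dump(index, meta)

      def unpack(blob_path, name):
          """Recover one original file from the aggregated file via its metadata."""
          with open(blob_path + ".idx") as meta:
              offset, length = json.load(meta)[name]
          with open(blob_path, "rb") as blob:
              blob.seek(offset)
              return blob.read(length)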

  5. Archive of Digital boomer subbottom data collected during USGS cruises 99FGS01 and 99FGS02 offshore southeast and southwest Florida, July and November, 1999

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.

    2013-01-01

    In July (19 - 26) and November (17 - 18) of 1999, the USGS, in cooperation with the Florida Geological Survey (FGS), conducted two geophysical surveys in: (1) the Atlantic Ocean offshore of Florida's east coast from Orchid to Jupiter, FL, and (2) the Gulf of Mexico offshore of Venice, FL. This report serves as an archive of unprocessed digital boomer subbottom data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (showing a relative increase in signal amplitude) digital images of the subbottom profiles are also provided. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, identifiers 99FGS01 and 99FGS02 refer to field data collected in 1999 for cooperative work with the FGS. The numbers 01 and 02 indicate the data were collected during the first and second field activities for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID).

  6. TH-A-9A-10: Prostate SBRT Delivery with Flattening-Filter-Free Mode: Benefit and Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, T; Yuan, L; Sheng, Y

    Purpose: Flattening-filter-free (FFF) beam mode, offered on the TrueBeam™ linac, enables delivering IMRT at a 2400 MU/min dose rate. This study investigates the benefit and delivery accuracy of using a high dose rate in the context of prostate SBRT. Methods: 8 prostate SBRT patients were retrospectively studied. In 5 cases treated with a 600-MU/min dose rate, continuous prostate motion data acquired during radiation-beam-on was used to analyze the motion range. In addition, the initial 1/3 of the prostate motion trajectories during each radiation-beam-on was separated out to simulate the motion range if 2400 MU/min were used. To analyze delivery accuracy in FFF mode, MLC trajectory log files from an additional 3 cases treated at 2400 MU/min were acquired. These log files record MLC expected and actual positions every 20 ms, and can therefore be used to assess delivery accuracy. Results: (1) Benefit. On average, treatment at 600 MU/min takes 30 s per beam, whereas 2400 MU/min requires only 11 s. When delivery time was shortened to ~1/3, the prostate motion range was significantly smaller (p<0.001). The largest motion reduction occurred in the Sup-Inf direction, from [-3.3 mm, 2.1 mm] to [-1.7 mm, 1.7 mm], followed by a reduction from [-2.1 mm, 2.4 mm] to [-1.0 mm, 2.4 mm] in the Ant-Pos direction. No change was observed in the LR direction [-0.8 mm, 0.6 mm]. The combined motion amplitude (vector norm) confirms that the average motion and ranges are significantly smaller when beam-on was limited to the first 1/3 of the actual delivery time. (2) Accuracy. Trajectory log file analysis showed excellent delivery accuracy at 2400 MU/min. Most leaf deviations during beam-on were within 0.07 mm (99th percentile). Maximum leaf-opening deviations during each beam-on were all under 0.1 mm for all leaves. The dose rate was maintained at 2400 MU/min during beam-on without dipping. Conclusion: Delivering prostate SBRT at 2400 MU/min is both beneficial and accurate. High dose rates significantly reduced both treatment time and the intra-beam prostate motion range. Excellent delivery accuracy was confirmed, with very small leaf motion deviations.

  7. PIYAS-proceeding to intelligent service oriented memory allocation for flash based data centric sensor devices in wireless sensor networks.

    PubMed

    Rizvi, Sanam Shahla; Chung, Tae-Sun

    2010-01-01

    Flash memory has become a widespread storage medium for modern wireless devices because of its effective characteristics: non-volatility, small size, light weight, fast access speed, shock resistance, high reliability, and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth, and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge, and send schemes, an efficient and reliable file system is highly desirable, with consideration of sensor node constraints. In this paper, we propose a novel log-structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM usage by keeping the memory mapping information very small, and to provide high query response throughput by allocating memory to sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any previous scheme. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
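
    A toy Python sketch of the out-of-place update and garbage-collection pattern common to log-structured flash file systems; it illustrates the small in-RAM mapping table the paper aims for, and is not the PIYAS implementation.

      class LogStructuredStore:
          """Toy append-only flash log with a tiny in-RAM mapping table."""

          def __init__(self):
              self.log = []        # flash pages, append-only
              self.mapping = {}    # logical sector -> newest log position

          def write(self, sector, data):
              self.mapping[sector] = len(self.log)
              self.log.append((sector, data))   # out-of-place update

          def read(self, sector):
              return self.log[self.mapping[sector]][1]

          def garbage_collect(self):
              """Drop stale versions; live pages are found via the mapping."""
              live = [(s, d) for i, (s, d) in enumerate(self.log)
                      if self.mapping.get(s) == i]
              self.log = live
              self.mapping = {s: i for i, (s, _) in enumerate(live)}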

  8. Collective operations in a file system based execution model

    DOEpatents

    Shinde, Pravin; Van Hensbergen, Eric

    2013-02-12

    A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.

  9. Collective operations in a file system based execution model

    DOEpatents

    Shinde, Pravin; Van Hensbergen, Eric

    2013-02-19

    A mechanism is provided for group communications using a MULTI-PIPE synthetic file system. A master application creates a multi-pipe synthetic file in the MULTI-PIPE synthetic file system, the master application indicating a multi-pipe operation to be performed. The master application then writes a header-control block of the multi-pipe synthetic file specifying at least one of a multi-pipe synthetic file system name, a message type, a message size, a specific destination, or a specification of the multi-pipe operation. Any other application participating in the group communications then opens the same multi-pipe synthetic file. A MULTI-PIPE file system module then implements the multi-pipe operation as identified by the master application. The master application and the other applications then either read or write operation messages to the multi-pipe synthetic file and the MULTI-PIPE synthetic file system module performs appropriate actions.

  10. Design and Implementation of a Metadata-rich File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, S; Gokhale, M B; Maltzahn, C

    2010-01-19

    Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
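
    A toy illustration of the graph data model: files carry user-defined attributes, relationships are first-class edges, and a query can filter on attributes and then follow a relationship. Plain Python dictionaries stand in for QFS here, and the query function is not the Quasar language.

      # Files, user-defined attributes, and relationships as first-class data.
      files = {
          "run42.h5":  {"experiment": "wind-tunnel", "quality": "good"},
          "notes.txt": {"experiment": "wind-tunnel"},
      }
      links = [("notes.txt", "describes", "run42.h5")]

      def query(attr, value, relation=None):
          """Find files by attribute, optionally following one relationship."""
          hits = {f for f, attrs in files.items() if attrs.get(attr) == value}
          if relation is None:
              return hits
          return {dst for src, rel, dst in links
                  if rel == relation and src in hits}

      print(query("experiment", "wind-tunnel", relation="describes"))
      # -> {'run42.h5'}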

  11. Geologic Map of Prescott National Forest and the Headwaters of the Verde River, Yavapai and Coconino Counties, Arizona

    USGS Publications Warehouse

    DeWitt, Ed; Langenheim, V.E.; Force, Eric; Vance, R.K.; Lindberg, P.A.; Driscoll, R.L.

    2008-01-01

    This 1:100,000-scale digital geologic map details the complex Early Proterozoic metavolcanic and plutonic basement of north-central Arizona; shows the mildly deformed cover of Paleozoic rocks; reveals where Laramide to mid-Tertiary plutonic rocks associated with base- and precious-metal deposits are exposed; subdivides the Tertiary volcanic rocks according to chemically named units; and maps the Pliocene to Miocene fill of major basins. Associated digital files include more than 1,300 geochemical analyses of all rock units; 1,750 logs of water wells deeper than 300 feet; and interpreted logs of 300 wells that define the depth to basement in major basins. Geophysically interpreted buried features include normal faults defining previously unknown basins, mid-Tertiary intrusive rocks, and half-grabens within shallow basins.

  12. A Job Monitoring and Accounting Tool for the LSF Batch System

    NASA Astrophysics Data System (ADS)

    Sarkar, Subir; Taneja, Sonia

    2011-12-01

    This paper presents a web-based job monitoring and group-and-user accounting tool for the LSF batch system. The user-oriented job monitoring displays a simple and compact quasi-real-time overview of the batch farm for both local and Grid jobs. For Grid jobs, the Distinguished Name (DN) of the Grid user is shown. The overview monitor provides the most up-to-date status of a batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default; however, one can also compute the same information for any arbitrary time window. The tool has already proved to be an extremely useful means of validating more extensive accounting tools available in the Grid world. Several sites are already using the present tool, and more sites running the LSF batch system have shown interest. We discuss the various aspects that make the tool essential for site administrators and end users alike, and outline the current status of development as well as future plans.
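
    Aggregating accounting information over an arbitrary time window reduces to filtering finished-job records by timestamp and summing per group. The Python sketch below assumes the records have already been parsed out of the LSF accounting log; the parsing itself is omitted, and the tuple layout is hypothetical.

      from datetime import datetime

      def usage_by_group(records, start, end):
          """Aggregate CPU seconds per group over an arbitrary time window.

          `records` is an iterable of (finish_time, group, cpu_seconds)
          tuples already extracted from the accounting log.
          """
          totals = {}
          for finished, group, cpu in records:
              if start <= finished < end:
                  totals[group] = totals.get(group, 0.0) + cpu
          return totals

      records = [(datetime(2011, 3, 2, 14, 0), "cms", 3600.0),
                 (datetime(2011, 3, 9, 9, 30), "atlas", 7200.0)]
      print(usage_by_group(records, datetime(2011, 3, 1), datetime(2011, 3, 8)))
      # -> {'cms': 3600.0}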

  13. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically, during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated and implemented either as on-board monitors or as ground-based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which systems engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can be applied, for example, to the analysis of log files or to monitoring executing systems online.
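
    As a flavor of what an executable flight-rule specification checks, here is a small Python monitor for one hypothetical rule (every command must be acknowledged before the end of a pass). It mimics the spirit of trace analysis but is not TRACECONTRACT or its Scala DSL.

      def check_rule(events):
          """Monitor a hedged example rule over a telemetry trace: every
          'command' event must be followed by an 'ack' for the same id
          before 'end_of_pass'.  Returns the ids that violate the rule."""
          pending = set()
          for kind, ident in events:
              if kind == "command":
                  pending.add(ident)
              elif kind == "ack":
                  pending.discard(ident)
              elif kind == "end_of_pass" and pending:
                  return sorted(pending)
          return sorted(pending)

      trace = [("command", 7), ("ack", 7), ("command", 9), ("end_of_pass", 0)]
      print(check_rule(trace))   # -> [9]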

  14. Secretary | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research programs in basic or applied molecular biology, immunology, retrovirology, cancer biology, or human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The BSP Office provides procurement and logistical assistance in support of the research activities of the Center for Cancer Research. KEY ROLES/RESPONSIBILITIES: The Secretary III will: provide heavy-volume procurement support to a large customer base of laboratory staff, both Leidos Biomed and CCR (government), using blanket orders, purchase requisitions, credit cards, and the online warehouse system; perform data entry into the appropriate financial system component (CostPoint, Cor360), status checks on orders, maintenance of the orders log, reconciliation of credit card transactions, and maintenance of electronic filing systems; provide logistical support for the facilitation of travel packages (both pre-travel and post-travel) for Leidos Biomed employees, as well as the coordination of seminar speakers and subsequent reimbursements; compose and answer emails and correspondence; and communicate with all levels of personnel, both verbally and in writing, to gather and clearly convey information.

  15. Methods and apparatus for multi-resolution replication of files in a parallel computing system using semantic information

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Torres, Aaron

    2015-10-20

    Techniques are provided for storing files in a parallel computing system using different resolutions. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a sub-file. The method comprises the steps of obtaining semantic information related to the file; generating a plurality of replicas of the file with different resolutions based on the semantic information; and storing the file and the plurality of replicas of the file in one or more storage nodes of the parallel computing system. The different resolutions comprise, for example, a variable number of bits and/or a different sub-set of data elements from the file. A plurality of the sub-files can be merged to reproduce the file.
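
    A minimal sketch of the resolution idea, assuming a file modeled as a vector of numeric samples: each replica keeps every stride-th data element, so lower-resolution replicas are cheaper to store and fetch. The names are illustrative, and the semantic-information-driven choice of resolutions described in the patent is not modeled here.

        // Sketch: replicas keeping every 1st, 2nd, and 4th data element.
        // Reducing bit depth (the "variable number of bits" option) would be analogous.
        def replicas(data: Vector[Double], strides: Seq[Int] = Seq(1, 2, 4)): Seq[Vector[Double]] =
          strides.map(s => data.zipWithIndex.collect { case (x, i) if i % s == 0 => x })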

  16. High-throughput determination of octanol/water partition coefficients using a shake-flask method and novel two-phase solvent system.

    PubMed

    Morikawa, Go; Suzuka, Chihiro; Shoji, Atsushi; Shibusawa, Yoichi; Yanagida, Akio

    2016-01-05

    A high-throughput method for determining the octanol/water partition coefficient (P(o/w)) of a large variety of compounds exhibiting a wide range in hydrophobicity was established. The method combines a simple shake-flask method with a novel two-phase solvent system comprising an acetonitrile-phosphate buffer (0.1 M, pH 7.4)-1-octanol (25:25:4, v/v/v; AN system). The AN system partition coefficients (K(AN)) of 51 standard compounds for which log P(o/w) (at pH 7.4; log D) values had been reported were determined by single two-phase partitioning in test tubes, followed by measurement of the solute concentration in both phases using an automatic flow injection-ultraviolet detection system. The log K(AN) values were closely related to reported log D values, and the relationship could be expressed by the following linear regression equation: log D = 2.8630 log K(AN) - 0.1497 (n = 51). The relationship reveals that log D values (+8 to -8) for a large variety of highly hydrophobic and/or hydrophilic compounds can be estimated indirectly from the narrow range of log K(AN) values (+3 to -3) determined using the present method. Furthermore, log K(AN) values for highly polar compounds for which no log D values have been reported, such as amino acids, peptides, proteins, nucleosides, and nucleotides, can be estimated using the present method. The wide-ranging log D values (+5.9 to -7.5) of these molecules were estimated for the first time from their log K(AN) values and the above regression equation. Copyright © 2015 Elsevier B.V. All rights reserved.
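
    Applying the reported regression is simple arithmetic; the sketch below (function name illustrative) estimates log D from a measured log K(AN) and shows how the narrow log K(AN) range spans the wide log D range.

        // Reported regression: log D = 2.8630 * log K_AN - 0.1497 (n = 51).
        def logD(logKan: Double): Double = 2.8630 * logKan - 0.1497

        val upper = logD(3.0)   // ≈ +8.4, near the top of the reported log D range
        val lower = logD(-3.0)  // ≈ -8.7, near the bottom of the reported range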

  17. Reciproc versus Twisted file for root canal filling removal: assessment of apically extruded debris.

    PubMed

    Altunbas, Demet; Kutuk, Betul; Toyoglu, Mustafa; Kutlu, Gizem; Kustarci, Alper; Er, Kursat

    2016-01-01

    The aim of this study was to evaluate the amount of apically extruded debris during endodontic retreatment with different file systems. Sixty extracted human mandibular premolar teeth were used in this study. Root canals of the teeth were instrumented and filled before being randomly assigned to three groups. Gutta-percha was removed using the Reciproc system, the Twisted File system (TF), and Hedström files (H-file). Apically extruded debris was collected and dried in pre-weighed Eppendorf tubes. The amount of extruded debris was assessed with an electronic balance. Data were statistically analyzed using one-way ANOVA, Kruskal-Wallis, and Mann-Whitney U tests. The Reciproc and TF systems extruded significantly less debris than the H-file (p < 0.05). However, no significant difference was found between the Reciproc and TF systems. All tested file systems caused apical extrusion of debris. Both the rotary file (TF) and the reciprocating single-file (Reciproc) systems were associated with less apical extrusion compared with the H-file.

  18. SU-F-T-465: Two Years of Radiotherapy Treatments Analyzed Through MLC Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Defoor, D; Kabat, C; Papanikolaou, N

    Purpose: To present treatment statistics of a Varian Novalis Tx using more than 90,000 Varian Dynalog files collected over the past 2 years. Methods: Varian Dynalog files are recorded for every patient treated on our Varian Novalis Tx. The files are collected and analyzed daily to check interfraction agreement of treatment deliveries. This is accomplished by creating fluence maps from the data contained in the Dynalog files. From the Dynalog files we have also compiled statistics for treatment delivery times, MLC errors, gantry errors, and collimator errors. Results: The mean treatment time for VMAT patients was 153 ± 86 seconds, while the mean treatment time for step & shoot was 256 ± 149 seconds. Patients' treatment times showed a variation of 0.4% over their treatment course for VMAT and 0.5% for step & shoot. The average field sizes were 40 cm² and 26 cm² for VMAT and step & shoot, respectively. VMAT beams contained an average overall leaf travel of 34.17 meters; step & shoot beams averaged less than half of that, at 15.93 meters. When comparing planned and delivered fluence maps generated from the Dynalog files, VMAT plans showed an average gamma passing percentage of 99.85 ± 0.47, and step & shoot plans 97.04 ± 0.04. 5.3% of beams contained an MLC error greater than 1 mm, and 2.4% had an error greater than 2 mm. The mean gantry speed for VMAT plans was 1.01 degrees/s, with a maximum of 6.5 degrees/s. Conclusion: Varian Dynalog files are useful for monitoring machine performance and treatment parameters. The Dynalog files have shown that the performance of the Novalis Tx is consistent over the course of a patient's treatment, with only slight variations in patient treatment times and a low rate of MLC errors.
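
    Per-technique statistics like those above come from aggregating many per-beam records. A hypothetical sketch of that aggregation follows; the record fields are illustrative and are not the Dynalog file format.

        // Hypothetical per-beam summary distilled from one Dynalog file.
        final case class BeamStats(technique: String, treatTimeSec: Double, maxMlcErrMm: Double)

        // Mean treatment time and MLC-error rate (> 1 mm) per delivery technique.
        def summarize(beams: Seq[BeamStats]): Map[String, (Double, Double)] =
          beams.groupBy(_.technique).map { case (tech, bs) =>
            val meanTime = bs.map(_.treatTimeSec).sum / bs.size
            val errRate  = bs.count(_.maxMlcErrMm > 1.0).toDouble / bs.size
            (tech, (meanTime, errRate))
          }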

  19. Parental perceptions of the learner driver log book system in two Australian states.

    PubMed

    Bates, Lyndel; Watson, Barry; King, Mark Johann

    2014-01-01

    Though many jurisdictions internationally now require learner drivers to complete a specified number of hours of supervised driving practice before being able to drive unaccompanied, very few require learner drivers to complete a log book to record this practice and then present it to the licensing authority. Learner drivers in most Australian jurisdictions must complete a log book that records their practice, thereby confirming to the licensing authority that they have met the mandated hours of practice requirement. These log books facilitate the management and enforcement of minimum supervised hours of driving requirements. Parents of learner drivers in 2 Australian states, Queensland and New South Wales, completed an online survey assessing a range of factors, including their perceptions of the accuracy of their child's learner log book and the effectiveness of the log book system. The study indicates that the large majority of parents believe that their child's learner log book is accurate. However, they generally report that the log book system is only moderately effective as a system to measure the number of hours of supervised practice a learner driver has completed. The results of this study suggest the presence of a paradox, with many parents possibly believing that others are not as diligent in the use of log books as they are or that the system is too open to misuse. Given that many parents report that their child's log book is accurate, this study has important implications for the development and ongoing monitoring of hours of practice requirements in graduated driver licensing systems.

  20. Geometric Verification of Dynamic Wave Arc Delivery With the Vero System Using Orthogonal X-ray Fluoroscopic Imaging.

    PubMed

    Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Gevaert, Thierry; Depuydt, Tom; Tournel, Koen; Hung, Cecilia; Simon, Viorica; Hiraoka, Masahiro; de Ridder, Mark

    2015-07-15

    The purpose of this study was to define an independent verification method based on on-board orthogonal fluoroscopy to determine the geometric accuracy of synchronized gantry-ring (G/R) rotations during dynamic wave arc (DWA) delivery available on the Vero system. A verification method for DWA was developed to calculate G/R positional information from ball-bearing positions retrieved from fluoroscopic images of a cubic phantom acquired during DWA delivery. Different noncoplanar trajectories were generated in order to investigate the influence of path complexity on delivery accuracy. The G/R positions detected from the fluoroscopy images (DetPositions) were benchmarked against the G/R angulations retrieved from the control points (CP) of the DWA RT plan and from the DWA log files recorded by the treatment console during DWA delivery (LogActed). The G/R rotational accuracy was quantified as the mean absolute deviation ± standard deviation. The maximum G/R absolute deviation was calculated as the maximum 3-dimensional distance between the CP and the closest DetPositions. In the CP versus DetPositions comparison, an overall mean G/R deviation of 0.13°/0.16° ± 0.16°/0.16° was obtained, with a maximum G/R deviation of 0.6°/0.2°. For the LogActed versus DetPositions evaluation, the overall mean deviation was 0.08°/0.15° ± 0.10°/0.10°, with a maximum G/R deviation of 0.3°/0.4°. The largest decoupled deviations registered for gantry and ring were 0.6° and 0.4°, respectively. No directional dependence was observed between clockwise and counterclockwise rotations. Doubling the dose doubled the number of detected points around each CP and reduced the angular deviation in all cases. An independent geometric quality assurance approach was developed for DWA delivery verification and was successfully applied to diverse trajectories. Results showed that the Vero system is capable of following complex G/R trajectories, with maximum deviations during DWA below 0.6°. Copyright © 2015 Elsevier Inc. All rights reserved.
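
    The deviation metrics used above reduce to simple arithmetic once planned and detected positions are paired. The sketch below assumes index-paired samples (a simplification of the closest-point pairing described in the study) and illustrative names.

        // Gantry/ring sample, in degrees.
        final case class GR(gantry: Double, ring: Double)

        // Mean absolute gantry and ring deviations, plus the maximum combined
        // (2-D) deviation, between index-paired planned and detected positions.
        def grDeviations(planned: Seq[GR], detected: Seq[GR]): (Double, Double, Double) = {
          val pairs = planned.zip(detected)
          val dG = pairs.map { case (p, d) => math.abs(p.gantry - d.gantry) }
          val dR = pairs.map { case (p, d) => math.abs(p.ring - d.ring) }
          val maxCombined = dG.zip(dR).map { case (g, r) => math.hypot(g, r) }.max
          (dG.sum / dG.size, dR.sum / dR.size, maxCombined)
        }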
