Bent, John M.; Faibish, Sorin; Grider, Gary
2016-04-19
Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
Bent, John M.; Faibish, Sorin; Grider, Gary
2015-06-30
Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
Parallel checksumming of data chunks of a shared data object using a log-structured file system
Bent, John M.; Faibish, Sorin; Grider, Gary
2016-09-06
Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
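The write/read protocol above can be sketched as follows. The checksum algorithm is not specified in the entry; `zlib.crc32` is an illustrative stand-in, and the dict-backed store is a placeholder for the storage node.

```python
# Sketch of per-chunk checksumming: the client computes a checksum before
# the write and re-verifies it on read. zlib.crc32 is an assumption; the
# source does not name a checksum algorithm.
import zlib

def write_chunk(store, offset, data):
    """Store the chunk together with its checksum, as the client would."""
    store[offset] = (data, zlib.crc32(data))

def read_chunk(store, offset):
    """Read a chunk back and verify its integrity against the checksum."""
    data, checksum = store[offset]
    if zlib.crc32(data) != checksum:
        raise IOError(f"checksum mismatch for chunk at offset {offset}")
    return data
```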
Cooperative storage of shared files in a parallel computing system with dynamic block size
Bent, John M.; Faibish, Sorin; Grider, Gary
2015-11-10
Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
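The block-size rule quoted above (total data divided by the number of processes) can be written down directly; the surplus/deficit helper showing how much a process must exchange is an illustrative addition, not part of the patent text.

```python
# Minimal sketch of the dynamic block-size rule: block size is the total
# amount of data divided by the number of parallel processes. The exchange
# helper is a hypothetical illustration of the data-shuffling step.
def dynamic_block_size(total_bytes, num_procs):
    """Block size per the rule quoted in the abstract."""
    return total_bytes // num_procs

def bytes_to_exchange(local_bytes, total_bytes, num_procs):
    """Surplus (positive) or deficit (negative) relative to one full block."""
    return local_bytes - dynamic_block_size(total_bytes, num_procs)
```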
Storage of sparse files using parallel log-structured file system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Grider, Gary
A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
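The index entry described above (logical offset, physical offset, length) is enough to restore holes on read, as a small sketch shows. The tuple layout and zero-filled holes are assumptions consistent with the abstract, not the patent's exact on-disk format.

```python
# Sketch of hole restoration from PLFS-style index entries. Each entry is
# (logical_offset, physical_offset, length); holes are never stored and
# read back as zero bytes. The tuple format is an illustrative assumption.
def read_sparse(index, log, logical_size):
    """Rebuild the full hole-filled byte string from stored data portions."""
    out = bytearray(logical_size)  # unwritten regions stay zero (the holes)
    for logical_off, physical_off, length in index:
        out[logical_off:logical_off + length] = log[physical_off:physical_off + length]
    return bytes(out)
```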
SU-E-T-142: Automatic Linac Log File: Analysis and Reporting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gainey, M; Rothe, T
Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive (figure). A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within ±0.10 mm: 57% within ±0.01 mm; 89% within ±0.05 mm. Mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
Comparing Web and Touch Screen Transaction Log Files
Huntington, Paul; Williams, Peter
2001-01-01
Background Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
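One of the two metrics the study singles out as comparable across platforms, user sessions per hour, is straightforward to compute once sessions have been reconstructed from the transaction log; the input format below (a list of session start hours) is an assumption for the sketch.

```python
# Illustrative computation of sessions-per-hour, one of the two metrics the
# study found potentially comparable between kiosk and Web logs. The input
# representation is an assumption; real logs need sessionization first.
from collections import Counter

def sessions_per_hour(session_start_hours):
    """Count user sessions falling in each hour-of-day bucket (0-23)."""
    return Counter(session_start_hours)
```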
ERIC Educational Resources Information Center
Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm
2016-01-01
Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
NASA Technical Reports Server (NTRS)
Reardon, John E.; Violett, Duane L., Jr.
1991-01-01
The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.
Parallel compression of data chunks of a shared data object using a log-structured file system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin; Grider, Gary
2016-10-25
Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File System techniques. The compressed data chunk can be decompressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node; and storing the compressed version of the data chunk to the shared data object on the storage node.
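The client-side write/read path above reduces to compress-before-send and decompress-on-read; a minimal sketch, with `zlib` standing in for whatever codec an actual implementation would choose and a dict standing in for the storage node:

```python
# Sketch of parallel per-chunk compression: the client compresses each data
# chunk before providing it to the storage node, and decompresses on read.
# zlib is an illustrative codec choice, not one named by the source.
import zlib

def client_write(store, offset, chunk):
    """Compress the chunk on the client, then hand it to the store."""
    store[offset] = zlib.compress(chunk)

def client_read(store, offset):
    """Fetch the compressed chunk and decompress it on the client."""
    return zlib.decompress(store[offset])
```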
Parallel file system with metadata distributed across partitioned key-value store
Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron
2017-09-19
Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
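The partitioned data store described above routes each metadata key to one of several per-node partitions; a hash-based routing sketch, with the CRC-based partition function an illustrative assumption (MDHIM's actual key distribution is range- and configuration-dependent):

```python
# Sketch of distributing shared-file metadata across partitioned key-value
# stores, in the spirit of the MDHIM-style design above. The CRC-based
# partition function is an assumption for illustration.
import zlib

def partition_for(key, num_partitions):
    """Route a metadata key deterministically to one partition."""
    return zlib.crc32(key.encode()) % num_partitions

def put_metadata(partitions, key, value):
    """Store one metadata record in the partition its key hashes to."""
    partitions[partition_for(key, len(partitions))][key] = value
```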
Zebra: A striped network file system
NASA Technical Reports Server (NTRS)
Hartman, John H.; Ousterhout, John K.
1992-01-01
The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong to. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
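Zebra's availability mechanism, parity maintained over stripe fragments, is classic XOR parity: any one lost fragment equals the XOR of the survivors and the parity block. A minimal sketch (equal-length fragments assumed):

```python
# Sketch of RAID-style XOR parity over stripe fragments, as Zebra maintains
# for availability: losing any single fragment, it can be rebuilt from the
# remaining fragments plus the parity block. Equal lengths are assumed.
def parity(fragments):
    """XOR all fragments together into one parity block."""
    out = bytearray(len(fragments[0]))
    for frag in fragments:
        for i, b in enumerate(frag):
            out[i] ^= b
    return bytes(out)

def reconstruct(surviving_fragments, parity_block):
    """Recover the single missing fragment from survivors plus parity."""
    return parity(surviving_fragments + [parity_block])
```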
Log-less metadata management on metadata server for parallel file systems.
Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning
2014-01-01
This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the metadata requests it has sent, which have already been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service and better metadata processing performance. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than that incurred by a metadata server that adopts logging or journaling to provide a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or unexpectedly entered a nonoperational state.
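The recovery idea above, rebuilding MDS state by replaying the request backups cached at clients, can be sketched with a toy operation log; the `("set"/"delete", key, value)` request format is an assumption for illustration.

```python
# Sketch of log-less metadata recovery: after an MDS crash, the metadata
# state is rebuilt by replaying the requests each client backed up in its
# own memory. The request tuple format is an illustrative assumption.
def replay_recovery(client_backups):
    """Rebuild MDS metadata by replaying every client's cached requests."""
    mds_state = {}
    for backup in client_backups:          # one backup list per client
        for op, key, value in backup:      # e.g. ("set", path, attrs)
            if op == "set":
                mds_state[key] = value
            elif op == "delete":
                mds_state.pop(key, None)
    return mds_state
```

A real implementation would also need to order requests consistently across clients before replay.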
TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanhope, C; Liang, J; Drake, D
2016-06-15
Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta's Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40 ms. Five VMAT plans [4 H&N, 1 pulsed brain], comprising 2 arcs each, were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4 mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). By comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, significantly less noisy.
Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2 mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following these reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.
Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.
Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi
2018-04-01
The log file-based method cannot detect dosimetric changes caused by linac component miscalibration, because log files are insensitive to such miscalibration. The purpose of this study was to quantify the dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases participated in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, regarding the planning target volume (PTV), the change from the TPS dose to the miscalibration-simulated log file dose in Dmean was 0.9 Gy, and the change in tumor control probability was 1.4%. As for organs at risk (OARs), the change in Dmean was <0.7 Gy and in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by Dmean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For PTV and OARs, the log file-based estimate of patient dose using double-arc VMAT has accuracy comparable to that obtained using single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, S; Ho, M; Chen, C
Purpose: The use of log files to perform patient-specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, before direct measurements; this approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (Raystation). While the dual Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA in an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter, and will be investigated further.
Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi
2017-10-01
A log file-based method cannot detect dosimetric changes due to linac component miscalibration because log files are insensitive to miscalibration. Herein, the clinical impacts of dosimetric changes on a log file-based method were determined. Five head-and-neck and five prostate plans were applied. Miscalibration-simulated log files were generated by inducing a linac component miscalibration into the log file. Miscalibration magnitudes for leaf, gantry, and collimator at the general tolerance level were ±0.5 mm, ±1°, and ±1°, respectively, and at a tighter tolerance level achievable on a current linac were ±0.3 mm, ±0.5°, and ±0.5°, respectively. Re-calculations were performed on patient anatomy using log file data. Changes in tumor control probability/normal tissue complication probability from the treatment planning system dose to the re-calculated dose at the general tolerance level were 1.8% on the planning target volume (PTV) and 2.4% on organs at risk (OARs) in both plans. These changes at the tighter tolerance level improved to 1.0% on the PTV and to 1.5% on OARs, with a statistically significant difference. We determined the clinical impacts of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration, and found that the tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.
Poels, K; Depuydt, T; Verellen, D; De Ridder, M
2012-06-01
Purpose: To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) from the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect a systematic tracking error down to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling between gimbals and gantry rotation during dynamic tracking. Submillimetric agreement between the clinical complementary tracking error measurements was found.
Redundant monitoring, combining the internal gimbals log file and x-ray verification images with complementary independent cine EPID images, was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Madiraju, Praveen; Zhang, Yanqing
2002-03-01
When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and can help system administrators improve system performance. Web logs provide invaluable help in creating adaptive web sites and in network traffic analysis. This paper presents the design and implementation of a web usage mining agent for digging into web log files.
Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.
2013-04-04
A database of borehole geophysical logs and other types of data files was compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.
Who Goes There? Measuring Library Web Site Usage.
ERIC Educational Resources Information Center
Bauer, Kathleen
2000-01-01
Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)
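The "common log file" the entry refers to is the NCSA Common Log Format most Web servers can emit; one line of it parses as below. The returned field names are an illustrative choice.

```python
# Sketch of parsing one line of the NCSA Common Log Format, the "common
# log file" distinguished in the entry above from referrer and agent logs.
import re

CLF = re.compile(r'(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)')

def parse_clf(line):
    """Split a Common Log Format line into its named fields."""
    host, ident, user, ts, request, status, size = CLF.match(line).groups()
    return {"host": host, "user": user, "timestamp": ts, "request": request,
            "status": int(status), "bytes": None if size == "-" else int(size)}
```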
Replication in the Harp File System
Liskov, Barbara; Ghemawat, Sanjay; Gruber, Robert; Johnson, Paul; Shrira, Liba; Williams, Michael
1991-07-01
© Massachusetts Institute of Technology. (To appear in the Proceedings of the Thirteenth ACM Symposium on Operating Systems Principles.)
SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log-Based MLC Position
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neal, B; Ahmed, M; Siebers, J
2016-06-15
Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm, as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images, and log files were analyzed for the treatment in question, the prior day's treatment, and the daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial to its planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from actual positions by >1 mm, and therefore cannot be taken to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from planned MLC positions. Work was supported in part by Varian Medical Systems.
Monte Carlo based, patient-specific RapidArc QA using Linac log files.
Teke, Tony; Bergman, Alanah M; Kwa, William; Gill, Bradford; Duzenli, Cheryl; Popescu, I Antoniu
2010-01-01
A Monte Carlo (MC) based QA process to validate the dynamic beam delivery accuracy for Varian RapidArc (Varian Medical Systems, Palo Alto, CA) using Linac delivery log files (DynaLog) is presented. Using DynaLog file analysis and MC simulations, the goals of this article are to (a) confirm that adequate sampling is used in the RapidArc optimization algorithm (177 static gantry angles) and (b) assess the physical machine performance [gantry angle and monitor unit (MU) delivery accuracy]. Ten clinically acceptable RapidArc treatment plans were generated for various tumor sites and delivered to a water-equivalent cylindrical phantom on the treatment unit. Three Monte Carlo simulations were performed to calculate dose to the CT phantom image set: (a) one using a series of static gantry angles defined by 177 control points with treatment planning system (TPS) MLC control files (planning files), (b) one using continuous gantry rotation with TPS generated MLC control files, and (c) one using continuous gantry rotation with actual Linac delivery log files. Monte Carlo simulated dose distributions are compared to both ionization chamber point measurements and RapidArc TPS calculated doses. The 3D dose distributions were compared using a 3D gamma-factor analysis, employing a 3%/3 mm distance-to-agreement criterion. The dose difference between MC simulations, TPS, and ionization chamber point measurements was less than 2.1%. For all plans, the MC calculated 3D dose distributions agreed well with the TPS calculated doses (gamma-factor values were less than 1 for more than 95% of the points considered). Machine performance QA was supplemented with an extensive DynaLog file analysis, which showed that leaf position errors were less than 1 mm 94% of the time and that there were no leaf errors greater than 2.5 mm. The mean standard deviations in MU and gantry angle were 0.052 MU and 0.355 degrees, respectively, for the ten cases analyzed.
The accuracy and flexibility of the Monte Carlo based RapidArc QA system were demonstrated. Good machine performance and accurate dose distribution delivery of RapidArc plans were observed. The sampling used in the TPS optimization algorithm was found to be adequate.
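The per-point gamma computation behind the 3%/3 mm comparison can be sketched in one dimension (the study applies it to full 3D dose distributions; the function and variable names here are illustrative, not the authors' code):

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, coords, dose_tol=0.03, dist_tol=3.0):
    """Simple global gamma index for 1D dose profiles.

    ref_dose, eval_dose: doses on the same coordinate grid `coords` (mm).
    dose_tol: fractional dose-difference criterion (3% of the reference max).
    dist_tol: distance-to-agreement criterion in mm.
    """
    dmax = ref_dose.max()
    gamma = np.empty_like(ref_dose, dtype=float)
    for i, (r, x) in enumerate(zip(ref_dose, coords)):
        dd = (eval_dose - r) / (dose_tol * dmax)   # normalized dose difference
        dx = (coords - x) / dist_tol               # normalized distance
        gamma[i] = np.sqrt(dd ** 2 + dx ** 2).min()  # min over evaluated points
    return gamma
```

The passing rate quoted in the abstract corresponds to the fraction of points with gamma less than or equal to 1.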
Catching errors with patient-specific pretreatment machine log file analysis.
Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa
2013-01-01
A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
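The fluence-map reconstruction from delivered leaf positions can be illustrated with a minimal 1D sketch (a hypothetical simplification of what a tool like "Dynalog QA" does; the clinical comparison uses full 2D fluence maps per beam):

```python
import numpy as np

def fluence_profile(leaf_a, leaf_b, mu_per_cp, grid):
    """Accumulate a 1D fluence profile from one MLC leaf pair.

    leaf_a, leaf_b: per-control-point positions (mm) of the opposing leaves.
    mu_per_cp: monitor units delivered during each control point.
    grid: 1D positions (mm) at which fluence is scored.
    A grid point accumulates MU whenever it lies inside the open leaf gap.
    """
    f = np.zeros_like(grid, dtype=float)
    for a, b, mu in zip(leaf_a, leaf_b, mu_per_cp):
        f += mu * ((grid > a) & (grid < b))
    return f
```

Planned and delivered maps built this way from the plan file and the log file, respectively, can then be differenced or gamma-compared.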
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G.
2014-09-01
Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.
Model Analyst’s Toolkit User Guide, Version 7.1.0
2015-08-01
Help > About) Environment details ( operating system ) metronome.log file, located in your MAT 7.1.0 installation folder Any log file that...requirements to run the Model Analyst’s Toolkit: Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed...application icon on your desktop Create a Quick Launch icon – Creates a MAT application icon on the taskbar for operating systems released
NASA Astrophysics Data System (ADS)
Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko
2017-02-01
A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.
20 CFR 401.85 - Exempt systems.
Code of Federal Regulations, 2011 CFR
2011-04-01
... subsection (k)(2) of the Privacy Act: (A) The General Criminal Investigation Files, SSA; (B) The Criminal Investigations File, SSA; and, (C) The Program Integrity Case Files, SSA. (D) Civil and Administrative Investigative Files of the Inspector General, SSA/OIG. (E) Complaint Files and Log. SSA/OGC. (iii) Pursuant to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunter, Dan; Lee, Jason; Stoufer, Martin
2003-03-28
The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and is thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., a GridFTP server).
The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
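The instrumentation style described above, timestamped key=value event lines emitted at critical code points, can be sketched as follows. The field names mimic the ULM flavor the toolkit mentions but are not NetLogger's actual API:

```python
import time

def format_event(program, event, **fields):
    """Build one ULM-style 'KEY=value' event line with a timestamp.

    Illustrative only: NetLogger's real API and field set differ; this just
    shows the shape of an application-level event record.
    """
    parts = [f"DATE={time.strftime('%Y%m%d%H%M%S')}",
             f"PROG={program}",
             f"NL.EVNT={event}"]
    parts += [f"{k.upper()}={v}" for k, v in fields.items()]
    return " ".join(parts)

# An instrumented component would write such a line at each critical point;
# a collector later correlates events across components by timestamp.
```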
SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMarco, J; McCloskey, S; Low, D
Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001 mm and 0.066±0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at time of image acquisition and during treatment.
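The maximum-RMS leaf-error statistic quoted in the results can be computed from the expected/actual position arrays parsed out of a trajectory log. A minimal sketch (the (control points × leaves) array layout is an assumption, not the binary file's format):

```python
import numpy as np

def max_rms_leaf_error(expected, actual):
    """Per-leaf RMS of (expected - actual) over all control points, then the
    maximum over the leaves in a bank.

    expected, actual: arrays of shape (n_control_points, n_leaves), in mm.
    """
    rms_per_leaf = np.sqrt(np.mean((expected - actual) ** 2, axis=0))
    return rms_per_leaf.max()
```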
SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, H; Jacobson, T; Gu, X
2015-06-15
Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.
Real-Time Population Health Detector
2004-11-01
military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA...via the sequence of one-step ahead forecast errors from the Kalman recursions: $e_t = y_t - H_t \mu_{t|t-1}$. The log-likelihood then follows by treating the... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files GDAIS received historic CHCS data from all
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Z; Vijayan, S; Rana, V
2015-06-15
Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D-mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the dose from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
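The grouping step that reduces the number of Monte Carlo runs can be sketched as a dictionary keyed on the technique and geometry parameters, summing the tube loading for each group. The field names below are hypothetical, not the DTS log format:

```python
from collections import defaultdict

def group_exposures(events, keys=("kvp", "angle", "filtration")):
    """Group logged x-ray pulses with identical parameters so one Monte Carlo
    run can cover each whole group.

    events: iterable of dicts, one per logged pulse, each with an 'mas' value.
    Returns {parameter-tuple: summed mAs}.
    """
    groups = defaultdict(float)
    for ev in events:
        groups[tuple(ev[k] for k in keys)] += ev["mas"]
    return dict(groups)
```

Each resulting group maps to a single PCXMC run whose dose is scaled by the summed mAs.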
17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2010 CFR
2010-04-01
... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...
17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2011 CFR
2011-04-01
... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...
Crangle, Robert D.
2007-01-01
Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
Workload Characterization and Performance Implications of Large-Scale Blog Servers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho
With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) The transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) User access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
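Fitting the log-normal component of the reported size model reduces to taking statistics of the log-transformed sizes. A minimal maximum-likelihood sketch (the function name is mine, not the paper's):

```python
import math

def lognormal_mle(sizes):
    """Maximum-likelihood (mu, sigma) of a log-normal fit to transfer sizes,
    i.e. the mean and standard deviation of log(size)."""
    logs = [math.log(s) for s in sizes]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)
    return mu, sigma
```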
Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir
2015-11-01
The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
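The decision rule implied by the validated thresholds can be written as a two-condition test. The abstract does not state the units of the rate of power increase, so they are assumed here, and this sketch is not the authors' algorithm:

```python
def tpa_likely_successful(pct_expected_power, power_increase_rate):
    """Log-file rule from the abstract: medical (tPA) therapy succeeded in
    roughly 80% of cases when percent-of-expected power was below 200% and
    the rate of power increase was below 1.25 (units assumed).
    """
    return pct_expected_power < 200.0 and power_increase_rate < 1.25
```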
A clinically observed discrepancy between image-based and log-based MLC positions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal
2016-06-15
Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf to be consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured and log-files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log-file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual position by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions against the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trust log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.
Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior
2014-12-01
Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.
An analysis of image storage systems for scalable training of deep neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, Seung-Hwan; Young, Steven R; Patton, Robert M
This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
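The core advantage of the key-value back-ends, one sequential file plus an index instead of millions of small per-image files, can be sketched without LevelDB or LMDB themselves (a simplified illustration of the idea, not either library):

```python
import tempfile

def pack_records(records, path):
    """Append (key, blob) records to one file; return {key: (offset, length)}.

    Reading then needs a single open file and a seek, instead of a filesystem
    lookup and open per image.
    """
    index = {}
    with open(path, "wb") as f:
        for key, blob in records:
            index[key] = (f.tell(), len(blob))
            f.write(blob)
    return index

def read_record(path, index, key):
    """Fetch one record back out of the packed file by seeking to its offset."""
    offset, length = index[key]
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```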
SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kabat, C; Defoor, D; Alexandrian, A
2016-06-15
Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components’ integrity is paramount. ElektaLog files are generated every 40 milliseconds, which can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, it was the aim of the study to evaluate if ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%–2mm criteria for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized to allow for reliable analysis of system accuracy and performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph
Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2011 CFR
2011-04-01
... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...
17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2010 CFR
2010-04-01
... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...
Quantification of residual dose estimation error on log file-based patient dose calculation.
Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi
2016-05-01
The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5 mm in opposite directions and systematic leaf shifts: ±1.0 mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5 mm, the quantified residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between non-modified and modified log file doses per leaf gap error, are 1.32±0.27% and 0.82±0.17 Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14 Gy, and 0.45±0.08 Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determine the residual dose estimation errors for VMAT delivery using the log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
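The gEUD metric used in the evaluation has a standard closed form, (1/N Σ dᵢᵃ)^(1/a); a direct implementation:

```python
import numpy as np

def gEUD(doses, a):
    """Generalized equivalent uniform dose: (mean(d_i ** a)) ** (1/a).

    Large positive `a` emphasizes hot spots (serial organs such as spinal
    cord); a = 1 reduces to the mean dose (parallel organs).
    """
    d = np.asarray(doses, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))
```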
User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge
Koltun, G.F.; Gray, John R.; McElhone, T.J.
1994-01-01
Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from (1) digitized suspended-sediment-concentration traces, (2) linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and (3) nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
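The log-transformed linear interpolation of option (2) amounts to interpolating in log space between two unequal-interval samples and transforming back, which yields the geometric rather than arithmetic mean at the midpoint. A sketch under that assumption (the two-point form and names are illustrative, not SEDCALC code):

```python
import math

def log_interp(t, t0, c0, t1, c1):
    """Interpolate a concentration at time t between samples (t0, c0) and
    (t1, c1), linearly in log space, then transform back to concentration."""
    f = (t - t0) / (t1 - t0)  # fractional position between the two samples
    return math.exp((1 - f) * math.log(c0) + f * math.log(c1))
```

At the midpoint between concentrations of 10 and 1000 this gives 100 (the geometric mean), whereas plain linear interpolation would give 505.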
Building analytical platform with Big Data solutions for log files of PanDA infrastructure
NASA Astrophysics Data System (ADS)
Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maeno, T.; Padolski, S. V.
2018-05-01
The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat collects data from logs. Logstash processes the data and exports it to ElasticSearch. ES is responsible for centralized data storage. Data accumulated in ES can be viewed using Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks and the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
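The Filebeat/Logstash stage essentially turns raw log lines into structured documents for Elasticsearch. A hedged sketch of that transformation in Python; the line format and field names here are hypothetical, not PanDA's actual schema:

```python
import re

# Hypothetical log-line layout: "<date> <time> <LEVEL> <message>".
LOG_RE = re.compile(r"(?P<timestamp>\S+ \S+) (?P<level>\w+) (?P<message>.*)")

def to_es_doc(line):
    """Turn one raw log line into the kind of structured document that
    Logstash would ship to Elasticsearch; unparseable lines are tagged."""
    m = LOG_RE.match(line)
    if m is None:
        return {"message": line, "tags": ["_parsefailure"]}
    return m.groupdict()
```

Once documents carry separate `timestamp`, `level`, and `message` fields, Kibana-style dashboards can filter and aggregate on them instead of grepping flat text.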
A Scientific Data Provenance Harvester for Distributed Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.
Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format inspired in part by W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as from other scientific applications that log provenance-related information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiinoki, T; Hanazawa, H; Shibuya, K
Purpose: A respiratory gating system combining TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, a fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. For this patient, log files of the three-dimensional coordinates of the fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E_log). The fiducial marker in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm. The differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating using the RTRT showed mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.
17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2011 CFR
2011-04-01
...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and at www.fdsys.gov. ...
17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2010 CFR
2010-04-01
...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and on GPO Access. ...
Comparing image search behaviour in the ARRS GoldMiner search engine and a clinical PACS/RIS.
De-Arteaga, Maria; Eggel, Ivan; Do, Bao; Rubin, Daniel; Kahn, Charles E; Müller, Henning
2015-08-01
Information search has changed the way we manage knowledge and the ubiquity of information access has made search a frequent activity, whether via Internet search engines or increasingly via mobile devices. Medical information search is in this respect no different and much research has been devoted to analyzing the way in which physicians aim to access information. Medical image search is a much smaller domain but has gained much attention as it has different characteristics than search for text documents. While web search log files have been analysed many times to better understand user behaviour, the log files of hospital internal systems for search in a PACS/RIS (Picture Archival and Communication System, Radiology Information System) have rarely been analysed. Such a comparison between a hospital PACS/RIS search and a web system for searching images of the biomedical literature is the goal of this paper. Objectives are to identify similarities and differences in search behaviour of the two systems, which could then be used to optimize existing systems and build new search engines. Log files of the ARRS GoldMiner medical image search engine (freely accessible on the Internet) containing 222,005 queries, and log files of Stanford's internal PACS/RIS search called radTF containing 18,068 queries were analysed. Each query was preprocessed and all query terms were mapped to the RadLex (Radiology Lexicon) terminology, a comprehensive lexicon of radiology terms created and maintained by the Radiological Society of North America, so the semantic content in the queries and the links between terms could be analysed, and synonyms for the same concept could be detected. RadLex was mainly created for use in radiology reports, to aid structured reporting and the preparation of educational material (Langlotz, 2006) [1]. 
In standard medical vocabularies such as MeSH (Medical Subject Headings) and UMLS (Unified Medical Language System), specific radiology terms are often underrepresented; therefore, RadLex was considered the best option for this task. The results show a surprising similarity between the usage behaviour in the two systems, but several subtle differences can also be noted. The average number of terms per query is 2.21 for GoldMiner and 2.07 for radTF, and the RadLex axes used (anatomy, pathology, findings, …) have almost the same distribution, with clinical findings being the most frequent and the anatomical entity the second; also, combinations of RadLex axes are extremely similar between the two systems. Differences include a longer session length in radTF than in GoldMiner (3.4 and 1.9 queries per session on average). Several frequent search terms overlap, but some strong differences exist in the details. In radTF the term "normal" is frequent, whereas in GoldMiner it is not. This makes intuitive sense, as in the literature normal cases are rarely described, whereas in clinical work the comparison with normal cases is often a first step. The general similarity in many points is likely due to the fact that users of the two systems are influenced by their daily behaviour in using standard web search engines and follow this behaviour in their professional search. This means that many results and insights gained from standard web search can likely be transferred to more specialized search systems. Still, specialized log files can be used to find out more about reformulations and the detailed strategies users employ to find the right content. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owen, R. K.
2007-04-04
A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
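PBS accounting logs conventionally store one semicolon-delimited record per line ("datetime;type;id;key=value ..."), which is what a module like this has to parse. A sketch of that parsing in Python rather than Perl; the field handling is a simplified assumption, not the module's actual interface:

```python
def parse_pbs_record(line):
    """Parse one PBS accounting-log record of the conventional form
    'datetime;record_type;job_id;key=value key=value ...'."""
    ts, rtype, jobid, rest = line.split(";", 3)
    # Remaining text is space-separated key=value attribute pairs.
    attrs = dict(kv.split("=", 1) for kv in rest.split() if "=" in kv)
    return {"time": ts, "type": rtype, "id": jobid, "attrs": attrs}
```

Filtering on date-time or record type, as the module advertises, then reduces to a comprehension over the parsed dictionaries.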
Patterns of usage for a Web-based clinical information system.
Chen, Elizabeth S; Cimino, James J
2004-01-01
Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
Sawmill: A Logging File System for a High-Performance RAID Disk Array
1995-01-01
from limiting disk performance, new controller architectures connect the disks directly to the network so that data movement bypasses the file server...These developments raise two questions for file systems: how to get the best performance from a RAID, and how to use such a controller architecture ...the RAID-II storage system; this architecture provides a fast data path that moves data rapidly among the disks, high-speed controller memory, and the
Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems
ERIC Educational Resources Information Center
Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey
2009-01-01
This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…
Tsukamoto, Takafumi; Yasunaga, Takuo
2014-11-01
Eos (Extensible object-oriented system) is one of the powerful applications for image processing of electron micrographs. Eos usually works with only a character user interface (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be expert at image processing of electron micrographs and to have some knowledge of computer science as well. However, not everyone who needs Eos is an expert with a CUI. We therefore extended Eos into an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only that Eos gains a GUI, but also that Eos can work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and each command has its own usage. Since the beginning of development, Eos has managed its user interfaces through an interface definition file called "OptionControlFile", written in CSV (Comma-Separated Value) format; each command has an "OptionControlFile", which holds the information needed to generate its interface and usage text. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with the options unique to each command, and perform image analysis. 
Problems remain concerning the image file format for visualization and the workspace for analysis: the image file format information is useful to check whether an input/output file is correct, and we also need to provide a common workspace for analysis because the client is physically separated from the server. We solved the file format problem by extending the rules of the OptionControlFile of Eos. Furthermore, to solve the workspace problem, we have developed two types of systems. The first system uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files with the GUI in the web browser. The second system employs PIONE (Process-rule for Input/Output Negotiation Environment), a platform we are developing that works in heterogeneous distributed environments. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write a PIONE rule definition, which defines a workflow of image processing. PIONE runs each image-processing step on suitable computers, following the defined rules. PIONE supports interactive manipulation, and the user is able to try a command with various setting values. In this situation, we contribute auto-generation of a GUI for a PIONE workflow. As an advanced function, we have developed a module to log user actions. The logs include information such as the setting values used in image processing, the sequence of commands, and so on. Used effectively, these logs offer many advantages. For example, when an expert discovers some image-processing know-how, other users can share the logs containing it, and by analyzing the logs we may obtain recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have developed the system infrastructure as well. © The Author 2014. 
Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
REPHLEX II: An information management system for the ARS Water Data Base
NASA Astrophysics Data System (ADS)
Thurman, Jane L.
1993-08-01
The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-15
...) screened intake structures; (3) a concrete powerhouse containing three turbine-generator units with a total... structures; (3) a concrete powerhouse containing three turbine-generator units with a total installed... by a log boom; (2) screened intake structures; (3) a concrete powerhouse containing three turbine...
17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2010 CFR
2010-04-01
... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...
A Clustering Methodology of Web Log Data for Learning Management Systems
ERIC Educational Resources Information Center
Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros
2012-01-01
Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…
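A Markov-based analysis of LMS logs, as referenced in this abstract, typically starts from first-order transition probabilities between pages visited within student sessions. A small sketch under that assumption (illustrative only, not the authors' methodology):

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate first-order Markov transition probabilities from
    page-visit sessions, each session being an ordered list of pages."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sessions:
        for a, b in zip(seq, seq[1:]):  # consecutive page pairs
            counts[a][b] += 1
    # Normalize each row of counts into probabilities.
    return {a: {b: c / sum(row.values()) for b, c in row.items()}
            for a, row in counts.items()}
```

Rows with a dominant transition reveal the navigation paths instructors could use to assess how a course is actually being traversed.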
SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, C; Mason, B; Kirsner, S
2015-06-15
Purpose: Ion chamber and film (ICAF) measurement is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown to be an alternative to measurement-based QA. In this study, we delivered VMAT plans with and without errors to determine whether ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with introduced delivery errors were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture did not move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria to evaluate the plans were an ion chamber difference of less than 5% and 90% of film pixels passing the 3mm/3% gamma analysis (GA). For the log file analysis, the criteria were 90% of voxels passing the 3mm/3% 3D GA and beam parameters matching the plan. Results: The two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetry criteria in 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and the plan. The 8 plans that did not meet the criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors.
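The 3mm/3% gamma analysis referenced throughout this abstract combines a dose-difference criterion and a distance-to-agreement criterion into one pass/fail index per point. A deliberately simplified 1D global-gamma sketch (real QA tools work on 2D/3D grids with interpolation; this is illustrative only):

```python
def gamma_1d(ref, evl, spacing, dta=3.0, dd=0.03):
    """Simplified 1D global gamma index (default 3 mm / 3%).

    ref, evl: reference and evaluated dose samples on the same grid;
    spacing: grid spacing in mm. Returns a gamma value per reference point
    (gamma <= 1 counts as passing).
    """
    norm = max(ref)  # global normalization dose
    gammas = []
    for i, r in enumerate(ref):
        best = float("inf")
        for j, e in enumerate(evl):
            dist = (i - j) * spacing          # spatial term (mm)
            dose = (e - r) / (dd * norm)      # dose term (fraction of dd)
            best = min(best, (dist / dta) ** 2 + dose ** 2)
        gammas.append(best ** 0.5)
    return gammas
```

The "90% of pixels/voxels pass" criteria in the abstract are then just the fraction of gamma values not exceeding 1.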
Teaching an Old Log New Tricks with Machine Learning.
Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl
2014-03-01
To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights: no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.
Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana
DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.
2014-01-01
This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.
17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.
Code of Federal Regulations, 2011 CFR
2011-04-01
... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...
ERIC Educational Resources Information Center
Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.
2013-01-01
We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…
Analysis of the request patterns to the NSSDC on-line archive
NASA Technical Reports Server (NTRS)
Johnson, Theodore
1994-01-01
NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth of the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of their service, they have maintained log files which record all accesses to the archive. In this report, we present an analysis of the NDADS log files. We analyze the log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
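Caching questions like those raised here are often explored by replaying the archive's request trace through a simulated cache. A hedged sketch of an LRU replay (illustrative, not the NSSDC analysis code):

```python
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Replay a file-request trace through a simulated LRU cache of the
    given capacity and return the fraction of requests served from cache."""
    cache = OrderedDict()
    hits = 0
    for f in trace:
        if f in cache:
            hits += 1
            cache.move_to_end(f)  # mark as most recently used
        else:
            cache[f] = True
            if len(cache) > capacity:
                cache.popitem(last=False)  # evict least recently used
    return hits / len(trace)
```

Sweeping `capacity` over a real log yields the hit-rate curve that tells an archive operator how much staging disk is worth buying.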
ERIC Educational Resources Information Center
Lee, Young-Jin
2015-01-01
This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted the success and failure of students'…
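A logistic regression over features extracted from log files can be sketched in a few lines; the feature, labels, and training loop below are illustrative assumptions, not the study's actual model or data:

```python
import math

def train_logistic(xs, ys, epochs=2000, lr=0.5):
    """Tiny logistic-regression trainer using stochastic gradient descent.
    xs: feature vectors (e.g. derived from tutor log files); ys: 0/1 labels."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted success probability
            g = p - y                         # gradient of log loss w.r.t. z
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Predicted probability of success for one feature vector."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

In practice the features would be things like hint counts or time-on-step pulled from the logs, and a held-out split would be used to validate the model.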
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex
2012-01-01
LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
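A rule in the spirit of LogScope pairs a triggering event with a required follow-up event later in the log. The monitor below is a deliberately minimal sketch of that idea (event names and the rule shape are hypothetical, and it has none of LogScope's data-parameter handling):

```python
def check_log(events, rules):
    """Check a finished event log against rules of the form
    (trigger, required_followup): after each trigger, the follow-up must
    appear later in the log. Returns the list of unsatisfied follow-ups."""
    pending = []
    for ev in events:
        # An arriving event satisfies every pending obligation it matches.
        pending = [r for r in pending if r != ev]
        for trig, follow in rules:
            if ev == trig:
                pending.append(follow)
    return pending  # anything still pending is a requirement violation
```

A real rule language would also carry data (e.g. matching a dispatch and completion by command ID), which is exactly the expressiveness the abstract credits LogScope with.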
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, R; Kamima, T; Tachibana, H
2016-06-15
Purpose: To investigate the effect of using trajectory log files from the linear accelerator for Clarkson-based independent dose verification in IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive, with 0.2 mm of systematic error producing a 0.7% dose deviation on average. The MLC random errors did not affect the dose error. Conclusion: The use of trajectory log files, which include actual information on MLC location, gantry angle, etc., should be more effective for independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. From the view of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED)
A Prototype Implementation of a Time Interval File Protection System in Linux
2006-09-01
when a user logs in, the /etc/passwd file is read by the system to get the user's home directory. The user's login shell then changes the directory...and don. • Users can be added with the command: # useradd -m <username> • Set the password by: # passwd <username> • Make a copy of the
PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.
Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza
2014-12-01
The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions, such as nucleic acids, water, ions, and drug molecules, which can therefore also be described in the PDB format and have been deposited in the PDB database. A PDB file is machine-generated and not in a human-readable format; a computational tool is needed to interpret it. The objective of our present study is to develop free online software for the retrieval, visualization and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., the information in the PDB file is converted into readable sentences. It displays all possible information from a PDB file, including the 3D structure of that file. Programming languages and scripting languages such as Perl, CSS, JavaScript, Ajax, and HTML have been used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no log-in required.
Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs
Chen, You; Malin, Bradley
2014-01-01
Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
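The community-deviation idea can be illustrated with a much-simplified sketch. This is an illustrative stand-in, not the CADS algorithm: here a user's anomaly score is simply one minus their mean cosine similarity to the other users' access-count vectors, so a user who accesses records unlike any community scores near 1.

```python
import math

def cosine(u, v):
    """Cosine similarity between two access-count vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def anomaly_scores(access_matrix):
    """Score each user (row = per-record access counts) by how far they sit
    from everyone else: 1 - mean pairwise similarity."""
    scores = []
    for i, u in enumerate(access_matrix):
        sims = [cosine(u, v) for j, v in enumerate(access_matrix) if j != i]
        scores.append(1.0 - sum(sims) / len(sims))
    return scores
```

CADS itself infers explicit community structures and applies a formal statistical deviation model; this sketch only conveys why low community affinity surfaces as a high score.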
[Investigation of Elekta linac characteristics for VMAT].
Luo, Guangwen; Zhang, Kunyi
2012-01-01
The aim of this study was to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from the log files. Results showed that the machine switched among six discrete dose rates, and that the gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from the log files. Quality assurance procedures should be carried out for the VMAT-related parameters.
Creative Analytics of Mission Ops Event Messages
NASA Technical Reports Server (NTRS)
Smith, Dan
2017-01-01
Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data dropouts or system failures, and much, much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: Industry standard for log messages. The Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages; the format is based on work at NASA GSFC. Open system architectures. The DoD, NASA, and others are moving towards common open system architectures for mission ground data systems, based on work at NASA GSFC with the full support of the commercial product industry and major integration contractors. Text analytics. A specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources.
This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format, along with display and analytics tools for in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts
Code of Federal Regulations, 2011 CFR
2011-10-01
... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...
46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts
Code of Federal Regulations, 2010 CFR
2010-10-01
... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...
Ambrozy, C; Kolar, N A; Rattay, F
2010-01-01
To log board angle values during balance training, a measurement system had to be developed. This study provides data for a balance study using a smartcard. Data acquisition is automatic. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data-bus protocol and an E2PROM memory is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard reading program was designed with Microsoft® Visual C#. A training-plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved to a log file for exact documentation. This system makes study development easy and time-saving.
Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David
2014-07-01
To create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communication systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface were inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). One hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm, with efforts made to maximize its ease of use and to include characteristics inspired by social networking Web sites that give the system additional functionality, such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets
NASA Astrophysics Data System (ADS)
Özkaya, Sait Ismail
1996-02-01
An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line by line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy-disk format for the storage and transfer of log data as text files, proposed by the Canadian Well Logging Society. The present EXCEL macro decodes the different sections of a LAS file, separates the fields, and places them into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by the LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
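The line-by-line decoding the macro performs can be sketched in Python. This is a minimal illustrative LAS 2.0 reader handling the '.', space, and ':' delimiters mentioned above; the return layout is a simplification, not the macro's actual logic.

```python
def parse_las(text):
    """Minimal LAS 2.0 reader: splits header lines on the '.', space and ':'
    delimiters, and parses the ~ASCII data section into rows of floats."""
    header, data, in_data = {}, [], False
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("~"):
            in_data = line[1].upper() == "A"   # ~ASCII data section is last
            continue
        if in_data:
            data.append([float(tok) for tok in line.split()])
        else:
            mnem, rest = line.split(".", 1)          # delimiter 1: '.'
            body, _, desc = rest.partition(":")      # delimiter 3: ':'
            unit, _, value = body.partition(" ")     # delimiter 2: first space
            header[mnem.strip()] = (unit.strip(), value.strip(), desc.strip())
    return header, data
```

Like the macro, this sketch assumes the data section comes last and performs no validation of mandatory sections.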
75 FR 27051 - Privacy Act of 1974: System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-13
... address and appears below: DOT/FMCSA 004 SYSTEM NAME: National Consumer Complaint Database (NCCDB.... A system, database, and procedures for filing and logging consumer complaints relating to household... are stored in an automated system operated and maintained at the Volpe National Transportation Systems...
75 FR 76426 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-08
..., access control lists, file system permissions, intrusion detection and prevention systems and log..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN...
Visual behavior characterization for intrusion and misuse detection
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah
2001-05-01
As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for visually analyzing network and computer log information based on the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing users' behavior, however, is much more adaptable and difficult to counteract.
Online Courses Assessment through Measuring and Archetyping of Usage Data
ERIC Educational Resources Information Center
Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros
2016-01-01
Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data which usually remain unexploited. A new methodology is proposed in order to analyse these data on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…
Automating linear accelerator quality assurance.
Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M
2015-10-01
The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold-off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm.
The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
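The threshold check at the core of such a log-file QA tool can be sketched as follows. This is a hedged illustration: the function name and report fields are invented for this example, not taken from the consortium's software.

```python
def max_deviation_report(expected, actual, tolerance_mm):
    """Compare expected vs. actual axis positions (e.g. one MLC leaf over a
    delivery, in mm) from a trajectory log, report the worst deviation as a
    percentage of the published tolerance, and flag any excess."""
    deviations = [abs(e - a) for e, a in zip(expected, actual)]
    worst = max(deviations)
    return {
        "max_dev_mm": worst,
        "pct_of_tolerance": 100.0 * worst / tolerance_mm,
        "exceeds": worst > tolerance_mm,
    }
```

Reporting deviations as a percentage of tolerance, as the study does, lets different centers compare results on a common scale.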
Sujansky, Walter; Wilson, Tom
2015-04-01
This report describes a grant-funded project to explore the use of DIRECT secure messaging for the electronic delivery of laboratory test results to outpatient physicians and electronic health record systems. The project seeks to leverage the inherent attributes of DIRECT secure messaging and electronic provider directories to overcome certain barriers to the delivery of lab test results in the outpatient setting. The described system enables laboratories that generate test results as HL7 messages to deliver these results as structured or unstructured documents attached to DIRECT secure messages. The system automatically analyzes generated HL7 messages and consults an electronic provider directory to determine the appropriate DIRECT address and delivery format for each indicated recipient. The system also enables lab results delivered to providers as structured attachments to be consumed by HL7 interface engines and incorporated into electronic health record systems. Lab results delivered as unstructured attachments may be printed or incorporated into patient records as PDF files. The system receives and logs acknowledgement messages to document the status of each transmitted lab result, and a graphical interface allows searching and review of this logged information. The described system is a fully implemented prototype that has been tested in a laboratory setting. Although this approach is promising, further work is required to pilot test the system in production settings with clinical laboratories and outpatient provider organizations. Copyright © 2015 Elsevier Inc. All rights reserved.
Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar
2015-12-01
To systematically evaluate machine-specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while four linacs failed by consistently terminating the delivery. The mean leaf error (±1 SD) was 0.3±0.2 mm for all linacs. Large MLC maximum errors up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated generally accurate plan reproducibility, with γ(mean)(±1 SD)=0.4±0.2 for 1 mm/1%. However, single-control-point analysis revealed larger deviations, which agreed well with the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
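The per-control-point gamma comparison mentioned above can be illustrated with a simplified one-dimensional global gamma sketch (1%/1 mm as in the abstract, doses normalized to the maximum). This is a brute-force teaching example, not an optimized clinical implementation.

```python
def gamma_1d(ref, meas, positions, dose_tol=0.01, dist_tol=1.0):
    """Simplified 1-D global gamma index: for each reference point, search
    all measured points for the minimum combined dose/distance deviation.
    Doses are assumed normalized so dose_tol=0.01 means 1% of maximum."""
    gammas = []
    for xr, dr in zip(positions, ref):
        best = min(
            ((dm - dr) / dose_tol) ** 2 + ((xm - xr) / dist_tol) ** 2
            for xm, dm in zip(positions, meas)
        )
        gammas.append(best ** 0.5)
    return gammas
```

A point passes when its gamma value is at most 1; a mean gamma near 0.4, as reported, indicates comfortable agreement on average even if individual control points deviate more.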
A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.
Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan
2014-03-06
In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. 
This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).
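The fluence-reconstruction step that the DynaLog-based method performs per beam can be sketched for a single leaf pair. This is an illustrative simplification under stated assumptions: a real reconstruction handles all leaf pairs, leaf-end transmission, and a 2-D fluence grid, none of which appear here.

```python
def accumulate_fluence(samples, grid_mm):
    """Build a per-leaf-pair fluence profile from log samples of
    (bank_A_mm, bank_B_mm, delta_MU): every grid point lying between the
    two leaf tips accumulates that sample's delivered monitor units."""
    fluence = [0.0] * len(grid_mm)
    for a, b, delta_mu in samples:
        lo, hi = min(a, b), max(a, b)
        for i, x in enumerate(grid_mm):
            if lo <= x <= hi:
                fluence[i] += delta_mu
    return fluence
```

The resulting actual fluence can then be re-imported into the treatment planning system to recalculate the delivered dose, as the study describes.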
ERIC Educational Resources Information Center
Cho, Moon-Heum; Yoo, Jin Soung
2017-01-01
Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…
75 FR 69644 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-15
..., organization, phone, fax, mobile, pager, Defense Switched Network (DSN) phone, other fax, other mobile, other.../Transport Layer Security (SSL/ TLS) connections, access control lists, file system permissions, intrusion detection and prevention systems and log monitoring. Complete access to all records is restricted to and...
Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.
Improved grading system for structural logs for log homes
D.W. Green; T.M. Gorman; J.W. Evans; J.F. Murphy
2004-01-01
Current grading standards for logs used in log home construction use visual criteria to sort logs into either "wall logs" or structural logs (round and sawn round timbers). The conservative nature of this grading system, and the grouping of stronger and weaker species for marketing purposes, probably results in the specification of logs with larger diameter than would...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orrell, S.; Ralstin, S.
1992-04-01
Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.
Grider, Gary A.; Poole, Stephen W.
2015-09-01
Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
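The pattern-extraction step can be illustrated with a minimal sketch that recognizes one common case, a regular strided write across collective processes, and returns a compact representation instead of the raw offsets. This is an assumption-laden toy, not the actual implementation.

```python
def extract_stride_pattern(offsets):
    """If the per-process write offsets form a regular strided pattern,
    return the compact (start, stride, count) description; otherwise None,
    meaning the raw offset list must be kept."""
    if len(offsets) < 2:
        return None
    stride = offsets[1] - offsets[0]
    if all(offsets[i + 1] - offsets[i] == stride
           for i in range(len(offsets) - 1)):
        return (offsets[0], stride, len(offsets))
    return None
```

Storing a three-value description instead of one offset per process is what makes the saved representation cheap at scale.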
Development of a Methodology for Customizing Insider Threat Auditing on a Linux Operating System
2010-03-01
...information: /etc/group, passwd, gshadow, shadow, /security/opasswd ... User A attempts to access User B directory ... User A attempts to access User B file w/o... configuration handled by audit rules for root actions. Audit user write attempts to system files: -w /etc/group -p wxa; -w /etc/passwd -p wxa; -w /etc/gshadow -p... information (/etc/group, /etc/passwd, /etc/gshadow, /etc/shadow, /etc/sudoers, /etc/security/opasswd). Procedure: 1. User2 logs into the system
The medium is NOT the message or Indefinitely long-term file storage at Leeds University
NASA Technical Reports Server (NTRS)
Holdsworth, David
1996-01-01
Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiinoki, T; Hanazawa, H; Park, S
2015-06-15
Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients with fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and the tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using the log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of the tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: Rm was 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. Am was 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and it decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
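The KL-divergence reproducibility measure used above can be sketched directly, assuming the motion PDFs are available as normalized histograms over a common set of bins; the eps guard for empty bins is an implementation choice of this example, not necessarily the authors' software.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between two normalized motion
    histograms over the same bins. Identical PDFs give 0; larger values
    mean the fraction's motion PDF reproduces the first fraction less well."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))
```

Computing this between PDF1 and each subsequent fraction's PDFn, then averaging, yields the per-patient Rm reported in the Results.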
The RIACS Intelligent Auditing and Categorizing System
NASA Technical Reports Server (NTRS)
Bishop, Matt
1988-01-01
The organization of the RIACS auditing package is described, along with installation instructions and guidance on interpreting the output. Instructions for setting up both local and remote file system auditing are given. Logging is done on a time-driven basis, and auditing runs in a passive mode.
The design and implementation of the HY-1B Product Archive System
NASA Astrophysics Data System (ADS)
Liu, Shibin; Liu, Wei; Peng, Hailong
2010-11-01
Product Archive System (PAS), as a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a number of current methods and technologies, such as a suitable data-transmittal mode, flexible configuration files, and log information, to give the system several desirable characteristics, such as ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (the Network Communicator, File Collector, File Copy, Task Collector, Metadata Extractor, Product Data Archive, and Metadata Catalogue Import modules) and some of the unique features of the system, as well as the technical problems encountered and resolved.
Recommendations for Benchmarking Web Site Usage among Academic Libraries.
ERIC Educational Resources Information Center
Hightower, Christy; Sih, Julie; Tilghman, Adam
1998-01-01
To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…
Elementary School Students' Strategic Learning: Does Task-Type Matter?
ERIC Educational Resources Information Center
Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.
2014-01-01
This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it was investigated which and when learning patterns actually emerge with respect to students' task solutions. The present study uses computer log file traces to investigate how…
TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, P; Patankar, A; Etmektzoglou, A
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages versus standalone software are: i) no software installation or maintenance needed; ii) easy accessibility across all devices; iii) seamless upgrades; and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.
46 CFR 97.35-3 - Logbooks and records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...
46 CFR 97.35-3 - Logbooks and records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...
Structure of the top of the Karnak Limestone Member (Ste. Genevieve) in Illinois
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bristol, H.M.; Howard, R.H.
1976-01-01
To facilitate petroleum exploration in Illinois, the Illinois State Geological Survey presents a structure map (for most of southern Illinois) of the Karnak Limestone Member--a relatively pure persistent limestone unit (generally 10 to 35 ft thick) in the Ste. Genevieve Limestone of Genevievian age. All available electric logs and selected studies of well cuttings were used in constructing the map. Oil and gas development maps containing Karnak-structure contours are on open file at the ISGS.
SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Able, CM; Baydush, AH; Nguyen, C
2014-06-15
Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy-specific and overall system parameters. A total of 36 system parameters were monitored, including RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems.
Research is supported by Varian Medical Systems, Inc.
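The Individual/Moving-Range charts used in this and the companion trajectory-log abstract follow a standard textbook form: I-chart limits at mean ± 2.66·MR-bar and an MR upper limit of 3.267·MR-bar. A minimal sketch with function names of our own choosing; the papers' hybrid specification-based limits are not reproduced here:

```python
def imr_limits(samples):
    """Individual & Moving-Range control limits in the standard 3-sigma
    form: sigma is estimated from the average moving range (MR-bar), the
    I-chart limits are mean +/- 2.66*MR-bar (2.66 = 3/d2, d2 = 1.128 for
    subgroups of two), and the MR-chart upper limit is 3.267*MR-bar."""
    mrs = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(mrs) / len(mrs)
    mean = sum(samples) / len(samples)
    return {"I": (mean - 2.66 * mr_bar, mean + 2.66 * mr_bar),
            "MR_UCL": 3.267 * mr_bar}
```

A new log-file reading that falls outside the `I` interval, or a jump whose moving range exceeds `MR_UCL`, would flag the parameter for inspection.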
Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.
Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen
2015-01-01
Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. However, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data from their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on an analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than that of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
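The access-correlation-matrix step can be illustrated with a small sketch: count how often pairs of files co-occur in the same access session, so that frequently co-accessed files can later be placed on different nodes for parallel fetching. The session format and function name are assumptions, not the paper's actual algorithm:

```python
from collections import defaultdict

def access_correlation(sessions):
    """Build a sparse access correlation matrix from an access log.
    sessions: iterable of lists of file IDs accessed together.
    Returns {(file_a, file_b): co-access count} with a < b, so a
    placement heuristic can spread highly correlated pairs across
    different storage nodes."""
    corr = defaultdict(int)
    for accessed in sessions:
        tiles = sorted(set(accessed))  # dedupe within a session
        for i in range(len(tiles)):
            for j in range(i + 1, len(tiles)):
                corr[(tiles[i], tiles[j])] += 1
    return dict(corr)
```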
Self-Regulation during E-Learning: Using Behavioural Evidence from Navigation Log Files
ERIC Educational Resources Information Center
Jeske, D.; Backhaus, J.; Stamov Roßnagel, C.
2014-01-01
The current paper examined the relationship between perceived characteristics of the learning environment in an e-module in relation to test performance among a group of e-learners. Using structural equation modelling, the relationship between these variables is further explored in terms of the proposed double mediation as outlined by Ning and…
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Perera, Nishan; Hsiao, Ying-Ting; Speer, Jennifer; Marbouti, Farshid
2012-01-01
This study presents three case studies of students' participation patterns in an online discussion to address the gap in our current understanding of how "individuals" experience asynchronous learning environments. Cases were constructed via microanalysis of log-file data, post contents, and the evolving discussion structure. The first student was…
Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.
2014-01-01
The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), to increase the efficiency of the system and improve long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, both the match between the system and its users and long-term adherence have the potential to increase. PMID:24876574
SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N
2016-06-15
Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct the relative fluence maps and perform gamma analysis to compare the planned and executed MLC movements. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated during sliding-window IMRT delivery. The program extracts the planned and executed (actual or delivered) MLC movements, then calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and the executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives similar results to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The author would like to thank King Fahd University of Petroleum and Minerals for the support.
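A one-dimensional version of the 3%/3 mm gamma comparison described above can be sketched as follows. This is a simplified global-normalization illustration, not the authors' program; real fluence maps are 2-D, and the function name and sampling details are assumptions:

```python
import math

def gamma_pass_rate(ref, meas, spacing=1.0, dd=0.03, dta=3.0):
    """1-D gamma analysis (global 3%/3 mm by default) between a
    reference and a measured fluence profile on the same grid.
    For each measured point, gamma is the minimum over reference
    points of sqrt((distance/dta)^2 + (dose_diff/(dd*max_ref))^2);
    returns the fraction of points with gamma <= 1."""
    norm = max(ref)  # global normalization
    passed = 0
    for i, m in enumerate(meas):
        best = float("inf")
        for j, r in enumerate(ref):
            dist = (i - j) * spacing          # mm between sample points
            dose = (m - r) / (dd * norm)      # dose diff in criterion units
            best = min(best, math.hypot(dist / dta, dose))
        passed += best <= 1.0
    return passed / len(meas)
```

Identical profiles pass everywhere; a gross local dose error fails only the affected points.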
20 CFR 658.410 - Establishment of State agency JS complaint system.
Code of Federal Regulations, 2011 CFR
2011-04-01
... system. At the local office level, the local office manager shall be responsible for the management of... related), the local office manager shall transmit a copy of that portion of the log containing the... established for the handling of complaints and files relating to the handling of complaints. The Manager or...
AliEn—ALICE environment on the GRID
NASA Astrophysics Data System (ADS)
Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration
2003-04-01
AliEn ( http://alien.cern.ch) (ALICE Environment) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets, and a number of collaborating Web services which implement authentication, job execution, file transport, performance monitoring and event logging. In the paper we present the architecture and components of the system.
NASA Technical Reports Server (NTRS)
Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina
2013-01-01
MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
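The data-notice handshake can be illustrated with a short sketch: a notice carrying data start/stop times plus a file checksum, which the receiving side can acknowledge with a receipt. The JSON layout, field names, and choice of MD5 are assumptions; the abstract does not specify the notice format or checksum algorithm:

```python
import hashlib
import json

def make_data_notice(path, start, stop):
    """Build a notice for one observatory data file: start/stop times
    and a whole-file checksum, serialized as JSON. The file is hashed
    in chunks so large data files are not read into memory at once."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return json.dumps({"file": path, "start": start, "stop": stop,
                       "checksum": h.hexdigest()})
```

On the receiving side, recomputing the checksum and comparing it against the notice decides whether to issue a success receipt or request retransmission.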
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tateoka, K; Graduate School of Medicine, Sapporo Medical University, Sapporo, JP; Fujimomo, K
2014-06-01
Purpose: The aim of the study is to evaluate the use of Varian DynaLog files to verify VMAT plan delivery and the modulation complexity score (MCS) of VMAT. Methods: Delivery accuracy of machine performance was quantified by multileaf collimator (MLC) position errors, gantry angle errors and fluence delivery accuracy for volumetric modulated arc therapy (VMAT). The relationship between machine performance and plan complexity was also investigated using the modulation complexity score (MCS). Planned and actual MLC positions, gantry angles and delivered fractions of monitor units were extracted from Varian DynaLog files. These factors were taken from the MLC control file of the record-and-verify system. Planned and delivered beam data were compared to determine leaf position errors and gantry angle errors. Analysis was also performed on planned and actual fluence maps reconstructed from the DynaLog files. This analysis was performed for all treatment fractions of 5 prostate VMAT plans. The analysis of the DynaLog files was carried out with in-house software written in Visual C++. Results: The root mean square of the leaf position and gantry angle errors was about 0.12 and 0.15, respectively. The gamma pass rate between planned and actual fluence maps at the 3%/3 mm criterion was about 99.21%. The leaf position errors were not directly related to plan complexity as determined by the MCS, whereas the gantry angle errors were. Conclusion: This study shows that Varian DynaLog files can be used to diagnose VMAT delivery errors that are not detectable with phantom-based quality assurance. Furthermore, the MCS of a VMAT plan can be used to evaluate delivery accuracy for patients receiving VMAT. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
VizieR Online Data Catalog: GOALS sample PACS and SPIRE fluxes (Chu+, 2017)
NASA Astrophysics Data System (ADS)
Chu, J. K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Diaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.
2017-06-01
The IRAS RBGS contains 179 LIRGs (11.0 <= log(LIR/L☉) < 12.0) and 22 ultra-luminous infrared galaxies (ULIRGs: log(LIR/L☉) >= 12.0); these 201 total objects comprise the GOALS sample (Armus et al. 2009), a statistically complete flux-limited sample of infrared-luminous galaxies in the local universe. This paper presents imaging and photometry for all 201 LIRGs and LIRG systems in the IRAS RBGS that were observed during our GOALS Herschel OT1 program. (4 data files).
Nagi, Sana Ehsen; Khan, Farhan Raza; Rahman, Munawar
2016-03-01
This experimental study was done on extracted human teeth to compare the fracture and deformation of two rotary endodontic file systems, namely K-3 and ProTaper. It was conducted at the dental clinics of the Aga Khan University Hospital, Karachi. A log of file deformation or fracture during root canal preparation was kept. The location of fracture was noted along with the identity of the canal in which the fracture took place. Fracture in the two rotary systems was compared; SPSS 20 was used for data analysis. Of the 172 (80.4%) teeth possessing more than 15 degrees of curvature, fracture occurred in 7 (4.1%) cases and deformation in 10 (5.8%). Of the 42 (19.6%) teeth possessing less than 15 degrees of curvature, fracture occurred in none, while deformation was seen in 1 (2.4%). There was no difference between K-3 and ProTaper files with respect to file deformation and fracture. Most of the fractures occurred in the mesiobuccal canals of maxillary molars, n=3 (21.4%). The likelihood of file fracture increased 5.65-fold when the same file was used more than 3 times. Irrespective of the rotary system, the apical third of the root canal space was the most common site for file fracture.
47 CFR 76.1706 - Signal leakage logs and repair records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2010-10-01 2010-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...
47 CFR 76.1706 - Signal leakage logs and repair records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2011-10-01 2011-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...
Code of Federal Regulations, 2013 CFR
2013-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Kim, Seok; Lee, Kee-Hyuck; Hwang, Hee; Yoo, Sooyoung
2016-01-30
Although the factors that affect an end-user's intention to use a new system and technology have been researched, previous studies have been theoretical and have not verified the factors that affected the adoption of a new system. Thus, this study aimed to confirm the factors that influence users' intentions to utilize a mobile electronic medical records (EMR) system using both a questionnaire survey and a log file analysis that represented the real use of the system. After observing the operation of a mobile EMR system in a tertiary university hospital for seven months, we performed an offline survey regarding the user acceptance of the system based on the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM). We surveyed 942 healthcare professionals over two weeks and performed a structural equation modeling (SEM) analysis to identify the intention to use the system among the participants. Next, we compared the results of the SEM analysis with the results of the analyses of the actual log files for two years to identify further insights into the factors that affected the intention of use. For these analyses, we used SAS 9.0 and AMOS 21. Of the 942 surveyed end-users, 48.3% (23.2% doctors and 68.3% nurses) responded. After eliminating six subjects who completed the survey insincerely, we conducted the SEM analyses on the data from 449 subjects (65 doctors and 385 nurses). The newly suggested model satisfied the standards of model fitness, and the intention to use the system was especially high due to the influences of Performance Expectancy and Attitude. Based on the actual usage log analyses, both the doctors and nurses used the menus to view the inpatient lists, alerts, and patients' clinical data with high frequency. Specifically, the doctors frequently retrieved laboratory results, and the nurses frequently retrieved nursing notes and used the menu to take on the responsibilities of nursing work.
In this study, the end-users' intentions to use the mobile EMR system were particularly influenced by Performance Expectancy and Attitude. In reality, the usage log revealed high-frequency use of the functions that improve the continuity of care and work efficiency. These results indicate the influence of performance expectancy on the intention to use the mobile EMR system. Consequently, we suggest that when planning the implementation of mobile EMR systems, the functions that are related to workflow and able to increase performance should be considered first.
Development of Cross-Platform Software for Well Logging Data Visualization
NASA Astrophysics Data System (ADS)
Akhmadulin, R. K.; Miraev, A. I.
2017-07-01
Well logging data processing is one of the main sources of information in oil and gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, well log curve display, etc.), but can also be run on different operating systems and devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work the resulting software product's interface is described.
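Loading curve data from a .las file, the input format mentioned above, can be sketched with a minimal LAS 2.0 reader. This is an illustration under simplifying assumptions (well-formed `~Curve` and `~ASCII` sections, space-delimited numeric rows, no wrapped lines), not the described software:

```python
def read_las_curves(text):
    """Minimal parse of a LAS 2.0 file: curve mnemonics from the ~C
    (Curve) section and numeric data rows from the ~A (ASCII) section.
    Comment lines start with '#'; a section header starts with '~' and
    its meaning is given by the first letter after it."""
    curves, rows, section = [], [], None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.startswith("~"):
            section = line[1].upper()
            continue
        if section == "C":
            # "DEPT.M : depth" -> mnemonic is everything before the dot
            curves.append(line.split(".")[0].strip())
        elif section == "A":
            rows.append([float(v) for v in line.split()])
    return curves, rows
```

Each row then pairs with the curve mnemonics to drive the on-screen well log display.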
Extracting the Textual and Temporal Structure of Supercomputing Logs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, S; Singh, I; Chandra, A
2009-05-26
Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format make it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an online clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.
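The textual-clustering idea, grouping messages whose syntactic structure matches once variable fields are masked, can be sketched in a few lines. The masking rules here (decimal and hex tokens only) are an assumption and far simpler than the paper's method:

```python
import re
from collections import defaultdict

def cluster_by_template(messages):
    """Group log messages by syntactic template: numeric and hex tokens
    are replaced with a wildcard, so messages that differ only in their
    parameters (node IDs, addresses, counters) fall into one cluster."""
    clusters = defaultdict(list)
    for msg in messages:
        template = re.sub(r"\b(0x[0-9a-fA-F]+|\d+)\b", "<*>", msg)
        clusters[template].append(msg)
    return dict(clusters)
```

Each resulting template is a candidate message type; temporal correlation between clusters can then be computed over the message timestamps.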
Geonucleus, the freeware application for managing geological mapping data in GIS
NASA Astrophysics Data System (ADS)
Albert, Gáspár
2016-04-01
Geological mapping is the most traditional way of collecting information from deposits and rocks. The technique of documentation was refined by generations of geologists, and these traditions were implemented in Geonucleus to create a tool for precise data recording after fieldwork, while still giving the freedom to ponder the details of the observation. In 2012 a general XML-based data structure was worked out for storing field observations for the Geological Institute of Hungary (Albert et al. 2012). This structure was implemented in the desktop version of Geonucleus, which creates a database of the recorded data on the client computer. The application saves the complete database in one file, which can be loaded into a GIS. The observations can also be saved in simple text format, but primarily the kml (Keyhole Markup Language) format is supported. This way, the observations are visualized in comprehensible forms (e.g. on a 3D surface model with satellite photos in Google Earth). If the kml is visualized directly in Google Earth, an info-bubble appears when clicking on a pinpoint. It displays all the metadata (e.g. index, coordinates, date, logger name, etc.), the descriptions and the photos of the observed site. If a more general GIS application is the aim (e.g. Global Mapper or QGIS), the file can be saved in a different format, but still in a kml-structure. The simple text format is recommended if the observations are to be imported into a user-defined relational database system (RDB). A report text-type is also available if a detailed description of one or more observed sites is needed. Importing waypoint gpx-files can quicken the logging. The code was written in VisualBasic.Net. The app is freely accessible from the geonucleus.elte.hu site and can be installed on any system which has the .Net framework 4.0 or higher. The software is bilingual (English and Hungarian), and the app is designed for general geological mapping purposes (e.g.
quick logging of field trips). The layout of the GUI has three components: 1) a metadata area, 2) a general description area with unlimited storing capacity, and 3) switchable panels for observations, measurements, photos and notes. The latter includes panels for stratigraphy, structures, fossils, samples, photo uploads and general notes. Details such as the sequence and contact type of layers, the parameters of structures and slickensides, the name and condition of fossils, and the purpose of sampling can also be logged (though not compulsorily). It is also a tool for teaching geological mapping, since the parameters listed in the app draw attention to the details to be observed in the field. Reference: Albert G, Csillag G, Fodor L, Zentai L. 2012: Visualisation of Geological Observations on Web 2.0 Based Maps, in: Zentai, L. and Reyes-Nunez, J (eds.): Maps for the Future - Children, Education and Internet, Series: Lecture Notes in Geoinformation and Cartography, Tentative volume 5 - Springer, pp. 165-178.
ERIC Educational Resources Information Center
Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars
2016-01-01
Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Able, CM; Baydush, AH; Nguyen, C
Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axis positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed, and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on TG-142 and published analysis of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2 mm for the collimator jaw and MLC carriage exceeded the chart limits. Gantry speed and each MLC speed are analyzed at two different points in the delivery. A simulated gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. A gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.
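The cross-correlation baseline check can be sketched as follows: slide the delivered axis trace against its reference and report the lag with the highest normalized correlation, so a delayed axis shows up as a nonzero best lag and a distorted one as a reduced peak. Function names and the lag search window are assumptions:

```python
def best_lag(reference, actual, max_lag=10):
    """Normalized cross-correlation of a delivered axis trace against
    its reference motion baseline. Returns (lag, peak): the sample lag
    at which the traces align best and the correlation value there."""
    def corr(lag):
        pairs = [(reference[i], actual[i + lag])
                 for i in range(len(reference))
                 if 0 <= i + lag < len(actual)]
        n = len(pairs)
        mr = sum(r for r, _ in pairs) / n
        ma = sum(a for _, a in pairs) / n
        num = sum((r - mr) * (a - ma) for r, a in pairs)
        den = (sum((r - mr) ** 2 for r, _ in pairs)
               * sum((a - ma) ** 2 for _, a in pairs)) ** 0.5
        return num / den if den else 0.0  # flat overlap: no correlation
    return max(((lag, corr(lag)) for lag in range(-max_lag, max_lag + 1)),
               key=lambda t: t[1])
```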
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system with a simple, easily implemented algorithm for effective runtime verification, into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can make specifications more concise while still permitting an easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivative of RuleR that adds a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission, MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
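As a toy illustration of rule-based trace monitoring in this spirit (far simpler than RuleR or LogScope, with an invented rule format):

```python
# Toy trace monitor: each rule maps an event to an obligation that some
# later event must occur. The rule format is invented for illustration and
# is much simpler than RuleR's conditional rules or LogScope's temporal logic.

def monitor(trace, rules):
    """Return the sorted list of obligations never discharged by the trace."""
    pending = set()
    for ev in trace:
        pending.discard(ev)        # an occurring event discharges its obligation
        if ev in rules:
            pending.add(rules[ev])  # this event raises a new obligation
    return sorted(pending)

# "open" obliges a later "close"; the second trace violates the rule.
ok = monitor(["open", "read", "close"], {"open": "close"})
bad = monitor(["open", "read"], {"open": "close"})
```

A real system would additionally parameterize rules with data (e.g. matching the file handle opened to the one closed), which this sketch omits.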
DOT National Transportation Integrated Search
2001-02-01
The Minnesota data system includes the following basic files: Accident data (Accident File, Vehicle File, Occupant File); Roadlog File; Reference Post File; Traffic File; Intersection File; Bridge (Structures) File; and RR Grade Crossing File. For ea...
SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stathakis, S; Defoor, D; Linden, P
2015-06-15
Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a Novalis TX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as the reference, and all records for subsequent days were compared against it. In-house MATLAB software was used for the comparisons. Each MLC log file was converted to a fluence map (FM), and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with a signal of 10% or less of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (range 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of resolution on the γ and, at the same time, reduce the image processing time. We found that comparison of the highest-resolution images (768×1024) yielded a lower average γ (99.1%) than the low-resolution ones (192×256; γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as the completion of each daily treatment. Such a tool can be valuable for assessing the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
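A brute-force sketch of a global gamma-index comparison of this kind, assuming uniform pixel spacing; this is illustrative only, not the authors' MATLAB implementation:

```python
import numpy as np

# Illustrative global gamma-index pass rate between a reference fluence map
# and a daily fluence map (2%/2 mm criteria, 10% low-signal cutoff as above).
# Brute-force search; parameter names and spacing are assumptions.

def gamma_pass_rate(ref, test, spacing_mm=1.0, dd=0.02, dta_mm=2.0, cutoff=0.10):
    """Fraction of evaluated points with gamma <= 1."""
    ref = np.asarray(ref, float)
    test = np.asarray(test, float)
    norm = ref.max()
    ys, xs = np.indices(ref.shape)
    # Search window: only points within the DTA radius can bring gamma below 1.
    win = int(np.ceil(dta_mm / spacing_mm)) + 1
    gammas = []
    for y in range(ref.shape[0]):
        for x in range(ref.shape[1]):
            if ref[y, x] < cutoff * norm:
                continue  # exclude low-signal points, as in the abstract
            y0, y1 = max(0, y - win), min(ref.shape[0], y + win + 1)
            x0, x1 = max(0, x - win), min(ref.shape[1], x + win + 1)
            dy = (ys[y0:y1, x0:x1] - y) * spacing_mm
            dx = (xs[y0:y1, x0:x1] - x) * spacing_mm
            dist2 = (dx ** 2 + dy ** 2) / dta_mm ** 2
            dose2 = ((test[y0:y1, x0:x1] - ref[y, x]) / (dd * norm)) ** 2
            gammas.append(np.sqrt(dist2 + dose2).min())
    return float((np.array(gammas) <= 1.0).mean())
```

Identical maps score 1.0 by construction; production tools interpolate the test map rather than searching discrete pixels, which this sketch does not attempt.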
Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F
2006-01-01
We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.
Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.
Wave-Ice Interaction and the Marginal Ice Zone
2013-09-30
concept, using a high-quality attitude and heading reference system (AHRS) together with an accurate twin-antenna GPS compass. The instruments logged... the AHRS parameters at 50 Hz, together with GPS-derived fixes, heading (accurate to better than 1°) and velocities at 10 Hz. The 30 MB hourly files
INSPIRE and SPIRES Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Cole (Wheaton College; SLAC)
2012-08-31
SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
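Log-file scripting of this kind might look as follows, assuming Apache Common Log Format and a hypothetical `/search` endpoint with a `q` query parameter (the actual SPIRES/INSPIRE log formats are not reproduced here):

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Illustrative web-server log analysis: count search terms from successful
# GET requests. Assumes Apache Common Log Format and an invented /search
# endpoint; the real sites' formats and URLs may differ.

LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) \S+')

def search_terms(lines):
    """Count query terms from GET /search?q=... requests that returned 200."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        host, ts, method, path, status = m.groups()
        url = urlparse(path)
        if method == "GET" and url.path == "/search" and status == "200":
            for term in parse_qs(url.query).get("q", []):
                counts[term] += 1
    return counts

log = ['1.2.3.4 - - [31/Aug/2012:10:00:00 +0000] "GET /search?q=higgs HTTP/1.1" 200 512']
```

Frustration detection could then be layered on top, e.g. flagging rapid repeated reformulations of the same query from one host.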
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-05
.... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-07
.... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-11
...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin Yu; Wang Fuji; Tao Yan
2000-07-01
This paper introduces a new idea of transporting mine tailings logs in a mine tailings-slurry pipeline and a new technology of cemented mine filling using tailings logs with tailings slurry. The hydraulic principles, the compaction of the tailings logs, and the mechanical function of the fill body of tailings logs cemented by tailings slurry are discussed.
Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.
Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene
2012-04-21
Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion could result in unintuitive dose changes. We have developed a treatment reconstruction simulation computer code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans, a concave target and integrated boost were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%, 2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.
Automated clustering-based workload characterization
NASA Technical Reports Server (NTRS)
Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena
1996-01-01
The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization, which can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination: histograms of system activity are generated and presented to the user for peak-period determination. (2) Automatic clustering analysis: the data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters. (3) Reporting of varied file statistics: the tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time; these statistics are given on a per-cluster basis. (4) Portability: the tool can easily be used to characterize the workload in mass storage systems of different vendors; the user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
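The clustering step can be sketched with a minimal 1-D k-means over file sizes; the data and cluster count below are invented for illustration, and the real tool also applies tightness measures to choose the number of clusters:

```python
import statistics

# Minimal 1-D k-means over file sizes (MB). Seeding, data, and k are
# illustrative; the actual tool's algorithms and measures are not shown here.

def kmeans_1d(data, k, iters=50):
    """Partition sorted 1-D data into k clusters by nearest centroid."""
    data = sorted(data)
    # Seed centroids evenly across the sorted data.
    centroids = [data[i * (len(data) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda j: abs(x - centroids[j]))
            clusters[nearest].append(x)
        centroids = [statistics.mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

# Hypothetical file sizes from a mass-storage log: small, medium, huge files.
sizes = [1, 2, 2, 3, 100, 110, 120, 5000, 5100]
for c in kmeans_1d(sizes, k=3):
    print(f"n={len(c)} min={min(c)} max={max(c)} mean={statistics.mean(c):.1f}")
```

The per-cluster statistics printed at the end correspond to feature (3) above.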
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-06
... the eFiling link to log on and submit the intervention or protests. Persons unable to file... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
.... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...
2013-09-01
to an XML file, code that Bonine [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a... C. Bonine, M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05–075, 2013. [21] C. Bonine, “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS
77 FR 55817 - Delek Crude Logistics, LLC; Notice of Petition for Waiver
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-11
... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests... number. eFiling is encouraged. More detailed information relating to filing requirements, interventions...'') grant a temporary waiver of the filing and reporting requirements of sections 6 and 201 of the...
Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.
15 CFR 762.3 - Records exempt from recordkeeping requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...
15 CFR 762.3 - Records exempt from recordkeeping requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...
Flexibility and Performance of Parallel File Systems
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1996-01-01
As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (APIs). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...
SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yousuf, A; Hussain, A
2014-06-01
Purpose: A robust quality assurance (QA) program for computer-controlled enhanced dynamic wedges (EDW) has been designed and tested. The calculations for such a QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record-and-verify system generates dynamic log (dynalog) files during dynamic dose delivery. These system-generated dynalog files contain information such as the date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected (calculated) segmented treatment tables (STT) and the actual delivered STT as a verification record of the treatment delivery. These files can be used to assess the integrity and precision of treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment; they can also be used to calculate the EDW factors. Independent verification of fractional MUs per segment was performed against those generated from the dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against measurements; the measurements agreed with the calculated EDW data to within about 1%. The variation between the MUs per segment obtained from the dynalog files and those calculated manually was less than 2%. Conclusion: An efficient and easy tool for performing a routine EDW QA procedure is suggested. The method can be easily implemented in any institution without the need for expensive QA equipment. Errors of the order of 2% or greater can be easily detected.
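Deriving per-segment fractional MUs from a cumulative STT can be sketched as below; the cumulative weights and the 200 MU prescription are invented for illustration, not Varian's published values:

```python
# Hypothetical sketch: split a prescribed MU total into per-segment MUs
# from normalized cumulative STT weights. The weights below are invented,
# not Varian's published STT values.

def segment_mus(total_mu, cumulative_weights):
    """Per-segment MUs from a monotonically increasing cumulative STT."""
    end = cumulative_weights[-1]
    mus = []
    prev = 0.0
    for w in cumulative_weights:
        mus.append(total_mu * (w - prev) / end)  # this segment's share
        prev = w
    return mus

stt = [0.10, 0.25, 0.45, 0.70, 1.00]   # illustrative cumulative weights
mus = segment_mus(200.0, stt)
# The per-segment MUs sum back to the prescribed total (within floating point).
```

The independent check described in the abstract would then compare these manually derived values against the per-segment MUs recorded in the dynalog files.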
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-25
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-20
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-15
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-24
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-13
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE...://www.ferc.gov . To facilitate electronic service, persons with Internet access who will eFile a... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-15
... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao
NASA Astrophysics Data System (ADS)
Arredondo, I.; del Campo, M.; Echevarria, P.; Jugo, J.; Etxebarria, V.
2013-10-01
This work presents a multipurpose configurable control system that can be integrated into an EPICS control network, with this functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of control hardware management, set-up and communication with the EPICS network, and data storage. The reconfigurable nature of the controller is based on a single XML file, allowing any end user to easily modify and adjust the control system to any specific requirement. The selected Java development environment ensures multiplatform operation and large versatility, even with regard to the hardware to be controlled. Specifically, this paper, focused on fast control based on a high-performance FPGA, also describes an application approach for ESS Bilbao's Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, as well as a general description of the Multipurpose Controller itself.
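A single-XML-file configuration of this kind might be parsed as follows; the tag and attribute names are invented, not the actual ESS Bilbao schema, and Python stands in here for the system's Java environment:

```python
import xml.etree.ElementTree as ET

# Invented configuration schema for illustration only: a controller with an
# EPICS prefix, some process variables (PVs), and a logging destination.

CONFIG = """
<controller name="bpm1">
  <epics prefix="ESSB:BPM1:"/>
  <pv name="PosX" record="ai" scan="0.1"/>
  <pv name="PosY" record="ai" scan="0.1"/>
  <logging path="/data/bpm1" format="csv"/>
</controller>
"""

def load_config(text):
    """Parse the hypothetical controller config into a plain dict."""
    root = ET.fromstring(text)
    return {
        "name": root.get("name"),
        "prefix": root.find("epics").get("prefix"),
        "pvs": [pv.get("name") for pv in root.findall("pv")],
        "log_path": root.find("logging").get("path"),
    }

cfg = load_config(CONFIG)
```

The appeal of the single-file approach is visible even in this toy: repointing the controller at different hardware or PVs means editing one XML file, not code.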
ATLAS, an integrated structural analysis and design system. Volume 4: Random access file catalog
NASA Technical Reports Server (NTRS)
Gray, F. P., Jr. (Editor)
1979-01-01
A complete catalog is presented for the random access files used by the ATLAS integrated structural analysis and design system. ATLAS consists of several technical computation modules which output data matrices to corresponding random access files. A description of the matrices written on these files is contained herein.
SedMob: A mobile application for creating sedimentary logs in the field
NASA Astrophysics Data System (ADS)
Wolniewicz, Pawel
2014-05-01
SedMob is an open-source mobile software package for creating sedimentary logs, targeted for use on tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in a log, and export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog, a free multiplatform package for drawing graphic logs that runs on PCs. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.
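Exporting per-bed data as CSV might look like this; the column layout is a guess at a SedLog-style format, not SedMob's actual schema:

```python
import csv
import io

# Hypothetical per-bed export: column names and bed fields are invented
# for illustration, not SedMob's or SedLog's actual CSV schema.

beds = [
    {"thickness_m": 0.4, "lithology": "sandstone", "grain_size": "medium"},
    {"thickness_m": 1.2, "lithology": "mudstone",  "grain_size": "clay"},
]

def export_beds(beds):
    """Serialize a list of bed records to CSV text, header row first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["thickness_m", "lithology", "grain_size"])
    writer.writeheader()
    writer.writerows(beds)
    return buf.getvalue()

print(export_beds(beds).splitlines()[0])   # header row
```

A plain-text, columnar format like this is what makes round-tripping between a mobile app and a desktop drawing package straightforward.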
COMBATXXI, JDAFS, and LBC Integration Requirements for EASE
2015-10-06
process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts
Users' information-seeking behavior on a medical library Website
Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco
2002-01-01
The Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources in 1997. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in the visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contained information on all user accesses to the Website and provided a great opportunity to learn more about the behavior of visitors to the Website. The majority of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to the reports mentioned above, the authors wanted to be able to perform interactive exploration and ad hoc analysis and discover trends in a user-friendly way. Because of that, we developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179
18 CFR 270.304 - Tight formation gas.
Code of Federal Regulations, 2011 CFR
2011-04-01
... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...
Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, B.G.; Richards, R.E.; Reece, W.J.
1992-10-01
This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.
SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Y; Clasie, B; Lu, H
Purpose: In pencil beam scanning (PBS), the delivered spot MU, position, and size are slightly different at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree gantry angle range were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and at the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MU in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to excellent agreement in dose calculations, with an average γ pass rate of approximately 99.7% across all fields. When each calculation is compared to the measurement, a high correlation in γ was also found. Conclusion: Using machine log files, we verified that deviations in PBS beam delivery across gantry angles are sufficiently small relative to the planned spot positions and MU. This study brings us one step closer to simplifying our patient-specific QA.
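The spot-by-spot statistics reported above (per-axis mean and standard deviation of position differences between two deliveries of the same field) are simple to compute once the log files are parsed. A hypothetical sketch, assuming spot positions have already been extracted as (x, y) pairs in mm:

```python
import statistics

def spot_deviation_stats(spots_ref, spots_test):
    """Per-axis (mean, stdev) of spot-position differences between two
    deliveries of the same field; positions are (x, y) tuples in mm."""
    dx = [t[0] - r[0] for r, t in zip(spots_ref, spots_test)]
    dy = [t[1] - r[1] for r, t in zip(spots_ref, spots_test)]
    return ((statistics.mean(dx), statistics.stdev(dx)),
            (statistics.mean(dy), statistics.stdev(dy)))

# Illustrative spot lists, not real log-file data.
(mx, sx), (my, sy) = spot_deviation_stats([(0, 0), (1, 1)], [(1, 2), (3, 4)])
print(round(mx, 3), round(sx, 3))  # 1.5 0.707
```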
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-21
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-31
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-30
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-07
... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
Capabilities Report 2012, West Desert Test Center
2012-03-12
132 FT- IR Spectrometer...electronic system files, paper logs, production batch records, QA/QC data, and PCR data generated during a test. Data analysts also track and QC raw data...Advantage +SL bench-top freeze dryers achieve shelf temperatures as low as -57°C and condenser temperatures to -67°C. The bulk milling facility produces
Web-Based Learning Programs: Use by Learners with Various Cognitive Styles
ERIC Educational Resources Information Center
Chen, Ling-Hsiu
2010-01-01
To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…
NASA Technical Reports Server (NTRS)
Katz, Randy H.; Anderson, Thomas E.; Ousterhout, John K.; Patterson, David A.
1991-01-01
Rapid advances in high performance computing are making possible more complete and accurate computer-based modeling of complex physical phenomena, such as weather front interactions, dynamics of chemical reactions, numerical aerodynamic analysis of airframes, and ocean-land-atmosphere interactions. Many of these 'grand challenge' applications are as demanding of the underlying storage system, in terms of their capacity and bandwidth requirements, as they are on the computational power of the processor. A global view of the Earth's ocean chlorophyll and land vegetation requires over 2 terabytes of raw satellite image data. In this paper, we describe our planned research program in high capacity, high bandwidth storage systems. The project has four overall goals. First, we will examine new methods for high capacity storage systems, made possible by low cost, small form factor magnetic and optical tape systems. Second, access to the storage system will be low latency and high bandwidth. To achieve this, we must interleave data transfer at all levels of the storage system, including devices, controllers, servers, and communications links. Latency will be reduced by extensive caching throughout the storage hierarchy. Third, we will provide effective management of a storage hierarchy, extending the techniques already developed for the Log Structured File System. Finally, we will construct a prototype high capacity file server, suitable for use on the National Research and Education Network (NREN). Such research must be a cornerstone of any coherent program in high performance computing and communications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Z; Vijayan, S; Oines, A
Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS), which we previously developed, to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure; the data in the log file are input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS.
Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
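The exposure-grouping idea, binning x-ray pulses whose geometry and technique parameters agree within tolerance so that each bin needs only one Monte Carlo run weighted by its total mAs, can be sketched as follows. The field names and tolerances are illustrative, not those of the actual DTS log format:

```python
from collections import defaultdict

def group_exposures(events, angle_tol=5.0, kvp_tol=2.0):
    """Group per-pulse exposure events whose gantry angle and kVp agree
    within tolerance, summing their mAs per group."""
    groups = defaultdict(float)
    for ev in events:
        # Quantize each parameter to its tolerance grid to form the key.
        key = (round(ev["gantry_deg"] / angle_tol) * angle_tol,
               round(ev["kvp"] / kvp_tol) * kvp_tol)
        groups[key] += ev["mas"]
    return dict(groups)

# Illustrative pulse records (not real log data).
pulses = [
    {"gantry_deg": 0.4, "kvp": 80.0, "mas": 1.0},
    {"gantry_deg": 1.1, "kvp": 80.5, "mas": 1.0},
    {"gantry_deg": 90.2, "kvp": 80.0, "mas": 2.0},
]
print(group_exposures(pulses))  # two groups instead of three runs
```

The first two pulses collapse into one group, so the Monte Carlo code is invoked once per group rather than once per pulse, which is what makes the PCXMC run time acceptable.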
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-29
... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... securities and assumptions of liability. Any person desiring to intervene or to protest should file with the... with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-24
... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-05
... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...
The Cheetah Data Management System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kunz, P.F.; Word, G.B.
1991-03-01
Cheetah is a data management system based on the C programming language. The premise of Cheetah is that the "banks" of FORTRAN-based systems should be "structures" as defined by the C language. Cheetah is a system to manage these structures, while preserving the use of the C language in its native form. For C structures managed by Cheetah, the user can use Cheetah utilities for reading and writing, in a machine-independent form, both binary and text files to disk or over a network. Files written by Cheetah also contain a dictionary describing in detail the data contained in the file. Such information is intended to be used by interactive programs for presenting the contents of the file. Cheetah has been ported to many different operating systems with no operating-system-dependent switches.
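Cheetah's approach of writing a self-describing dictionary ahead of the data, in a fixed byte order so any machine can decode the result, can be illustrated in miniature. This is a Python sketch of the general idea, not Cheetah's actual file layout:

```python
import json
import struct

def pack_record(fields, values):
    """Serialize values preceded by a 'dictionary' describing them.
    fields is a list of (name, struct-format) pairs; everything is
    written big-endian for machine independence."""
    meta = json.dumps(fields).encode()
    body = struct.pack(">" + "".join(t for _, t in fields), *values)
    return struct.pack(">I", len(meta)) + meta + body

def unpack_record(data):
    """Read the dictionary first, then use it to decode the values."""
    n, = struct.unpack(">I", data[:4])
    fields = json.loads(data[4:4 + n])
    fmt = ">" + "".join(t for _, t in fields)
    values = struct.unpack(fmt, data[4 + n:4 + n + struct.calcsize(fmt)])
    return dict(zip((name for name, _ in fields), values))

blob = pack_record([["energy", "d"], ["ntracks", "i"]], (12.5, 3))
print(unpack_record(blob))  # {'energy': 12.5, 'ntracks': 3}
```

Because the dictionary travels with the data, an interactive browser can present the file's contents without compiled-in knowledge of the structure, which is the property the abstract describes.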
NASA Technical Reports Server (NTRS)
Easley, Wesley C.
1991-01-01
Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Display occurs in a separate window for each port's input, with binary display being selectable. A number of other features, including binary log files, screen capture to files, and a full range of communication parameters, are provided.
DOT National Transportation Integrated Search
2001-02-01
The Minnesota data system includes the following basic files: Accident data (Accident File, Vehicle File, Occupant File); Roadlog File; Reference Post File; Traffic File; Intersection File; Bridge (Structures) File; and RR Grade Crossing File. For ea...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meylan, W.M.; Howard, P.H.; Aronson, D.
1999-04-01
A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow 1.0 to 7.0 and >7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method is a significantly better fit to existing data than other methods.
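The piecewise structure of the estimator (ionic vs. nonionic classification, separate regimes by log Kow, additive correction factors) can be sketched as below. The slope, intercept, and ionic bin values here are placeholders for illustration, not the fitted coefficients from the paper:

```python
def estimate_log_bcf(log_kow, ionic=False, correction=0.0,
                     slope=0.77, intercept=-0.70):
    """Piecewise log BCF estimate from log Kow, mirroring the structure
    described in the abstract.  slope/intercept and the ionic bin
    values are placeholders, not the paper's fitted values."""
    if ionic:
        # Ionic compounds get a fixed log BCF assigned by log Kow bin.
        if log_kow < 5.0:
            return 0.5
        return 1.75
    if log_kow <= 7.0:
        # Linear regression plus structural correction factors.
        return slope * log_kow + intercept + correction
    # A separate equation applies above log Kow 7.0 (placeholder: cap
    # at the log Kow = 7.0 value).
    return slope * 7.0 + intercept + correction
```

The correction factors in the real method are looked up from structural fragments (e.g., long alkyl chains, aromatic azo groups), which is why highly deviating chemical classes get special treatment.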
Sediment data collected in 2013 from the northern Chandeleur Islands, Louisiana
Buster, Noreen A.; Kelso, Kyle W.; Bernier, Julie C.; Flocks, James G.; Miselis, Jennifer L.; DeWitt, Nancy T.
2014-01-01
This data series serves as an archive of sediment data collected in July 2013 from the Chandeleur Islands sand berm and adjacent barrier-island environments. Data products include descriptive core logs, core photographs and x-radiographs, results of sediment grain-size analyses, sample location maps, and Geographic Information System data files with accompanying formal Federal Geographic Data Committee metadata.
VizieR Online Data Catalog: Reference Catalogue of Bright Galaxies (RC1; de Vaucouleurs+ 1964)
NASA Astrophysics Data System (ADS)
de Vaucouleurs, G.; de Vaucouleurs, A.
1995-11-01
The Reference Catalogue of Bright Galaxies lists for each entry the following information: NGC number, IC number, or A number; A, B, or C designation; B1950.0 positions, position at 100 year precession; galactic and supergalactic positions; revised morphological type and source; type and color class in Yerkes list 1 and 2; Hubble-Sandage type; revised Hubble type according to Holmberg; logarithm of mean major diameter (log D) and ratio of major to minor diameter (log R) and their weights; logarithm of major diameter; sources of the diameters; David Dunlap Observatory type and luminosity class; Harvard photographic apparent magnitude; weight of V, B-V(0), U-B(0); integrated magnitude B(0) and its weight in the B system; mean surface brightness in magnitude per square minute of arc and sources for the B magnitude; mean B surface brightness derived from corrected Harvard magnitude; the integrated color index in the standard B-V system; "intrinsic" color index; sources of B-V and/or U-B; integrated color in the standard U-B system; observed radial velocity in km/sec; radial velocity corrected for solar motion in km/sec; sources of radial velocities; solar motion correction; and direct photographic source. The catalog was created by concatenating four files side by side. (1 data file).
Transaction aware tape-infrastructure monitoring
NASA Astrophysics Data System (ADS)
Nikolaidis, Fotios; Kruse, Daniele Francesco
2014-06-01
Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging, and on-demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements, and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions, and finally the Web UI layer for accessing the information. Having flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACLs). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation, and malfunction detection, and by managers for report generation.
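The Data Mining layer's core task, combining log lines scattered across files into per-transaction context, reduces to grouping by a request identifier and ordering by time. A minimal sketch, assuming an illustrative "<timestamp> <reqid> <message>" line shape rather than CASTOR's actual log formats:

```python
from collections import defaultdict

def build_transactions(log_lines):
    """Combine individual log lines, each tagged with a request id,
    into per-transaction event lists ordered by timestamp."""
    txns = defaultdict(list)
    for line in log_lines:
        # Assumed line shape: "<timestamp> <reqid> <message>"
        ts, reqid, msg = line.split(" ", 2)
        txns[reqid].append((ts, msg))
    return {rid: sorted(evts) for rid, evts in txns.items()}

# Illustrative lines from two interleaved transactions.
lines = [
    "t3 req1 migration complete",
    "t1 req1 mount requested",
    "t2 req2 recall started",
    "t2 req1 tape mounted",
]
txn = build_transactions(lines)
print([msg for _, msg in txn["req1"]])
```

In the deployed system each layer hands its output to the next as messages, so a grouping step like this runs continuously on the aggregated stream rather than over a static list.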
Expansion of the roadway reference log : KYSPR-99-201.
DOT National Transportation Integrated Search
2000-05-01
The objectives of this study were to: 1) expand the current route log to include milepoints for all intersections on state maintained roads and 2) recommend a procedure for establishing milepoints and maintaining the file with up-to-date information....
78 FR 52524 - Sunoco Pipeline LP; Notice of Petition for Declaratory Order
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-23
... link to log on and submit the intervention or protests. Persons unable to file electronically should... described in their petition. Any person desiring to intervene or to protest in this proceedings must file in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...
78 FR 62349 - Sunoco Pipeline L.P.; Notice of Petition for Declaratory Order
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-18
... to log on and submit the intervention or protests. Persons unable to file electronically should... petition. Any person desiring to intervene or to protest in this proceeding must file in accordance with..., persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...
Fort Bliss Geothermal Area Data: Temperature profile, logs, schematic model and cross section
Adam Brandt
2015-11-15
This dataset contains a variety of data about the Fort Bliss geothermal area, part of the southern portion of the Tularosa Basin, New Mexico. The dataset contains schematic models for the McGregor Geothermal System and a shallow temperature survey of the Fort Bliss geothermal area. The dataset also contains Century OH logs, a full temperature profile, and complete logs from well RMI 56-5, including resistivity and porosity data, drill logs with drill rate, depth, lithology, mineralogy, fractures, temperature, pit total, gases, and descriptions among other measurements, as well as CDL, CNL, DIL, GR Caliper and Temperature files. A shallow (2-meter depth) temperature survey of the Fort Bliss geothermal area with 63 data points is also included. Two cross sections through the Fort Bliss area, also included, show well position and depth. The surface map included shows faults and well spatial distribution. Inferred and observed fault distributions from gravity surveys around the Fort Bliss geothermal area are also shown.
20 CFR 658.414 - Referral of non-JS-related complaints.
Code of Federal Regulations, 2011 CFR
2011-04-01
... applicable, were referred on the complaint log specified in § 658.410(c)(1). The JS official shall also prepare and keep the file specified in § 658.410(c)(3) for the complaints filed pursuant to paragraph (a...
Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H
2011-02-01
Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source for verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the American College of Graduate Medical Education (ACGME). We conducted a systematic review of ACGME requirements and our AIMS record, and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories and generated AIMS-based case logs and compared these to assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. 
The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
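The free-text parsing method the authors evaluated can be approximated by keyword matching on procedure descriptions. The category table below is illustrative; the real ACGME category rules and the authors' parser are considerably more involved:

```python
# Illustrative keyword table; real ACGME category rules are more involved.
CATEGORY_KEYWORDS = {
    "cardiac": ("cabg", "valve", "bypass"),
    "obstetric": ("cesarean", "c-section"),
    "intrathoracic": ("lobectomy", "thoracotomy"),
}

def categorize(description):
    """Assign a case category by keyword match on the free-text
    procedure description; returns None when no rule fires."""
    text = description.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return category
    return None

print(categorize("Aortic VALVE replacement"))  # cardiac
```

The paper's alternative, mapping CPT4 procedure codes to categories, avoids the brittleness of keyword rules and achieved the slightly higher 97% accuracy.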
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...
78 FR 63977 - Enable Bakken Crude Services, LLC; Notice of Request For Waiver
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-25
... person desiring to intervene or to protest in this proceedings must file in accordance with Rules 211 and... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...
MAIL LOG, program theory, volume 2
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2004-01-01
In August and September of 1993 and January of 1994, the U.S. Geological Survey, under a cooperative agreement with the St. Johns River Water Management District (SJRWMD), conducted geophysical surveys of Kingsley Lake, Orange Lake, and Lowry Lake in northeast Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, observer's logbook, Field Activity Collection System (FACS) logs, and formal FGDC metadata. A filtered and gained GIF image of each seismic profile is also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. The data archived here were collected under a cooperative agreement with the St. Johns River Water Management District as part of the USGS Lakes and Coastal Aquifers (LCA) Project. For further information about this study, refer to http://coastal.er.usgs.gov/stjohns, Kindinger and others (1994), and Kindinger and others (2000). The USGS Florida Integrated Science Center (FISC) - Coastal and Watershed Studies in St. Petersburg, Florida, assigns a unique identifier to each cruise or field activity. For example, 93LCA01 tells us the data were collected in 1993 for the Lakes and Coastal Aquifers (LCA) Project and the data were collected during the first field activity for that project in that calendar year. For a detailed description of the method used to assign the field activity ID, see http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html. 
The boomer is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. The transducer is towed on a sled at the sea surface and when discharged emits a short acoustic pulse, or shot, that propagates through the water and sediment column. The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by the receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (e.g., 0.5 s) and recorded for specific intervals of time (e.g., 100 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Acquisition geometry for 94LCA01 is recorded in the operations logbook. No logbook exists for 93LCA01. Table 1 displays acquisition parameters for both field activities. For more information about the acquisition equipment used, refer to the FACS equipment logs. The unprocessed seismic data are stored in SEG-Y format (Barry and others, 1975). For a detailed description of the data format, refer to the SEG-Y Format page. See the How To Download SEG-Y Data page for more information about these files. Processed profiles can be viewed as GIF images from the Profiles page. Refer to the Software page for details about the processing and examples of the processing scripts. Detailed information about the navigation systems used for each field activity can be found in Table 1 and the FACS equipment logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page. The original trace files were recorded in nonstandard ELICS format and later converted to standard SEG-Y format.
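The SEG-Y layout referenced above is a 3,200-byte card-image header followed by a 400-byte binary header of big-endian integer fields. As a minimal sketch (not the archive's own tooling), a few of the standard rev 0 binary-header fields can be pulled out like this:

```python
import struct

def read_segy_binary_header(data: bytes) -> dict:
    """Parse a few fields from the 400-byte SEG-Y rev 0 binary header
    that follows the 3,200-byte card-image header."""
    bh = data[3200:3600]  # the binary header
    # Fields are big-endian two-byte integers at standard byte offsets.
    sample_interval_us, = struct.unpack(">h", bh[16:18])  # file bytes 3217-3218
    samples_per_trace, = struct.unpack(">h", bh[20:22])   # file bytes 3221-3222
    format_code, = struct.unpack(">h", bh[24:26])         # file bytes 3225-3226
    return {
        "sample_interval_us": sample_interval_us,
        "samples_per_trace": samples_per_trace,
        "format_code": format_code,  # e.g., 1 = IBM float, 3 = 2-byte integer
    }
```

Full toolkits such as Seismic Unix handle the trace headers and data formats as well; this only illustrates the fixed-offset header convention.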
The original trace files for 94LCA01 lines ORJ127_1, ORJ127_3, and ORJ131_1 were divided into two or more trace files (e.g., ORJ127_1 became ORJ127_1a and ORJ127_1b) because the original total number of traces exceeded the maximum allowed by the processing system. Digital data were not recoverable for 93LCA
Ackermann, Mark R.
2006-01-01
The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically-sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a Pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time – calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699
ERIC Educational Resources Information Center
Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.
2011-01-01
Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
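TraceContract itself is a Scala DSL; purely to illustrate the underlying idea of checking a trace against a data-parameterized property, here is a toy Python sketch of the temporal property "every file that is opened is eventually closed" (the event shape and field names are invented for the example, not TraceContract's API):

```python
def check_open_close(trace):
    """Return violations of: every ('open', f) event is eventually
    followed by a matching ('close', f) event."""
    open_files = {}  # file name -> index of its unmatched 'open' event
    violations = []
    for i, event in enumerate(trace):
        kind, name = event["kind"], event["file"]
        if kind == "open":
            open_files[name] = i
        elif kind == "close":
            if name not in open_files:
                violations.append(f"event {i}: close of never-opened {name!r}")
            else:
                del open_files[name]
    # Anything still open at end of trace violates 'eventually closed'.
    for name, i in open_files.items():
        violations.append(f"event {i}: {name!r} opened but never closed")
    return violations
```

A real monitor framework would generalize this pattern to arbitrary state machines and temporal-logic formulas and would invoke reaction code on each violation.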
Code of Federal Regulations, 2013 CFR
2013-04-01
... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...
Code of Federal Regulations, 2012 CFR
2012-04-01
... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...
9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.
Code of Federal Regulations, 2010 CFR
2010-01-01
... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...
46 CFR 78.37-3 - Logbooks and records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...
46 CFR 131.610 - Logbooks and records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...
46 CFR 131.610 - Logbooks and records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...
9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.
Code of Federal Regulations, 2011 CFR
2011-01-01
... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...
46 CFR 78.37-3 - Logbooks and records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...
Code of Federal Regulations, 2014 CFR
2014-07-01
... test to generate a submission package file, which documents performance test data. You must then submit the file generated by the ERT through the EPA's Compliance and Emissions Data Reporting Interface (CEDRI), which can be accessed by logging in to the EPA's Central Data Exchange (CDX) (https://cdx.epa...
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.
2010-01-01
In June of 2007, the U.S. Geological Survey (USGS) conducted a geophysical survey offshore of the Chandeleur Islands, Louisiana, in cooperation with the Louisiana Department of Natural Resources (LDNR) as part of the USGS Barrier Island Comprehensive Monitoring (BICM) project. This project is part of a broader study focused on Subsidence and Coastal Change (SCC). The purpose of the study was to investigate the shallow geologic framework and monitor the environmental impacts of Hurricane Katrina (Louisiana landfall was on August 29, 2005) on the Gulf Coast's barrier island chains. This report serves as an archive of unprocessed digital 512i and 424 Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 07SCC01 tells us the data were collected in 2007 for the Subsidence and Coastal Change (SCC) study and the data were collected during the first field activity for that study in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). All Chirp systems use a signal of continuously varying frequency; the Chirp systems used during this survey produce high resolution, shallow penetration profile images beneath the seafloor. The towfish is a sound source and receiver, which is typically towed 1 - 2 m below the sea surface.
The acoustic energy is reflected at density boundaries (such as the seafloor or sediment layers beneath the seafloor), detected by a receiver, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.125 s) and recorded for specific intervals of time (for example, 50 ms). In this way, a two-dimensional vertical image of the shallow geologic structure beneath the ship track is produced. Figure 1 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters. See the digital FACS equipment log (11-KB PDF) for details about the acquisition equipment used. Table 2 lists trackline statistics. Scanned images of the handwritten FACS logs and handwritten science logbook (449-KB PDF) are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y rev 1 format (Norris and Faichney, 2002); ASCII character encoding is used for the first 3,200 bytes of the card image header instead of the SEG-Y rev 0 (Barry and others, 1975) EBCDIC format. The SEG-Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG-Y Data page for download instructions. The web version of this archive does not contain the SEG-Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software; refer to the Software page for links to example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992). The processed SEG-Y data were also exported to Chesapeake Technology, Inc. 
(CTI) SonarWeb software to produce an interactive version of the profile that allows the user to obtain a geographic location and depth from the profile for a given cursor position. This information is displayed in the status bar of the browser.
Remote vibration monitoring system using wireless internet data transfer
NASA Astrophysics Data System (ADS)
Lemke, John
2000-06-01
Vibrations from construction activities can affect infrastructure projects in several ways. Within the general vicinity of a construction site, vibrations can result in damage to existing structures, disturbance to people, damage to sensitive machinery, and degraded performance of precision instrumentation or motion sensitive equipment. Current practice for monitoring vibrations in the vicinity of construction sites commonly consists of measuring free field or structural motions using velocity transducers connected to a portable data acquisition unit via cables. This paper describes an innovative way to collect, process, transmit, and analyze vibration measurements obtained at construction sites. The system described measures vibration at the sensor location, performs necessary signal conditioning and digitization, and sends data to a Web server using wireless data transmission and Internet protocols. A Servlet program running on the Web server accepts the transmitted data and incorporates it into a project database. Two-way interaction between the Web client and the Web server is accomplished through the use of a Servlet program and a Java Applet running inside a browser located on the Web client's computer. Advantages of this system over conventional vibration data logging systems include continuous unattended monitoring, reduced costs associated with field data collection, instant access to data files and graphs by project team members, and the ability to remotely modify data sampling schemes.
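The kind of check such a monitoring system performs on each digitized record can be sketched in a few lines: compute the peak particle velocity of the record and flag exceedances of an allowed limit. This is an illustrative sketch only; the 5 mm/s threshold is an arbitrary example value, not one from the paper:

```python
def peak_particle_velocity(samples_mm_s):
    """Peak particle velocity: largest absolute value in a digitized
    velocity record (mm/s)."""
    return max(abs(v) for v in samples_mm_s)

def exceeds_limit(samples_mm_s, limit_mm_s=5.0):
    """Flag a record whose peak exceeds the allowed vibration limit.
    The default limit is a made-up example, not a regulatory value."""
    return peak_particle_velocity(samples_mm_s) > limit_mm_s
```

In the architecture described above, logic like this would run server-side on each payload received from the wireless sensor units, with flagged records surfaced to project team members.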
Advanced Technology Multiple Criteria Decision Model.
1981-11-01
ratings of the system parameters; and (3), HEADER which contains information on the structure of the problem and titles. Two supporting programs develop...in these files are given in Section V.2. 2. DATA STRUCTURE TABLES This section describes the data files used in the system selection model program ...the supporting program PPP and an input file to UPPP and SSMP. Figure 13 shows the structure of this file. b. User’s preference package (UPP) UPP is
Developing a Complete and Effective ACT-R Architecture
2008-01-01
of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an
Tewari, Rajendra K; Ali, Sajid; Mishra, Surendra K; Kumar, Ashok; Andrabi, Syed Mukhtar-Un-Nisar; Zoya, Asma; Alam, Sharique
2016-05-01
In the present study, the effectiveness of three rotary and two manual nickel titanium instrument systems on mechanical reduction of the intracanal Enterococcus faecalis population was evaluated. Mandibular premolars with straight roots were selected. Teeth were decoronated and instrumented up to a size 20 K file and irrigated with physiological saline. After sterilization by ethylene oxide gas, root canals were inoculated with Enterococcus faecalis. The specimens were randomly divided into five groups for canal instrumentation: manual Nitiflex and Hero Shaper nickel titanium files, and rotary Hyflex CM, ProTaper Next, and K3XF nickel titanium files. Intracanal bacterial sampling was done before and after instrumentation. After serial dilution, samples were plated onto Mitis Salivarius agar. The colony-forming units (c.f.u.) grown were counted, and the log10 transformation was calculated. All instrumentation systems significantly reduced the intracanal bacterial population after root canal preparation. ProTaper Next was found to be significantly more effective than Hyflex CM and manual Nitiflex and Hero Shaper. However, ProTaper Next showed no significant difference with K3XF. Canal instrumentation by all the file systems significantly reduced the intracanal Enterococcus faecalis counts. ProTaper Next was found to be more effective in reducing the number of bacteria than other rotary or hand instruments. © 2014 Wiley Publishing Asia Pty Ltd.
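The log10 transformation used in this kind of study is the standard way to express bacterial reduction; a small sketch with invented counts (not the study's data):

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log reduction = log10(before) - log10(after).
    A value of 3.0 corresponds to a 1,000-fold (99.9%) reduction
    in colony-forming units."""
    return math.log10(cfu_before) - math.log10(cfu_after)
```

For example, a drop from 10^6 to 10^3 c.f.u. is a 3-log reduction; comparing such values across instrument groups is what the statistical tests above operate on.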
Fast skin dose estimation system for interventional radiology
Takata, Takeshi; Kotoku, Jun’ichi; Maejima, Hideyuki; Kumagai, Shinobu; Arai, Norikazu; Kobayashi, Takenori; Shiraishi, Kenshiro; Yamamoto, Masayoshi; Kondo, Hiroshi; Furui, Shigeru
2018-01-01
Abstract To minimise the radiation dermatitis related to interventional radiology (IR), rapid and accurate dose estimation has been sought for all procedures. We propose a technique for estimating the patient skin dose rapidly and accurately using Monte Carlo (MC) simulation with a graphical processing unit (GPU, GTX 1080; Nvidia Corp.). The skin dose distribution is simulated based on an individual patient’s computed tomography (CT) dataset for fluoroscopic conditions after the CT dataset has been segmented into air, water and bone based on pixel values. The skin is assumed to be one layer at the outer surface of the body. Fluoroscopic conditions are obtained from a log file of a fluoroscopic examination. Estimating the absorbed skin dose distribution requires calibration of the dose simulated by our system. For this purpose, a linear function was used to approximate the relation between the simulated dose and the measured dose using radiophotoluminescence (RPL) glass dosimeters in a water-equivalent phantom. Differences of maximum skin dose between our system and the Particle and Heavy Ion Transport code System (PHITS) were as high as 6.1%. The relative statistical error (2 σ) for the simulated dose obtained using our system was ≤3.5%. Using a GPU, the simulation on the chest CT dataset aiming at the heart was within 3.49 s on average: the GPU is 122 times faster than a CPU (Core i7–7700K; Intel Corp.). Our system (using the GPU, the log file, and the CT dataset) estimated the skin dose more rapidly and more accurately than conventional methods. PMID:29136194
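The calibration step described above fits a line relating simulated dose to measured (RPL dosimeter) dose. A minimal sketch of that linear fit using closed-form ordinary least squares (the dose pairs in the test are invented, not the paper's data):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit ys ~ a*xs + b, e.g. a calibration line
    relating simulated dose to measured dose."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx          # slope
    b = my - a * mx        # intercept
    return a, b

def calibrate(simulated_dose, a, b):
    """Convert a simulated dose into a calibrated absorbed-dose estimate."""
    return a * simulated_dose + b
```

Once a and b are estimated from phantom measurements, every simulated skin-dose value can be mapped to an absorbed-dose estimate with a single multiply-add per voxel, which adds negligible time to the GPU pipeline.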
Techtalk: Telecommunications for Improving Developmental Education.
ERIC Educational Resources Information Center
Caverly, David C.; Broderick, Bill
1993-01-01
Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.
2014-01-01
From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.
Forde, Arnell S.; Dadisman, Shawn V.; Miselis, Jennifer L.; Flocks, James G.; Wiese, Dana S.
2013-01-01
From June 3 to 13, 2011, the U.S. Geological Survey conducted a geophysical survey to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, LA. This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided.
Adding Data Management Services to Parallel File Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, Scott
2015-03-04
The objective of this project, called DAMASC for “Data Management in Scientific Computing”, is to coalesce data management with parallel file system management to present a declarative interface to scientists for managing, querying, and analyzing extremely large data sets efficiently and predictably. Managing extremely large data sets is a key challenge of exascale computing. The overhead, energy, and cost of moving massive volumes of data demand designs where computation is close to storage. In current architectures, compute/analysis clusters access data in a physically separate parallel file system and largely leave it to the scientist to reduce data movement. Over the past decades, the high-end computing community has adopted middleware with multiple layers of abstractions and specialized file formats such as NetCDF-4 and HDF5. These abstractions provide a limited set of high-level data processing functions, but have inherent functionality and performance limitations: middleware that provides access to the highly structured contents of scientific data files stored in the (unstructured) file systems can only optimize to the extent that file system interfaces permit; the highly structured formats of these files often impede native file system performance optimizations. We are developing Damasc, an enhanced high-performance file system with native rich data management services. Damasc will enable efficient queries and updates over files stored in their native byte-stream format while retaining the inherent performance of file system data storage via declarative queries and updates over views of underlying files.
Damasc has four key benefits for the development of data-intensive scientific code: (1) applications can use important data-management services, such as declarative queries, views, and provenance tracking, that are currently available only within database systems; (2) the use of these services becomes easier, as they are provided within a familiar file-based ecosystem; (3) common optimizations, e.g., indexing and caching, are readily supported across several file formats, avoiding effort duplication; and (4) performance improves significantly, as data processing is integrated more tightly with data storage. Our key contributions are: SciHadoop, which explores changes to MapReduce assumptions by taking advantage of the semantics of structured data while preserving MapReduce’s failure and resource management; DataMods, which extends common abstractions of parallel file systems so they become programmable, such that they can be extended to natively support a variety of data models and can be hooked into emerging distributed runtimes such as Stanford’s Legion; and Miso, which combines Hadoop and relational data warehousing to minimize time to insight, taking into account the overhead of ingesting data into data warehousing.
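Damasc's declarative interface is its own design; purely to illustrate the flavor of "views plus declarative queries over file-resident data" that the project contrasts with hand-written byte scanning, here is a tiny SQLite sketch (the schema and values are invented):

```python
import sqlite3

# Load file-resident records into a table, expose a derived view,
# and query it declaratively instead of scanning bytes by hand.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (step INTEGER, temperature REAL)")
conn.executemany(
    "INSERT INTO samples VALUES (?, ?)",
    [(0, 290.0), (1, 301.5), (2, 315.2), (3, 299.9)],
)
# A view gives a named, reusable query over the underlying data.
conn.execute(
    "CREATE VIEW hot_steps AS SELECT step, temperature "
    "FROM samples WHERE temperature > 300"
)
hot = conn.execute("SELECT step FROM hot_steps ORDER BY step").fetchall()
```

The point of Damasc is to provide this style of access natively over scientific files in their original byte-stream formats, rather than requiring an ingest step into a separate database.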
46 CFR 196.35-3 - Logbooks and records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...
46 CFR 196.35-3 - Logbooks and records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...
46 CFR 35.07-5 - Logbooks and records-TB/ALL.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...
29 CFR 1960.28 - Employee reports of unsafe or unhealthful working conditions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... report of an existing or potential unsafe or unhealthful working condition should be recorded on a log maintained at the establishment. If an agency finds it inappropriate to maintain a log of written reports at... sequentially numbered case file, coded for identification, should be assigned for purposes of maintaining an...
20 CFR 658.422 - Handling of non-JS-related complaints by the Regional Administrator.
Code of Federal Regulations, 2011 CFR
2011-04-01
... non-JS-related complaints alleging violations of employment related laws shall be logged. The... which the complainant (or complaint) was referred on a complaint log, similar to the one described in § 658.410(c)(1). The appropriate regional official shall also prepare and keep the file specified in...
46 CFR 35.07-5 - Logbooks and records-TB/ALL.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz. Because the Institute focuses on low-level file systems and storage systems, its role in improving SciDAC systems was one of supporting application middleware such as data management and system-level performance tuning. In retrospect, the Petascale Data Storage Institute’s most innovative and impactful contribution is the Parallel Log-structured File System (PLFS). Published in SC09, PLFS is middleware that operates in MPI-IO or embedded in FUSE for non-MPI applications. Its function is to decouple concurrently written files into a per-process log file, whose impact (the contents of the single file that the parallel application was concurrently writing) is determined on later reading, rather than during its writing. PLFS is transparent to the parallel application, offering a POSIX or MPI-IO interface, and it shows an order of magnitude speedup to the Chombo benchmark and two orders of magnitude to the FLASH benchmark. Moreover, LANL production applications see speedups of 5X to 28X, so PLFS has been put into production at LANL.
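The decoupling described above can be pictured as per-writer append logs plus an index consulted at read time. The following toy Python model illustrates that idea only; it is not PLFS's actual on-disk format or code:

```python
class LogStructuredFile:
    """Toy model of PLFS-style decoupling: each writer appends to its
    own private log; an index records where each logical extent lives;
    reads resolve the index with later writes overriding earlier ones."""

    def __init__(self):
        self.logs = {}    # writer id -> bytearray (that writer's log)
        self.index = []   # (logical_offset, length, writer, log_offset)

    def write(self, writer, logical_offset, data):
        # Appending is always sequential within a writer's own log,
        # regardless of the logical offset being written.
        log = self.logs.setdefault(writer, bytearray())
        self.index.append((logical_offset, len(data), writer, len(log)))
        log.extend(data)

    def read(self, logical_offset, length):
        out = bytearray(length)
        # Replay index entries in write order so later writes win.
        for off, ln, writer, log_off in self.index:
            lo = max(off, logical_offset)
            hi = min(off + ln, logical_offset + length)
            if lo < hi:
                chunk = self.logs[writer][log_off + (lo - off):
                                          log_off + (hi - off)]
                out[lo - logical_offset:hi - logical_offset] = chunk
        return bytes(out)
```

The performance win comes from the write path: N processes writing interleaved regions of one shared file become N purely sequential log appends, and the cost of reassembly is deferred to the (much rarer) read.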
Originally conceived and prototyped in a PDSI collaboration between LANL and CMU, it has grown to engage many other PDSI institutes, international partners like AWE, and has a large team at EMC supporting and enhancing it. PLFS is open-sourced with a BSD license on SourceForge. Post-PDSI funding comes from NNSA and industry sources. Moreover, PLFS has spun out half a dozen or more papers, partnered on research with multiple schools and vendors, and has projects to transparently 1) distribute metadata over independent metadata servers, 2) exploit drastically non-POSIX Hadoop storage for HPC POSIX applications, 3) compress checkpoints on the fly, 4) batch delayed writes for write speed, 5) compress read-back indexes and parallelize their redistribution, 6) double-buffer writes in NAND Flash storage to decouple host blocking during checkpoint from disk write time in the storage system, 7) pack small files into a smaller number of bigger containers. There are two large-scale open source Linux software projects that PDSI significantly incubated, though neither was initiated in PDSI. These are 1) Ceph, a UCSC parallel object storage research project that has continued to be a vehicle for research and has become a released part of Linux, and 2) Parallel NFS (pNFS), a portion of the IETF’s NFSv4.1 that brings the core data parallelism found in Lustre, PanFS, PVFS, and Ceph to the industry standard NFS, with released code in Linux 3.0, and its vendor offerings, with products from NetApp, EMC, BlueArc and RedHat. Both are fundamentally supported and advanced by vendor companies now, but were critically transferred from research demonstration to viable product with funding from PDSI, in part. At this point Lustre remains the primary path to scalable IO in Exascale systems, but both Ceph and pNFS are viable alternatives with different fundamental advantages. Finally, research community building was a big success for PDSI.
Through the HECFSIO workshops and the HECURA project with NSF, PDSI stimulated and helped to steer leveraged funding of over $25M. Through the Petascale (now Parallel) Data Storage Workshop series, www.pdsw.org, colocated with SCxy each year, PDSI created and incubated five offerings of this high-attendance workshop. The workshop has gone on without PDSI support with two more highly successful workshops, rewriting its organizational structure to be community managed. More than 70 peer-reviewed papers have been presented at PDSW workshops.
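The per-process log decoupling that PLFS performs can be sketched in a few lines of Python. The class and method names below are illustrative, not PLFS's actual API, and the index scheme is greatly simplified: each writer appends to its own log and records where each write logically belongs; the shared file is reassembled only when read.

```python
# Minimal sketch of PLFS-style log-structured writes (illustrative names,
# not the actual PLFS interface): each process appends to its own log and
# records an index entry; the logical file is reassembled at read time.

class LogStructuredFile:
    def __init__(self):
        self.logs = {}    # per-process append-only data logs
        self.index = []   # (logical_offset, length, pid, log_offset)

    def write(self, pid, logical_offset, data):
        log = self.logs.setdefault(pid, bytearray())
        self.index.append((logical_offset, len(data), pid, len(log)))
        log.extend(data)  # append-only: no seek contention between writers

    def read(self, logical_offset, length):
        buf = bytearray(length)
        # entries are applied in write order, so later writes win,
        # mimicking overwrite semantics of a shared file
        for off, ln, pid, log_off in self.index:
            lo = max(off, logical_offset)
            hi = min(off + ln, logical_offset + length)
            if lo < hi:
                src = self.logs[pid][log_off + (lo - off):log_off + (hi - off)]
                buf[lo - logical_offset:hi - logical_offset] = src
        return bytes(buf)

f = LogStructuredFile()
f.write(1, 0, b"aaaa")   # process 1 writes the first region
f.write(2, 4, b"bbbb")   # process 2 writes the second, concurrently
f.write(1, 2, b"cc")     # process 1 overwrites the middle
assert f.read(0, 8) == b"aaccbbbb"
```

The key property, visible even at this scale, is that writes never contend on a shared offset; the cost is paid at read time by walking the index.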
Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Torres, Aaron
2015-02-03
Techniques are provided for storing files in a parallel computing system using sub-files with semantically meaningful boundaries. A method is provided for storing at least one file generated by a distributed application in a parallel computing system. The file comprises one or more of a complete file and a plurality of sub-files. The method comprises the steps of obtaining a user specification of semantic information related to the file; providing the semantic information as a data structure description to a data formatting library write function; and storing the semantic information related to the file with one or more of the sub-files in one or more storage nodes of the parallel computing system. The semantic information provides a description of data in the file. The sub-files can be replicated based on semantically meaningful boundaries.
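The sub-file storage scheme described above can be illustrated with a small sketch. The function, field names, and fixed-size-record boundary are invented for illustration and are not the patented system's interface; the point is that each sub-file is stored together with a description of its data.

```python
# Hypothetical sketch of storing sub-files with semantic descriptions
# (names invented): split on a semantically meaningful boundary (here a
# fixed record size) and keep the semantic info alongside each sub-file.

def store_with_semantics(data, record_size, storage_nodes):
    """Split `data` on record boundaries and store each sub-file, with
    its semantic description, round-robin across storage nodes."""
    sub_files = [data[i:i + record_size] for i in range(0, len(data), record_size)]
    for n, chunk in enumerate(sub_files):
        node = storage_nodes[n % len(storage_nodes)]
        node.append({
            "sub_file": chunk,
            "semantics": {"record_size": record_size, "record_index": n},
        })
    return len(sub_files)

nodes = [[], []]  # two stand-in storage nodes
count = store_with_semantics(b"abcdefgh", record_size=4, storage_nodes=nodes)
assert count == 2 and nodes[0][0]["sub_file"] == b"abcd"
```

Because each sub-file carries its own description, a reader (or a replication policy) can act on semantically whole units rather than arbitrary byte ranges.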
Analysis of the access patterns at GSFC distributed active archive center
NASA Technical Reports Server (NTRS)
Johnson, Theodore; Bedet, Jean-Jacques
1996-01-01
The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion, and distribution. In addition, several log files which record transactions on UniTree are maintained and periodically examined. This study identifies some of the users' requests and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) system UniTree. The results show that most of the data volume ordered was for two data products. The volume was also mostly made up of level 3 and 4 data, and most of the volume was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America, although there was significant worldwide use. There was a wide range of request sizes in terms of volume and number of files ordered; on average, 78.6 files were ordered per request. Using the data managed by UniTree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2-bin was found to be the best for this workload, but the STbin algorithm also worked well.
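The caching evaluation described above can be reproduced in miniature with a plain LRU policy; the access trace below is invented, and the report's LRU/2-bin and STbin variants are not shown, only the hit-rate bookkeeping they share.

```python
# Replay a file-access trace through a plain LRU cache and measure hit
# rate, the metric used to compare caching policies in the study.
from collections import OrderedDict

def lru_hit_rate(trace, capacity):
    """Return the fraction of accesses served from an LRU cache."""
    cache = OrderedDict()
    hits = 0
    for f in trace:
        if f in cache:
            hits += 1
            cache.move_to_end(f)           # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[f] = True
    return hits / len(trace)

# Invented trace: two popular products ("p1", "p2") plus one-off files
trace = ["p1", "p2", "p1", "f3", "p2", "p1", "f4", "p2"]
print(lru_hit_rate(trace, capacity=2))
```

The same harness, with the eviction policy swapped out, is enough to compare policies on a real request log.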
Real World Experience With Ion Implant Fault Detection at Freescale Semiconductor
NASA Astrophysics Data System (ADS)
Sing, David C.; Breeden, Terry; Fakhreddine, Hassan; Gladwin, Steven; Locke, Jason; McHugh, Jim; Rendon, Michael
2006-11-01
The Freescale automatic fault detection and classification (FDC) system has logged data from over 3.5 million implants in the past two years. The Freescale FDC system is a low-cost system which collects summary implant statistics at the conclusion of each implant run. The data is collected by either downloading implant data log files from the implant tool workstation, or by exporting summary implant statistics through the tool's automation interface. Compared to the traditional FDC systems which gather trace data from sensors on the tool as the implant proceeds, the Freescale FDC system cannot prevent scrap when a fault initially occurs, since the data is collected after the implant concludes. However, the system can prevent catastrophic scrap events due to faults which are not detected for days or weeks, leading to the loss of hundreds or thousands of wafers. At the Freescale ATMC facility, the practical applications of the FDC system fall into two categories: PM trigger rules which monitor tool signals such as ion gauges and charge control signals, and scrap prevention rules which are designed to detect specific failure modes that have been correlated to yield loss and scrap. PM trigger rules are designed to detect shifts in tool signals which indicate normal aging of tool systems. For example, charging parameters gradually shift as flood gun assemblies age, and when charge control rules start to fail a flood gun PM is performed. Scrap prevention rules are deployed to detect events such as particle bursts and excessive beam noise, events which have been correlated to yield loss. The FDC system does have tool log-down capability, and scrap prevention rules often use this capability to automatically log the tool into a maintenance state while simultaneously paging the sustaining technician for data review and disposition of the affected product.
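A rule check of the kind described above reduces to threshold tests over per-run summary statistics. The parameter names and limits below are invented, not Freescale's actual FDC schema; they only mirror the split between PM-trigger rules and scrap-prevention rules.

```python
# Hypothetical post-implant rule check (field names and limits invented):
# PM-trigger rules watch signals that drift with normal tool aging;
# scrap-prevention rules catch events correlated with yield loss and
# would hold the tool down while paging a technician.

PM_TRIGGER_LIMITS = {"flood_gun_current_mA": 35.0}
SCRAP_LIMITS = {"particle_burst_count": 5, "beam_noise_pct": 2.0}

def check_run(stats):
    """Return a list of (action, signal) alarms for one implant run."""
    alarms = []
    for key, limit in PM_TRIGGER_LIMITS.items():
        if stats.get(key, 0) > limit:
            alarms.append(("pm_trigger", key))       # schedule maintenance
    for key, limit in SCRAP_LIMITS.items():
        if stats.get(key, 0) > limit:
            alarms.append(("log_tool_down", key))    # hold tool, page technician
    return alarms

run = {"flood_gun_current_mA": 36.2, "particle_burst_count": 1, "beam_noise_pct": 3.1}
print(check_run(run))
```

Because the check runs on end-of-run summaries rather than live trace data, it matches the article's trade-off: it cannot stop the first faulty run, but it stops the fault from going unnoticed for weeks.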
The Wettzell System Monitoring Concept and First Realizations
NASA Technical Reports Server (NTRS)
Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher
2010-01-01
Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software, SysMon, which is based on a reliable, remotely-controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external, serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital-conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell to address safety issues and at the VLBI station O'Higgins as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.
2001-11-01
that there were no target misses. The Hellfire missile does not have a depleted uranium head. 2.2.2.3 Tank movement During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log
Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona
2006-01-01
Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. 
Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972
D'Amato, A.W.; Fraver, S.; Palik, B.J.; Bradford, J.B.; Patty, L.
2011-01-01
The role of disturbance in structuring vegetation is widely recognized; however, we are only beginning to understand the effects of multiple interacting disturbances on ecosystem recovery and development. Of particular interest is the impact of post-disturbance management interventions, particularly in light of the global controversy surrounding the effects of salvage logging on forest ecosystem recovery. Studies of salvage logging impacts have focused on the effects of post-disturbance salvage logging within the context of a single natural disturbance event. There have been no formal evaluations of how these effects may differ when followed in short sequence by a second, high severity natural disturbance. To evaluate the impact of this management practice within the context of multiple disturbances, we examined the structural and woody plant community responses of sub-boreal Pinus banksiana systems to a rapid sequence of disturbances. Specifically, we compared responses to Blowdown (B), Fire (F), Blowdown-Fire (BF), and Blowdown-Salvage-Fire (BSF) and compared these to undisturbed control (C) stands. Comparisons between BF and BSF indicated that the primary effect of salvage logging was a decrease in the abundance of structural legacies, such as downed woody debris and snags. Both of these compound disturbance sequences (BF and BSF) resulted in similar woody plant communities, largely dominated by Populus tremuloides; however, there was greater homogeneity in community composition in salvage logged areas. Areas experiencing solely fire (F stands) were dominated by P. banksiana regeneration, and blowdown areas (B stands) were largely characterized by regeneration from shade tolerant conifer species. Our results suggest that salvage logging impacts on woody plant communities are diminished when followed by a second high severity disturbance; however, impacts on structural legacies persist.
Provisions for the retention of snags, downed logs, and surviving trees as part of salvage logging operations will minimize these structural impacts and may allow for greater ecosystem recovery following these disturbance combinations. © 2011 Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.
Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system of system simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes (project/directories) BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.
25 CFR 215.23 - Cooperation between superintendent and district mining supervisor.
Code of Federal Regulations, 2011 CFR
2011-04-01
... notices, reports, drill logs, maps, and records, and all other information relating to mining operations required by said regulations to be submitted by lessees, and shall maintain a file thereof for the superintendent. (b) The files of the Geological Survey supervisor relating to lead and zinc leases of Quapaw...
Agentless Cloud-Wide Monitoring of Virtual Disk State
2015-10-01
packages include Apache, MySQL, PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL, PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many
Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).
1985-01-01
QUEUE_BASE, LAST_KEY (QUEUE_NAME), LAST_RELATION (QUEUE_NAME), FILE_NODE, FORM, ATTRIBUTES, ACCESS_CONTROL, LEVEL); CLOSE (QUEUE_BASE); CLOSE (FILE_NODE...PROPOSED MIL-STD-CAIS 31 JANUARY 1985 procedure ITERATE (ITERATOR: out NODE_ITERATOR; NAME: NAME_STRING; KIND: NODE_KIND; KEY: RELATIONSHIP_KEY PATTERN
Midwest Consortium for Wind Turbine Reliability and Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott R. Dana; Douglas E. Adams; Noah J. Myrent
2012-05-11
This report provides an overview of the efforts aimed at establishing a student-focused laboratory apparatus that will enhance Purdue's ability to recruit and train students in topics related to the dynamics, operations and economics of wind turbines. The project also aims to facilitate outreach to students at Purdue and in grades K-12 in the State of Indiana by sharing wind turbine operational data. For this project, a portable wind turbine test apparatus was developed and fabricated utilizing an AirX 400W wind energy converter. This turbine and test apparatus were outfitted with an array of sensors used to monitor wind speed, turbine rotor speed, power output, and the tower structural dynamics. A major portion of this project included the development of a data logging program used to display real-time sensor data and the recording and creation of output files for data post-processing. The apparatus was tested in an open field to subject the turbine to typical operating conditions, and the data acquisition system was adjusted to obtain the desired functionality to facilitate use for student projects in existing courses offered at Purdue University and Indiana University. Data collected using the data logging program is analyzed and presented to demonstrate the usefulness of the test apparatus related to wind turbine dynamics and operations.
47 CFR 76.1704 - Proof-of-performance test data.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...
49 CFR Appendix A to Part 225 - Schedule of Civil Penalties 1
Code of Federal Regulations, 2010 CFR
2010-10-01
... $1,000 $2,000 225.11Reports of accidents/ incidents 2,500 5,000 225.12(a): Failure to file Railroad... noncompliance: (1) a missing or incomplete log entry for a particular employee's injury or illness; or (2) a missing or incomplete log record for a particular rail equipment accident or incident. Each day a...
47 CFR 76.1704 - Proof-of-performance test data.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...
ERIC Educational Resources Information Center
Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.
2007-01-01
In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…
Consistency of Students' Pace in Online Learning
ERIC Educational Resources Information Center
Hershkovitz, Arnon; Nachmias, Rafi
2009-01-01
The purpose of this study is to investigate the consistency of students' behavior regarding their pace of actions over sessions within an online course. Pace in a session is defined as the number of logged actions divided by session length (in minutes). Log files of 6,112 students were collected, and datasets were constructed for examining pace…
Honda, Masayuki; Matsumoto, Takehiro
2017-01-01
Several kinds of event log data produced in daily clinical activities have yet to be used for secure and efficient improvement of hospital activities. Data Warehouse systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data and non-structured data analysis via Natural Language Processing.
Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.
2007-01-01
In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.
2007-01-01
In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.
2011-01-01
In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.
Vucicevic, J; Popovic, M; Nikolic, K; Filipic, S; Obradovic, D; Agbaba, D
2017-03-01
For this study, 31 compounds, including 16 imidazoline/α-adrenergic receptor (IRs/α-ARs) ligands and 15 central nervous system (CNS) drugs, were characterized in terms of the retention factors (k) obtained using biopartitioning micellar and classical reversed phase chromatography (log k_BMC and log k_wRP, respectively). Based on the retention factor (log k_wRP) and the slope of the linear curve (S), the isocratic parameter (φ0) was calculated. The obtained retention factors were correlated with experimental log BB values for the group of examined compounds. High correlations were obtained between the logarithm of the biopartitioning micellar chromatography (BMC) retention factor and effective permeability (r(log k_BMC/log BB): 0.77), while for the RP-HPLC system the correlations were lower (r(log k_wRP/log BB): 0.58; r(S/log BB): -0.50; r(φ0/P_e): 0.61). Based on the log k_BMC retention data and calculated molecular parameters of the examined compounds, quantitative structure-permeability relationship (QSPR) models were developed using partial least squares, stepwise multiple linear regression, support vector machine and artificial neural network methodologies. A high degree of structural diversity of the analysed IRs/α-ARs ligands and CNS drugs provides a wide applicability domain of the QSPR models for estimation of blood-brain barrier penetration of related compounds.
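The reported r values are ordinary correlation coefficients between retention data and log BB. A Pearson correlation sketch, with invented retention and permeability values standing in for the measured ones:

```python
# Pearson correlation coefficient, the r statistic quoted in the abstract.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson r between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: retention factors (log k_BMC) vs. permeability (log BB)
log_k_bmc = [0.2, 0.5, 0.9, 1.1, 1.4]
log_bb = [-0.8, -0.3, 0.1, 0.2, 0.6]
print(round(pearson_r(log_k_bmc, log_bb), 2))
```

An r near 0.77, as reported for log k_BMC against log BB, indicates that the micellar retention factor alone explains much, but not all, of the variance in brain penetration.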
A Kinect-based system for automatic recording of some pigeon behaviors.
Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M
2015-12-01
Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results allow us to demonstrate that an advantage of the Kinect-based approach is that it can also be reliably used to measure activity other than key pecking.
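The sensitivity and precision figures quoted above follow from the standard definitions (sensitivity = TP/(TP+FN), precision = TP/(TP+FP)); the detection counts below are invented for illustration and are not the study's raw data.

```python
# Standard detection metrics used to evaluate BehaviorWatch against the
# contact-switch ground truth (counts below are invented).

def sensitivity_precision(tp, fn, fp):
    """Sensitivity (recall) = TP/(TP+FN); precision = TP/(TP+FP)."""
    return tp / (tp + fn), tp / (tp + fp)

# Invented counts: 95 detected pecks out of 100 real ones, 9 false detections
sens, prec = sensitivity_precision(tp=95, fn=5, fp=9)
print(round(sens, 2), round(prec, 2))
```

With these counts the metrics come out near the article's reported 95% sensitivity and 91% precision for key pecking.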
Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin
2006-01-01
Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
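The simulator's run-until-empty event loop can be sketched as a priority queue of timestamped events whose handlers may schedule follow-on events; the names below are illustrative, not the tool's actual interface, and the mode-transition example is simplified to a single delayed event.

```python
# Minimal discrete-event loop (illustrative names): pop time-ordered
# events until the queue empties; each handler may schedule follow-ons.
import heapq

def simulate(initial_events, log):
    """Run events in time order; handlers return follow-on events."""
    queue = list(initial_events)      # entries: (time, name, handler)
    heapq.heapify(queue)
    while queue:                      # run until the event queue is emptied
        t, name, handler = heapq.heappop(queue)
        log.append((t, name))         # the analysis step reads this log
        for follow_on in handler(t):
            heapq.heappush(queue, follow_on)

def pump_on(t):
    # mode transition: schedule the corresponding "off" event after a delay
    return [(t + 5.0, "pump_off", lambda t2: [])]

log = []
simulate([(0.0, "pump_on", pump_on)], log)
print(log)
```

The log produced by the loop is exactly the kind of artifact the patent's experimentation and analysis module consumes for comparison across runs.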
Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance
2012-03-01
2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the
The Galley Parallel File System
NASA Technical Reports Server (NTRS)
Nieuwejaar, Nils; Kotz, David
1996-01-01
As the I/O needs of parallel scientific applications increase, file systems for multiprocessors are being designed to provide applications with parallel access to multiple disks. Many parallel file systems present applications with a conventional Unix-like interface that allows the application to access multiple disks transparently. The interface conceals the parallelism within the file system, which increases the ease of programmability, but makes it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. Furthermore, most current parallel file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic parallel workloads. We discuss Galley's file structure and application interface, as well as an application that has been implemented using that interface.
Forde, Arnell S.; Flocks, James G.; Kindinger, Jack G.; Bernier, Julie C.; Kelso, Kyle W.; Wiese, Dana S.
2015-01-01
From August 13-23, 2013, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE), conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport offshore of Petit Bois Island, Mississippi. This investigation is part of a broader USGS study on Coastal Change and Transport (CCT). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP) with partial funding provided by the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided.
Barry, K.M.; Cavers, D.A.; Kneale, C.W.
2011-01-01
In July and September of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, MS, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. This project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the sub-bottom profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.
2013-06-01
collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms
Integrated system for well-to-well correlation with geological knowledge base
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saito, K.; Doi, E.; Uchiyama, T.
1987-05-01
A task of well-to-well correlation is an essential part of the reservoir description study. Since the task involves diverse data such as logs, dipmeter, seismic, and reservoir engineering, a system with simultaneous access to such data is desirable. A system is developed to aid stratigraphic correlation under a Xerox 1108 workstation, written in INTERLISP-D. The system uses log, dipmeter, seismic, and computer-processed results such as Litho-Analysis and LSA (Log Shape Analyzer). The system first defines zones, which are segmentations of log data into consistent layers, using Litho-Analysis and LSA results. Each zone is defined as a minimum unit for correlation with slot values of lithology, thickness, log values, and log shape such as bell, cylinder, and funnel. Using a user's input of local geological knowledge such as depositional environment, the system selects marker beds and performs correlation among the wells chosen from the base map. Correlation is performed first with markers and then with sandstones of lesser lateral extent. Structural dip and seismic horizon are guides for seeking a correlatable event. Knowledge of sand body geometry, such as the ratio of thickness to width, is also used to provide a guide on how far a correlation should be made. Correlation results produced by the system are displayed on the screen for the user to examine and modify. The system has been tested with data sets from several depositional settings and has been shown to be a useful tool for correlation work. The results are stored as a database for structural mapping and reservoir engineering study.
32 CFR 776.80 - Initial screening and Rules Counsel.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Director, JA Division, HQMC, to JAR. (b) JAG(13) and JAR shall log all complaints received and will ensure... within 30 days of the date of its return, the Rules Counsel may close the file without further action... action to close the file. (2) Complaints that comply with the requirements shall be further reviewed by...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wijesooriya, K; Seitter, K; Desai, V
Purpose: To present our single-institution experience of catching errors with trajectory log file analysis. The reported causes of failures, probability of occurrence (O), severity of effects (S), and the probability of the failures being undetected (D) could be added to guidelines for FMEA analysis. Methods: From March 2013 to March 2014, 19,569 patient treatment fields/arcs were analyzed. This work includes checking all 131 treatment delivery parameters for all patients, all treatment sites, and all treatment delivery fractions. TrueBeam trajectory log files for all treatment field types as well as all imaging types were accessed, read at every 20 ms, and every control point (a total of 37 million parameters) compared to the physician-approved plan in the planning system. Results: Couch angle outliers: N = 327, range = −1.7 – 1.2 deg; gantry angle outliers: N = 59, range = 0.09 – 5.61 deg; collimator angle outliers: N = 13, range = −0.2 – 0.2 deg. VMAT cases have slightly larger variations in mechanical parameters. MLC: 3D single-control-point fields have a maximum deviation of 0.04 mm, 39 step-and-shoot IMRT cases have MLC deviations of −0.3 – 0.5 mm, and all (1,286) VMAT cases have deviations of −0.9 – 0.7 mm. Two possibly serious errors were found: 1) a 4 cm isocenter shift for the PA beam of an AP-PA pair, under-dosing a portion of the PTV by 25%; 2) delivery with MLC leaves abutted behind the jaws rather than at the midline as planned, under-dosing a small volume of the PTV by 25% in the boost plan alone. Due to their origin, neither of these errors could have been detected by pre-treatment verification. Conclusion: Performing trajectory log file analysis can catch typically undetected errors and avoid potentially adverse incidents.
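The core of such an analysis is a per-parameter comparison of logged actual values against the planned values, checked against a tolerance table. A minimal sketch follows; the axis names and tolerance values are hypothetical illustrations, not the abstract's actual parameters or Varian's log format.

```python
# Hypothetical per-axis tolerances (illustrative values only)
TOLERANCES = {"gantry_deg": 1.0, "collimator_deg": 0.5,
              "couch_deg": 0.5, "mlc_mm": 0.5}

def find_outliers(samples, tolerances=TOLERANCES):
    """Compare each sampled (axis, expected, actual) triple against
    a per-axis tolerance and return the violating deviations."""
    outliers = []
    for axis, expected, actual in samples:
        deviation = actual - expected
        if abs(deviation) > tolerances[axis]:
            outliers.append((axis, round(deviation, 3)))
    return outliers

samples = [("gantry_deg", 180.0, 180.2),  # within tolerance
           ("couch_deg", 0.0, -1.7),      # outlier
           ("mlc_mm", 25.0, 25.9)]        # outlier
print(find_outliers(samples))  # [('couch_deg', -1.7), ('mlc_mm', 0.9)]
```

Run over every control point of every fraction, a loop like this is what makes post-delivery checks catch errors that pre-treatment verification misses.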
Forde, Arnell S.; Miselis, Jennifer L.; Wiese, Dana S.
2014-01-01
From July 23 - 31, 2012, the U.S. Geological Survey conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, La. (figure 1). This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Abbreviations page for expansions of acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 12BIM03 tells us the data were collected in 2012 during the third field activity for that project in that calendar year and BIM is a generic code, which represents efforts related to Barrier Island Mapping. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. All chirp systems use a signal of continuously varying frequency; the EdgeTech SB-424 system used during this survey produces high-resolution, shallow-penetration (typically less than 50 milliseconds (ms)) profile images of sub-seafloor stratigraphy. The towfish contains a transducer that transmits and receives acoustic energy and is typically towed 1 - 2 m below the sea surface. 
As transmitted acoustic energy intersects density boundaries, such as the seafloor or sub-surface sediment layers, energy is reflected back toward the transducer, received, and recorded by a PC-based seismic acquisition system. This process is repeated at regular time intervals (for example, 0.125 seconds (s)) and returned energy is recorded for a specific duration (for example, 50 ms). In this way, a two-dimensional (2-D) vertical image of the shallow geologic structure beneath the ship track is produced. Figure 2 displays the acquisition geometry. Refer to table 1 for a summary of acquisition parameters and table 2 for trackline statistics. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y rev. 0 format (Barry and others, 1975); the first 3,200 bytes of the card image header are in ASCII format instead of EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2010). See the How To Download SEG Y Data page for download instructions. The web version of this archive does not contain the SEG Y trace files. These files are very large and would require extremely long download times. To obtain the complete DVD archive, contact USGS Information Services at 1-888-ASK-USGS or infoservices@usgs.gov. The printable profiles provided here are GIF images that were processed and gained using SU software and can be viewed from the Profiles page or from links located on the trackline maps; refer to the Software page for links to example SU processing scripts. The SEG Y files are available on the DVD version of this report or on the Web, downloadable via the USGS Coastal and Marine Geoscience Data System (http://cmgds.marine.usgs.gov). The data are also available for viewing using GeoMapApp (http://www.geomapapp.org) and Virtual Ocean (http://www.virtualocean.org) multi-platform open source software. 
Detailed information about the navigation system used can be found in table 1 and the Field Activity Collection System (FACS) logs. To view the trackline maps and navigation files, and for more information about these items, see the Navigation page.
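Because this archive stores the first 3,200 bytes of the card-image header in ASCII rather than the EBCDIC that SEG Y rev. 0 readers usually expect, a quick encoding check before processing can prevent garbled headers. A minimal sketch, assuming the common convention that each card line begins with the letter "C" (the single-byte heuristic is an assumption, not part of the report):

```python
def textual_header_encoding(header_bytes):
    """Guess whether a 3,200-byte SEG Y textual header is ASCII or
    EBCDIC: 'C' is 0xC3 in EBCDIC but 0x43 in ASCII."""
    if len(header_bytes) != 3200:
        raise ValueError("SEG Y textual header must be 3,200 bytes")
    if header_bytes[0] == 0xC3:
        return "EBCDIC"
    if header_bytes[0] == 0x43:
        return "ASCII"
    return "unknown"

# A minimal ASCII card-image header: 40 cards of 80 characters each
ascii_header = ("C 1 CLIENT USGS".ljust(80) + " " * 80 * 39).encode("ascii")
print(textual_header_encoding(ascii_header))  # ASCII
```

Tools that assume EBCDIC can usually be told to skip the conversion once the check reports ASCII; Seismic Unix and most commercial packages handle both.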
SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsuta, Y; Shimizu, E; Matsunaga, K
2014-06-01
Purpose: A successful VMAT plan delivery requires precise modulation of dose rate, gantry rotation, and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze, because they vary with the MU delivered and the leaf number. In this study, we calculated an integrated fluence error image (IFEI) from log files and evaluated plan quality in the areas scanned by all MLC leaves and by individual leaves. Methods: The log file reported the expected and actual positions of the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on an Elekta Synergy. These data were imported into in-house software developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and the dose fraction at each time. The IFEI was obtained by summing the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) of the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5, and 3 MU were 98.5, 86.7, and 68.1%, and 95% of the area was covered with an error of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms values were 2.1 – 3.0 and 3.1 – 4.0 MU, the areas with errors below 10, 5, and 3 MU were 97.6 – 99.5, 81.7 – 89.5, and 51.2 – 72.8%, and 95% of the area was covered with an error of less than 6.6 – 8.2 MU. Conclusion: Analysis of the IFEI reconstituted from log files provided detailed information about the delivery in the areas scanned by all and by individual MLC leaves.
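The IFEI construction, accumulating the absolute difference between expected and actual fluence over all time samples and then summarizing the error area, can be sketched as follows; the image is flattened to a small list and the statistics are illustrative, not the authors' implementation.

```python
from math import sqrt

def integrated_fluence_error(expected_frames, actual_frames):
    """Accumulate |expected - actual| fluence over all time samples
    into a flat integrated fluence error image (IFEI), then report
    the simple area statistics used in the abstract."""
    n = len(expected_frames[0])
    ifei = [0.0] * n
    for exp_frame, act_frame in zip(expected_frames, actual_frames):
        for i, (e, a) in enumerate(zip(exp_frame, act_frame)):
            ifei[i] += abs(e - a)
    stats = {
        "mean": sum(ifei) / n,
        "rms": sqrt(sum(v * v for v in ifei) / n),
        "pct_below_3MU": 100.0 * sum(1 for v in ifei if v < 3.0) / n,
    }
    return ifei, stats

# Two time samples over a 4-pixel fluence image (values in MU)
expected = [[1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0]]
actual   = [[1.5, 2.0, 3.0, 2.0], [1.5, 2.0, 3.0, 4.0]]
ifei, stats = integrated_fluence_error(expected, actual)
print(ifei)           # [1.0, 0.0, 0.0, 2.0]
print(stats["mean"])  # 0.75
```

In the real analysis each frame would be a full 2-D fluence image built from the opposing leaf trajectories and the per-sample dose fraction.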
Interactive visualization tools for the structural biologist.
Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M
2013-10-01
In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Import Inspection Division, is on file at the import inspection facility where the inspection is to be... stamping log containing the following information for each lot of product: the date of inspection, the... container marks, and the MP-410 number covering the product to be inspected. The daily stamping log must be...
The Galley Parallel File System
NASA Technical Reports Server (NTRS)
Nieuwejaar, Nils; Kotz, David
1996-01-01
Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.
Request queues for interactive clients in a shared file system of a parallel computing system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bent, John M.; Faibish, Sorin
Interactive requests are processed from users of log-in nodes. A metadata server node is provided for use in a file system shared by one or more interactive nodes and one or more batch nodes. The interactive nodes comprise interactive clients to execute interactive tasks and the batch nodes execute batch jobs for one or more batch clients. The metadata server node comprises a virtual machine monitor; an interactive client proxy to store metadata requests from the interactive clients in an interactive client queue; a batch client proxy to store metadata requests from the batch clients in a batch client queue; and a metadata server to store the metadata requests from the interactive client queue and the batch client queue in a metadata queue based on an allocation of resources by the virtual machine monitor. The metadata requests can be prioritized, for example, based on one or more of a predefined policy and predefined rules.
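The prioritization described above can be sketched with a single priority queue that serves interactive requests ahead of batch requests; the class below is a simplified stand-in for illustration, not the patented mechanism (which allocates resources through a virtual machine monitor).

```python
import heapq

class MetadataQueue:
    """Merge requests from an interactive queue and a batch queue
    into one metadata queue, serving interactive requests first."""
    INTERACTIVE, BATCH = 0, 1  # lower value = served first

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserves FIFO order per class

    def submit(self, request, interactive):
        prio = self.INTERACTIVE if interactive else self.BATCH
        heapq.heappush(self._heap, (prio, self._seq, request))
        self._seq += 1

    def next_request(self):
        return heapq.heappop(self._heap)[2]

q = MetadataQueue()
q.submit("batch: stat /scratch/job1", interactive=False)
q.submit("interactive: ls /home/user", interactive=True)
print(q.next_request())  # interactive request is served first
```

A policy-driven version would replace the fixed two-level priority with weights derived from the virtual machine monitor's resource allocation.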
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galbraith, R.M.
1978-05-01
The Coso Geothermal Exploration Hole number one (CGEH-1) was drilled in the Coso Hot Springs KGRA, California, from September 2 to December 2, 1977. Chip samples were collected at ten-foot intervals and extensive geophysical logging surveys were conducted to document the geologic character of the geothermal system as penetrated by CGEH-1. The major rock units encountered include a mafic metamorphic sequence and a leucogranite which intruded the metamorphic rocks. Only weak hydrothermal alteration was noted in these rocks. Drillhole surveys and drilling rate data indicate that the geothermal system is structurally controlled and that the drillhole itself was strongly influenced by structural zones. Water chemistry indicates that this geothermal resource is a hot-water rather than a vapor-dominated system. Several geophysical logs were employed to characterize the drillhole geology. The natural gamma and neutron porosity logs indicate gross rock type and the acoustic logs indicate fractured rock and potentially permeable zones. A series of temperature logs run as a function of time during and after the completion of drilling were most useful in delineating the zones of maximum heat flux. Convective heat flow and temperatures greater than 350 °F appear to occur only along an open fracture system encountered between depths of 1,850 and 2,775 feet. Temperature logs indicate a negative thermal gradient below 3,000 feet.
Dataset for forensic analysis of B-tree file system.
Wani, Mohamad Ahtisham; Bhat, Wasim Ahmad
2018-06-01
Since the B-tree file system (Btrfs) is set to become the de facto standard file system on Linux (and Linux-based) operating systems, a Btrfs dataset for forensic analysis is of great interest and immense value to the forensic community. This article presents a novel dataset for forensic analysis of Btrfs that was collected using a proposed data-recovery procedure. The dataset identifies various generalized and common file system layouts and operations, specific node-balancing mechanisms triggered, logical addresses of various data structures, on-disk records, recovered data as directory entries and extent data from leaf and internal nodes, and the percentage of data recovered.
Ground-water data for the Hanna and Carbon basins, south-central Wyoming, through 1980
Daddow, P.B.
1986-01-01
Groundwater resources in the Hanna and Carbon Basins of Wyoming were assessed in a study from 1974 through 1980 because of the development of coal mining in the area. Data collected from 105 wells during that study, including well-completion records, lithologic logs, and water levels, are presented. The data are from stock wells and coal-test holes completed as observation wells by the U.S. Geological Survey. The data are mostly from the mined coal-bearing formations: the Tertiary Hanna Formation and the Tertiary and Cretaceous Ferris Formation. Well-completion data and lithologic logs were collected on-site during drilling of the wells or from U.S. Geological Survey files, company records, Wyoming State Engineer well-permit files, and published reports. (USGS)
VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)
NASA Astrophysics Data System (ADS)
Association of Universities For Research in Astronomy
2018-01-01
This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).
Ontology based log content extraction engine for a posteriori security control.
Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou
2012-01-01
In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the data needed to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the Healthcare Enterprise-Audit Trail and Node Authentication) as the log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.
2007-01-01
In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.
2009-01-01
In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.
Rizvi, Sanam Shahla; Chung, Tae-Sun
2010-01-01
Flash memory has become a more widespread storage medium for modern wireless devices because of its effective characteristics: non-volatility, small size, light weight, fast access speed, shock resistance, high reliability, and low power consumption. Sensor nodes are highly resource-constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth, and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge, and send schemes, an efficient and reliable file system is highly desirable, with consideration of sensor node constraints. In this paper, we propose a novel log-structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM space by keeping the memory mapping information to a very small size, and to provide high query response throughput by allocating memory to sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any previous scheme. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
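The log-structured principle behind such file systems can be sketched in a few lines: writes append to the log (flash pages are never updated in place), and a small in-RAM mapping records each key's newest location. The class below is an illustrative toy, not the actual PIYAS layout.

```python
class LogStructuredStore:
    """Minimal sketch of log-structured storage: appends only,
    with an in-RAM mapping from key to newest log position."""

    def __init__(self):
        self.log = []      # append-only sequence of (key, value) pages
        self.mapping = {}  # key -> index of the newest page

    def write(self, key, value):
        # Out-of-place update: the old page stays in the log as garbage
        self.log.append((key, value))
        self.mapping[key] = len(self.log) - 1

    def read(self, key):
        return self.log[self.mapping[key]][1]

s = LogStructuredStore()
s.write("temp", 21.5)
s.write("temp", 22.0)   # supersedes the first page
print(s.read("temp"))   # 22.0
print(len(s.log))       # 2 pages written, one now stale
```

Garbage collection reclaims the stale pages, and keeping the mapping small is what enables the instant-mount, low-SRAM goals the abstract describes.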
Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
ERIC Educational Resources Information Center
Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary
2009-01-01
Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)
The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching
ERIC Educational Resources Information Center
Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix
2007-01-01
The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…
ERIC Educational Resources Information Center
Copeland, Tom
Figuring depreciation can be the most difficult aspect of filing tax returns for a family child care program. This inventory log for family child care programs is designed to assist in keeping track of the furniture, appliances, and other property used in the child care business; once these items have been identified, they can be deducted as…
Grading options for western hemlock "pulpwood" logs from southeastern Alaska.
David W. Green; Kent A. McDonald; John Dramm; Kenneth Kilborn
Properties and grade yield are estimated for structural lumber produced from No. 3, No. 4, and low-end No. 2 grade western hemlock logs of the type previously used primarily for the production of pulp chips. Estimates are given for production in the Structural Framing, Machine Stress Rating, and Laminating Stock grading systems. The information shows that significant...
Wister, CA Downhole and Seismic Data
Akerley, John
2010-12-18
This submission contains downhole geophysical logs associated with Wister, CA Wells 12-27 and 85-20. The logs include Spontaneous Potential (SP), HILT Caliper (HCAL), Gamma Ray (GR), Array Induction (AIT), and Neutron Porosity (NPOR) data. Also included are a well log, an injection test, a pressure-temperature-spinner log, a shut-in temperature survey, a final well schematic, and files about the well's location and drilling history. This submission also contains data from a three-dimensional (3D) multi-component (3C) seismic reflection survey of the Wister Geothermal prospect area in the northern portion of the Imperial Valley, California. The Wister seismic survey area was 13.2 square miles. (Resistivity image logs (Schlumberger FMI) in 85-20 indicate that the maximum horizontal stress (Shmax) is oriented NNE but that open fractures are oriented suboptimally.)
PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas
NASA Astrophysics Data System (ADS)
Hauger, J. S.; Borton, D. N.
1982-07-01
A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.
PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas
NASA Technical Reports Server (NTRS)
Hauger, J. S.; Borton, D. N.
1982-01-01
A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.
Kimbrow, Dustin R.
2014-01-01
Topographic survey data of areas on Dauphin Island on the Alabama coast were collected using a truck-mounted mobile terrestrial light detection and ranging system. This system is composed of a high frequency laser scanner in conjunction with an inertial measurement unit and a position and orientation computer to produce highly accurate topographic datasets. A global positioning system base station was set up on a nearby benchmark and logged vertical and horizontal position information during the survey for post-processing. Survey control points were also collected throughout the study area to determine residual errors. Data were collected 5 days after Hurricane Isaac made landfall in early September 2012 to document sediment deposits prior to clean-up efforts. Three data files in ASCII text format with the extension .xyz are included in this report, and each file is named according to both the acquisition date and the relative geographic location on Dauphin Island (for example, 20120903_Central.xyz). Metadata are also included for each of the files in both Extensible Markup Language with the extension .xml and ASCII text formats. These topographic data can be used to analyze the effects of storm surge on barrier island environments and also serve as a baseline dataset for future change detection analyses.
Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.
De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning
2015-10-01
Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements by suggesting query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One of the goals of this report is to predict the number of results a query will have, since such a model allows search engines to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88%, and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and data on reformulations made by users in the past can aid the development of better search systems, particularly to improve results for novice users. Therefore, this paper gives important ideas to better understand how people search and how to use this knowledge to improve the performance of specialized medical search engines.
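Predicting empty result sets from query-term characteristics can be sketched as a feature extractor plus a simple decision rule; the feature set and the out-of-vocabulary rule below are illustrative assumptions, not the article's actual model.

```python
def query_features(query, vocabulary):
    """Features plausibly predictive of result-set size: term count,
    out-of-vocabulary terms (e.g. misspellings), and query length."""
    terms = query.lower().split()
    oov = sum(1 for t in terms if t not in vocabulary)
    return {"n_terms": len(terms), "n_oov": oov, "chars": len(query)}

def predict_empty(query, vocabulary):
    """Flag queries likely to return no results: any term absent
    from the index vocabulary is treated as a strong signal."""
    return query_features(query, vocabulary)["n_oov"] > 0

# Hypothetical index vocabulary for illustration
vocab = {"pneumothorax", "chest", "ct", "fracture"}
print(predict_empty("chest ct", vocab))           # False
print(predict_empty("pneumothorx chest", vocab))  # True: misspelling
```

A real system would feed such features into a trained classifier and, on a predicted-empty query, suggest a spelling correction or a broader term before executing the search.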
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blackwell, David D.; Chickering Pace, Cathy; Richards, Maria C.
The National Geothermal Data System (NGDS) is a Department of Energy funded effort to create a single cataloged source for a variety of geothermal information through a distributed network of databases made available via web services. The NGDS will help identify regions suitable for potential development and further scientific data collection and analysis of geothermal resources as a source for clean, renewable energy. A key NGDS repository or ‘node’ is located at Southern Methodist University, developed by a consortium made up of: • SMU Geothermal Laboratory • Siemens Corporate Technology, a division of Siemens Corporation • Bureau of Economic Geology at the University of Texas at Austin • Cornell Energy Institute, Cornell University • Geothermal Resources Council • MLKay Technologies • Texas Tech University • University of North Dakota. The resources and research encompass the United States with particular emphasis on the Gulf Coast (on and off shore), the Great Plains, and the Eastern U.S. The data collection includes the thermal, geological, and geophysical characteristics of these area resources. Types of data include, but are not limited to, temperature, heat flow, thermal conductivity, radiogenic heat production, porosity, permeability, geological structure, core geophysical logs, well tests, estimated reservoir volume, in situ stress, oil and gas well fluid chemistry, oil and gas well information, and conventional and enhanced geothermal system related resources. Libraries of publications and reports are combined into a unified, accessible catalog with links for downloading non-copyrighted items. Field notes, individual temperature logs, site maps, and related resources are included to broaden the data collection. Additional research based on legacy data improves data quality and increases our understanding of the local and regional geology and geothermal characteristics.
The software to enable the integration, analysis, and dissemination of this team’s NGDS contributions was developed by Siemens Corporate Technology. The SMU Node interactive application is accessible at http://geothermal.smu.edu. Additionally, files may be downloaded from either http://geothermal.smu.edu:9000/geoserver/web/ or through http://geothermal.smu.edu/static/DownloadFilesButtonPage.htm. The Geothermal Resources Council Library is available at https://www.geothermal-library.org/.
VizieR Online Data Catalog: Astron low resolution UV spectra (Boyarchuk+, 1994)
NASA Astrophysics Data System (ADS)
Boyarchuk, A. A.
2017-05-01
Astron was a Soviet spacecraft launched on 23 March 1983; it was operational for eight years and was the largest ultraviolet space telescope during its lifetime. Astron's payload consisted of an 80 cm ultraviolet telescope, Spica, and an X-ray spectroscope. We present 159 low-resolution spectra of stars obtained during the Astron space mission (Tables 4, 5; hereafter table numbers in Boyarchuk et al. 1994 are given). Table 4 (observational log, logs.dat) contains data on 142 sessions for 90 stars (sorted in ascending order of RA), where the SED was obtained by the scanning method, followed by data on 17 sessions for 15 stars (also sorted in ascending order of RA), where multicolor photometry was done. Kilpio et al. (2016, Baltic Astronomy 25, 23) presented results of the comparison of Astron data to modern UV stellar data, discussed Astron precision and accuracy, and drew some conclusions on potential application areas of these data. Also presented are 34 sessions of observations of 27 stellar systems (galaxies and globular clusters); the observational log was published in Table 10 and the data in Table 11. Likewise, 16 sessions of observations of 12 nebulae are presented (Table 12 for the observational log and Table 13 for the data). Background radiation intensity data (Table 14) are presented in Table 15. Finally, data on comets are presented in several forms. Note that observational data for stars, stellar systems, nebulae, and comets are expressed in log [erg/s/cm^2/A], while for comet data units of 10E-13 erg/s/cm^2/A are also used; hydroxyl band photometric data for comets are expressed in log [erg/s/cm^2], and the background data are radiation intensities expressed in log [erg/s/cm^2/A/sr]. A scanned PDF version of the Boyarchuk et al. (1994) book is available at http://www.inasan.ru/~astron/astron.pdf (12 data files).
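Because the catalog mixes two flux conventions (a log10 flux in log [erg/s/cm^2/A] versus a linear flux quoted in units of 10E-13 erg/s/cm^2/A, as used for the comet data), a small conversion helper clarifies the relationship. This is an illustrative sketch of the unit arithmetic only, not code from the catalog.

```python
import math

# Illustrative conversion between the two flux conventions mentioned in
# the catalog description: log10(F [erg/s/cm^2/A]) versus F quoted in
# units of 1e-13 erg/s/cm^2/A.

def log_flux_to_1e13_units(log_flux: float) -> float:
    """Convert log10(F [erg/s/cm^2/A]) to F in units of 1e-13 erg/s/cm^2/A."""
    return 10.0 ** log_flux / 1e-13

def units_1e13_to_log_flux(f_1e13: float) -> float:
    """Inverse conversion back to log10 flux."""
    return math.log10(f_1e13 * 1e-13)

# A log flux of -13 corresponds to a linear flux of 1.0 in 1e-13 units.
print(log_flux_to_1e13_units(-13.0))
```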
ERIC Educational Resources Information Center
Yee, Patricia; Seltzer, Joanna
This paper summarizes the contents, structure and possible uses of the Information System for Vocational Decisions (ISVD) data file on military jobs in the 3 major services. In all, 170 specific career fields for enlisted men and 34 for officers are included in the data file, which also provides for converting the inquirer's personal…
Sediment data collected in 2010 from Cat Island, Mississippi
Buster, Noreen A.; Kelso, Kyle W.; Miselis, Jennifer L.; Kindinger, Jack G.
2014-01-01
Scientists from the U.S. Geological Survey, St. Petersburg Coastal and Marine Science Center, in collaboration with the U.S. Army Corps of Engineers, conducted geophysical and sedimentological surveys in 2010 around Cat Island, Mississippi, which is the westernmost island in the Mississippi-Alabama barrier island chain. The objective of the study was to understand the geologic evolution of Cat Island relative to other barrier islands in the northern Gulf of Mexico by identifying relationships between the geologic history, present day morphology, and sediment distribution. This data series serves as an archive of terrestrial and marine sediment vibracores collected August 4-6 and October 20-22, 2010, respectively. Geographic information system data products include marine and terrestrial core locations and 2007 shoreline data. Additional files include marine and terrestrial core description logs, core photos, results of sediment grain-size analyses, optically stimulated luminescence dating and carbon-14 dating locations and results, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.
Gopalakrishnan, V; Baskaran, R; Venkatraman, B
2016-08-01
A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.
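The receive, display, and log pipeline described above can be sketched in outline. The record format and field names below are assumptions for illustration; the actual base station uses XBee-Pro wireless modules and a LabView program, neither of which is reproduced here.

```python
import json
import time

# Hypothetical sketch of the base-station logging step: each received
# gamma dose-rate reading is timestamped and appended to a log that a
# downstream decision support system (DSS) can read. A Python list
# stands in for the shared log file on disk.

def log_reading(station_id: str, dose_rate_uSv_h: float, log_file: list) -> dict:
    """Append one reading as a JSON line; return the record for display."""
    record = {
        "station": station_id,
        "dose_rate_uSv_h": dose_rate_uSv_h,
        "timestamp": time.time(),
    }
    log_file.append(json.dumps(record))
    return record

shared_log = []
log_reading("stack-north", 0.12, shared_log)
log_reading("stack-south", 0.15, shared_log)
print(len(shared_log))
```

In the real system each logged record would also feed the map display and the time-series plot, and the DSS would read the shared log to run its inverse source-term calculation.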
Program Analysis Techniques for Efficient Software Model Checking
2011-02-28
MIT Press, 1986. [29] D. Marinov, A. Andoni, D. Daniliuc, S. Khurshid, and M. Rinard. An evaluation of exhaustive testing for data structures...such as reading, writing, creating, or deleting a file or a directory) on a file system state s, it uses its analyses to identify other file system...ples of Programming Languages (POPL), January 2003. [6] C. Boyapati and M. Rinard. A parameterized type system for race-free Java programs. In
Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
Joe Moore
2016-03-03
This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe Moore
This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.
Simulation Control Graphical User Interface Logging Report
NASA Technical Reports Server (NTRS)
Hewling, Karl B., Jr.
2012-01-01
One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components, ensuring that a software component does not spin up until all of its dependencies have been configured properly. In addition, I assisted hardware modelers in verifying the configuration of models after they had been upgraded to a new software version, developing code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another project assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
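The MDL error-scanning step can be sketched as a simple log scan. The ERROR-line format below is an assumption for illustration; the actual LCS MDL file layout is not described in the report.

```python
import re

# Hypothetical sketch of scanning model (MDL) files for errors introduced
# by a software upgrade. The "ERROR ..." line format is an illustrative
# assumption, not the real file layout.

ERROR_PATTERN = re.compile(r"^\s*ERROR\b.*$", re.MULTILINE)

def find_errors(mdl_text: str) -> list:
    """Return every line of the file text that reports an error."""
    return ERROR_PATTERN.findall(mdl_text)

sample = "MODEL ok\nERROR: parameter 'gain' missing after upgrade\nSTATE ok\n"
print(find_errors(sample))
```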
Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.
2013-01-01
Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.
Virtual file system on NoSQL for processing high volumes of HL7 messages.
Kimura, Eizen; Ishihara, Ken
2015-01-01
The Standardized Structured Medical Information Exchange (SS-MIX) is intended to be the standard repository for HL7 messages that depend on a local file system. However, its scalability is limited. We implemented a virtual file system using NoSQL to incorporate modern computing technology into SS-MIX and allow the system to integrate local patient IDs from different healthcare systems into a universal system. We discuss its implementation using the database MongoDB and describe its performance in a case study.
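The core idea, replacing SS-MIX's patientID-keyed file paths with documents keyed by a universal patient ID, can be sketched as follows. A dictionary stands in for the NoSQL collection (the paper used MongoDB), and the ID mapping and message strings are illustrative assumptions.

```python
# Hypothetical sketch of the virtual-file-system idea: HL7 messages that
# SS-MIX would store as files under local-patient-ID paths are instead
# stored as documents keyed by a universal patient ID, so records from
# different healthcare systems for the same person are integrated.

ID_MAP = {  # (facility, local patient ID) -> universal patient ID
    ("hospital-A", "12345"): "U-0001",
    ("hospital-B", "987"): "U-0001",
}

collection = {}  # universal_id -> list of HL7 messages (stands in for MongoDB)

def store_message(facility: str, local_id: str, hl7_message: str) -> str:
    """File the message under the universal ID; return that ID."""
    universal = ID_MAP[(facility, local_id)]
    collection.setdefault(universal, []).append(hl7_message)
    return universal

store_message("hospital-A", "12345", "MSH|^~\\&|ADT1")
store_message("hospital-B", "987", "MSH|^~\\&|LAB1")
print(len(collection["U-0001"]))
```

The design point is that the file-system interface stays unchanged for SS-MIX clients while the backing store gains horizontal scalability and cross-facility ID integration.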
Venteris, E.R.; Carter, K.M.
2009-01-01
Mapping and characterization of potential geologic reservoirs are key components in planning carbon dioxide (CO2) injection projects. The geometry of target and confining layers is vital to ensure that the injected CO2 remains in a supercritical state and is confined to the target layer. Also, maps of injection volume (porosity) are necessary to estimate sequestration capacity at undrilled locations. Our study uses publicly filed geophysical logs and geostatistical modeling methods to investigate the reliability of spatial prediction for oil and gas plays in the Medina Group (sandstone and shale facies) in northwestern Pennsylvania. Specifically, the modeling focused on two targets: the Grimsby Formation and Whirlpool Sandstone. For each layer, thousands of data points were available to model structure and thickness but only hundreds were available to support volumetric modeling because of the rarity of density-porosity logs in the public records. Geostatistical analysis based on this data resulted in accurate structure models, less accurate isopach models, and inconsistent models of pore volume. Of the two layers studied, only the Whirlpool Sandstone data provided for a useful spatial model of pore volume. Where reliable models for spatial prediction are absent, the best predictor available for unsampled locations is the mean value of the data, and potential sequestration sites should be planned as close as possible to existing wells with volumetric data. ?? 2009. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.
2009-12-01
other services for early UNIX systems at Bell Labs. In many UNIX-based systems, the field added to the ‘etc/passwd’ file to carry GCOS ID information was...charset, and external. struct options_main { /* Option flags */ opt_flags flags; /* Password files */ struct list_main *passwd; /* Password file...object PASSWD. It is part of several other data structures. struct PASSWD { int id; char *login; char *passwd_hash; int UID
Sedimentologic characteristics of recent washover deposits from Assateague Island, Maryland
Bernier, Julie C.; Zaremba, Nicholas J.; Wheaton, Cathryn J.; Ellis, Alisha M.; Marot, Marci E.; Smith, Christopher G.
2016-06-08
This report describes sediment data collected using sand augers in active overwash zones on Assateague Island in Maryland. Samples were collected by the U.S. Geological Survey (USGS) during two surveys in March/April and October 2014 (USGS Field Activity Numbers [FAN] 2014-301-FA and 2014-322-FA, respectively). The physical characteristics (for example, sediment texture or bedding structure) of and spatial differences among these deposits will provide information about overwash processes and sediment transport from the sandy barrier-island reaches to the back-barrier environments. Metrics derived from these data, such as mean grain size or deposit thicknesses, can be used to ground-truth remote sensing and geophysical data and can also be incorporated into sediment transport models. Data products, including sample location tables, descriptive core logs, core photographs and x-radiographs, the results of sediment grain-size analyses, and Geographic Information System (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata can be downloaded from the Data Downloads page.
Evaluated nuclear structure data file
NASA Astrophysics Data System (ADS)
Tuli, J. K.
1996-02-01
The Evaluated Nuclear Structure Data File (ENSDF) contains the evaluated nuclear properties of all known nuclides, as derived both from nuclear reaction and radioactive decay measurements. All experimental data are evaluated to create the adopted properties for each nuclide. ENSDF, together with other numeric and bibliographic files, can be accessed on-line through the INTERNET or modem, and some of the databases are also available on the World Wide Web. The structure and the scope of ENSDF are presented along with the on-line access system of the National Nuclear Data Center at Brookhaven National Laboratory.
Evaluated nuclear structure data file
NASA Astrophysics Data System (ADS)
Tuli, J. K.
The Evaluated Nuclear Structure Data File (ENSDF) contains the evaluated nuclear properties of all known nuclides. These properties are derived both from nuclear reaction and radioactive decay measurements. All experimental data are evaluated to create the adopted properties for each nuclide. ENSDF, together with other numeric and bibliographic files, can be accessed on-line through the INTERNET or modem. Some of the databases are also available on the World Wide Web. The structure and the scope of ENSDF are presented along with the on-line access system of the National Nuclear Data Center at Brookhaven National Laboratory.
DeWitt, Nancy T.; Flocks, James G.; Reynolds, B.J.; Hansen, Mark
2012-01-01
The Gulf Islands National Seashore (GUIS) is composed of a series of barrier islands along the Mississippi-Alabama coastline. Historically, these islands have undergone long-term shoreline change. The devastation of Hurricane Katrina in 2005 prompted questions about the stability of the barrier islands and their potential response to future storm impacts. Additionally, there was concern from the National Park Service (NPS) about the preservation of the historical Fort Massachusetts, located on West Ship Island. During the early 1900s, Ship Island was an individual island. In 1969 Hurricane Camille breached Ship Island, widening the cut and splitting it into what is now known as West Ship Island and East Ship Island. In July of 2007, the U.S. Geological Survey (USGS) was able to provide the NPS with a small bathymetric survey of Camille Cut using high-resolution single-beam bathymetry. This provided GUIS with a post-Katrina assessment of the bathymetry in Camille Cut and along the northern shoreline directly in front of Fort Massachusetts. Ultimately, this survey became an initial bathymetry dataset toward a larger USGS effort included in the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project (http://ngom.usgs.gov/gomsc/mscip/). This report serves as an archive of the processed single-beam bathymetry. Data products herein include gridded and interpolated digital depth surfaces and x,y,z data products. Additional files include trackline maps, navigation files, geographic information system (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for descriptions of acronyms and abbreviations used in this report.
The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 07CCT01 indicates that the data were collected in 2007 for the Coastal Change and Transport (CCT) study, during the first (01) field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay catamaran. The single-beam transducers were sled mounted on a rail attached between the catamaran hulls. Navigation was acquired using HYPACK, Inc., Hypack version 4.3a.7.1 and differentially corrected using land-based GPS stations. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed systematically using NovAtel's Waypoint GrafNav version 7.6, SANDS version 3.7, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page.
ERIC Educational Resources Information Center
Bennett-Abney, Cheryl
2001-01-01
Three organizational tools for counselors are described: three-ring binder for notes, forms, and schedules; daily log of time and activities; and a tickler file with tasks arranged by days of the week. (SK)
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Worley, Charles R.
2011-01-01
In July of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, Mississippi, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. Funding was provided through the Geologic Framework and Holocene Coastal Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php); this project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp seismic reflection data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.
Systems and methods for an extensible business application framework
NASA Technical Reports Server (NTRS)
Bell, David G. (Inventor); Crawford, Michael (Inventor)
2012-01-01
Methods and systems for editing data from a query result include requesting a query result using a unique collection identifier for a collection of individual files and a unique identifier for a configuration file that specifies a data structure for the query result. A query result is generated that contains a plurality of fields, as specified by the configuration file, by combining each of the individual files associated with the unique identifier for the collection of individual files. The query result data are displayed with a plurality of labels as specified in the configuration file. Edits can be performed by querying a collection of individual files using the configuration file, editing a portion of the query result, and transmitting only the edited information for storage back into a data repository.
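The configuration-driven query pattern described in the abstract can be sketched as a projection of a record collection onto the fields and labels a configuration file names. The field names, labels, and record shapes below are illustrative assumptions, not the patent's actual schema.

```python
import json

# Hypothetical sketch: a configuration specifies which fields a query
# result contains and how they are labeled; the result is built by
# combining the individual records ("files") in one collection.

CONFIG = {"fields": ["id", "status"], "labels": {"id": "Record ID", "status": "Status"}}

COLLECTION = [  # stands in for the individual files in one collection
    {"id": 1, "status": "open", "internal": "x"},
    {"id": 2, "status": "closed", "internal": "y"},
]

def run_query(collection, config):
    """Project each record onto only the fields named by the configuration."""
    return [{f: rec[f] for f in config["fields"]} for rec in collection]

result = run_query(COLLECTION, CONFIG)
print(json.dumps(result))
```

Edit round-tripping would then re-run the same query, let the user change a field value in one projected record, and send back only that changed record for storage.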
2013-03-01
the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ...2.6.2 /etc/passwd and /etc/shadow The /etc/shadow file didn’t exist on early Linux distributions. Originally only root could access the...etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, D; Li, X; Li, H
2014-06-15
Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method to use 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and, for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report to include all required information for MR-IGRT and physics weekly review: the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g., MLC positions) were independently verified and deemed accurate and trustworthy. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports.
Already in clinical use since December 2013, the system is able to facilitate delivery error detection and expedite physician daily IGRT review and physicist weekly chart review.
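The fluence-comparison step described above can be sketched in miniature: accumulate a composite fluence map from each segment's aperture weighted by its beam-on time, for both the plan and the delivery log, then compute error statistics on the difference. The one-dimensional "leaf bank" geometry and the example numbers are simplifying assumptions, not the commissioned system's algorithm.

```python
# Hypothetical sketch: composite primary fluence maps accumulated from
# segment apertures weighted by beam-on time, then compared between the
# planned and logged deliveries.

def fluence_map(segments, n_positions=10):
    """segments: list of (open_start, open_end, beam_on_time) tuples
    describing a 1-D aperture; returns the accumulated fluence profile."""
    fluence = [0.0] * n_positions
    for start, end, t in segments:
        for i in range(start, end):
            fluence[i] += t
    return fluence

def max_difference(plan_segments, log_segments):
    """Largest pointwise |plan - delivered| fluence error."""
    plan = fluence_map(plan_segments)
    delivered = fluence_map(log_segments)
    return max(abs(p - d) for p, d in zip(plan, delivered))

plan = [(2, 8, 1.0), (3, 7, 0.5)]
delivered = [(2, 8, 1.0), (3, 7, 0.48)]  # second segment slightly short on beam-on time
print(max_difference(plan, delivered))
```

A delivery check would compare such error statistics against tolerances and flag the fraction for physicist review when they are exceeded.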
A Scientific Data Provenance API for Distributed Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.
Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results is explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find the relevant parts, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The disclosure approach adds metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance.
The collected provenance can be sent to a triple store using REST services, or it can be logged to a file.
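The disclosed-provenance idea, records tagged with globally unique object identifiers so that pieces produced on different nodes can be joined later, can be sketched as follows. PAPI's real interface is not shown in the abstract; the class and method names here are illustrative assumptions.

```python
import uuid

# Hypothetical sketch of disclosed provenance with unique object IDs, so
# records produced by different processes of a distributed run can be
# connected afterward. Not the actual PAPI interface.

class ProvenanceRecorder:
    def __init__(self):
        self.records = []

    def new_entity(self, name: str) -> str:
        """Register a data entity and return its globally unique ID."""
        eid = str(uuid.uuid4())
        self.records.append(("entity", eid, name))
        return eid

    def activity(self, name: str, used: list, generated: list):
        """Disclose one processing step linking input and output entities."""
        self.records.append(("activity", name, tuple(used), tuple(generated)))

rec = ProvenanceRecorder()
raw = rec.new_entity("raw_input.dat")
out = rec.new_entity("result.csv")
rec.activity("transform_step", used=[raw], generated=[out])
print(len(rec.records))
```

Because the IDs are globally unique rather than node-local, records disclosed on separate machines can later be merged into one graph and, as the abstract notes, either exported to a triple store or logged to a file.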
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells; these tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.
Image processing tool for automatic feature recognition and quantification
Chen, Xing; Stoddard, Ryan J.
2017-05-02
A system for defining structures within an image is described. The system includes reading of an input file, preprocessing the input file while preserving metadata such as scale information and then detecting features of the input file. In one version the detection first uses an edge detector followed by identification of features using a Hough transform. The output of the process is identified elements within the image.
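The edge-then-Hough pipeline named in the abstract can be sketched on a tiny binary edge map. A real implementation would first run an edge detector (e.g., Canny) on the preprocessed image; here the edge map is given directly, and votes are accumulated in (theta, rho) space to recover the dominant straight line. The grid size and discretization below are illustrative assumptions.

```python
import math

# Hypothetical sketch of line detection with a Hough transform over a
# small binary edge image (1 = edge pixel). Each edge pixel votes for
# every line (theta, rho) passing through it; the best-supported bin
# identifies the dominant line.

def hough_lines(edges, n_theta=180):
    h, w = len(edges), len(edges[0])
    acc = {}  # (theta_index, rho) -> votes
    for y in range(h):
        for x in range(w):
            if not edges[y][x]:
                continue
            for t in range(n_theta):
                theta = math.pi * t / n_theta
                rho = round(x * math.cos(theta) + y * math.sin(theta))
                acc[(t, rho)] = acc.get((t, rho), 0) + 1
    best = max(acc, key=acc.get)
    return best, acc[best]

# A vertical line at x = 2 in a 5x5 edge map:
edges = [[1 if x == 2 else 0 for x in range(5)] for y in range(5)]
(t_idx, rho), votes = hough_lines(edges)
print(t_idx, rho, votes)
```

All five edge pixels lie on the line x = 2, so the winning bin collects five votes at rho = 2 with theta near zero (a vertical line in this parameterization).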
Zerara, Mohamed; Brickmann, Jürgen; Kretschmer, Robert; Exner, Thomas E
2009-02-01
Quantitative information on solvation and transfer free energies is often needed for the understanding of many physicochemical processes, e.g., molecular recognition phenomena, transport and diffusion through biological membranes, and the tertiary structure of proteins. Recently, a concept for the localization and quantification of hydrophobicity was introduced (Jäger et al. J Chem Inf Comput Sci 43:237-247, 2003). This model is based on the assumption that the overall hydrophobicity can be obtained as a superposition of fragment contributions. To date, all predictive models for logP have been parameterized for the n-octanol/water system (logP(oct)), while very few models, with poor predictive abilities, are available for other solvents. In this work, we propose a parameterization of an empirical model for the n-octanol/water, alkane/water (logP(alk)), and cyclohexane/water (logP(cyc)) systems. Comparison of both logP(alk) and logP(cyc) with the logarithms of brain/blood ratios (logBB) for a set of structurally diverse compounds revealed a high correlation, showing their superiority over the logP(oct) measure in this context.
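The fragment-additivity assumption underlying the model can be sketched directly: the overall logP of a molecule is taken as the sum of empirical fragment contributions. The fragment values below are invented for illustration and are not the parameterization from the paper.

```python
# Hypothetical sketch of the fragment-superposition assumption: overall
# hydrophobicity (logP) is the sum of per-fragment contributions. The
# contribution values here are illustrative, not fitted parameters.

FRAGMENT_LOGP = {"CH3": 0.5, "CH2": 0.66, "OH": -1.1, "C6H5": 1.9}

def predict_logp(fragments):
    """Sum the contribution of each fragment occurrence."""
    return sum(FRAGMENT_LOGP[f] for f in fragments)

# A crude fragment decomposition of 1-propanol: CH3-CH2-CH2-OH
print(round(predict_logp(["CH3", "CH2", "CH2", "OH"]), 2))
```

Reparameterizing such a model for a different solvent system (alkane/water or cyclohexane/water instead of n-octanol/water) amounts to refitting the fragment contribution table against measured partition coefficients for that system.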
47 CFR 22.359 - Emission limitations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... + 10 log (P) dB. (b) Measurement procedure. Compliance with these rules is based on the use of... contract in their station files and disclose it to prospective assignees or transferees and, upon request...
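The excerpt above quotes an emission limit of the form 43 + 10 log(P) dB, a common FCC out-of-band attenuation formula in which P is the transmitter power in watts. A quick worked check of the arithmetic:

```python
import math

# Worked example of the attenuation formula quoted in the rule excerpt:
# required attenuation (dB) = 43 + 10 * log10(P), with P in watts.

def required_attenuation_db(power_watts: float) -> float:
    return 43 + 10 * math.log10(power_watts)

print(required_attenuation_db(100.0))  # 43 + 10*2 = 63.0 dB
```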
7 CFR 274.5 - Record retention and forms security.
Code of Federal Regulations, 2011 CFR
2011-01-01
... control logs, or similar controls from the point of initial receipt through the issuance and.... (2) For notices of change which initiate, update or terminate the master issuance file, the State...
ERIC Educational Resources Information Center
Tennant, Roy
1992-01-01
Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)
NASA Technical Reports Server (NTRS)
Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. Therefore, the emphasis is on the artificial intelligence aspects of conceptual design rather than on structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines numerical and symbolic processing by the computer with interactive, user-guided problem solving, integrating a knowledge base interface and inference engine, a database interface, and graphics while keeping the knowledge base and database files separate. The system writes a file that can be input into a structural synthesis system, which combines structural analysis and optimization.
An overview of the catalog manager
NASA Technical Reports Server (NTRS)
Irani, Frederick M.
1986-01-01
The Catalog Manager (CM) is being used at the Goddard Space Flight Center in conjunction with the Land Analysis System (LAS) running under the Transportable Applications Executive (TAE). CM maintains a catalog of file names for all users of the LAS system. The catalog provides a cross-reference between TAE user file names and fully qualified host-file names. It also maintains information about the content and status of each file. A brief history of CM development is given and a description of naming conventions, catalog structure and file attributes, and archive/retrieve capabilities is presented. General user operation and the LAS user scenario are also discussed.
EPA's Integrated Risk Information System (IRIS) database was developed and is maintained by EPA's Office of Research and Development, National Center for Environmental Assessment. IRIS is a database of human health effects that may result from exposure to various substances fou...
VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)
NASA Astrophysics Data System (ADS)
Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.
2014-02-01
The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y] = [log ε(X) − log ε(Y)]star − [log ε(X) − log ε(Y)]⊙, with log ε(X) = 12 + log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H] = log N(Li)star − log N(Li)⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li, for which we adopt our derived values: log ε(Li)⊙ = 1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li, for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).
Calderon, Karynna; Forde, Arnell S.; Dadisman, Shawn V.; Wiese, Dana S.; Phelps, Daniel C.
2012-01-01
In September and October of 2003, the U.S. Geological Survey (USGS), in cooperation with the Florida Geological Survey, conducted geophysical surveys of the Atlantic Ocean offshore northeast Florida from St. Augustine, Florida, to the Florida-Georgia border. This report serves as an archive of unprocessed digital boomer subbottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Filtered and gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of all acronyms and abbreviations used in this report. The USGS St. Petersburg Coastal and Marine Science Center (SPCMSC) assigns a unique identifier to each cruise or field activity. For example, 03FGS01 tells us the data were collected in 2003 as part of cooperative work with the Florida Geological Survey (FGS) and that the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity identification (ID). The naming convention used for each seismic line is as follows: yye##a, where 'yy' are the last two digits of the year in which the data were collected, 'e' is a 1-letter abbreviation for the equipment type (for example, b for boomer), '##' is a 2-digit number representing a specific track, and 'a' is a letter representing the section of a line if recording was prematurely terminated or rerun for quality or acquisition problems. The boomer plate is an acoustic energy source that consists of capacitors charged to a high voltage and discharged through a transducer in the water. 
The transducer is towed on a sled floating on the water surface and when discharged emits a short acoustic pulse, or shot, which propagates through the water, sediment column, or rock beneath. The acoustic energy is reflected at density boundaries (such as the seafloor, sediment, or rock layers beneath the seafloor), detected by hydrophone receivers, and recorded by a PC-based seismic acquisition system. This process is repeated at timed intervals (for example, 0.5 seconds) and recorded for specific intervals of time (for example, 100 milliseconds). In this way, a two-dimensional (2-D) vertical profile of the shallow geologic structure beneath the ship track is produced. Refer to the handwritten FACS operation log (PDF, 442 KB) for diagrams and descriptions of acquisition geometry, which varied throughout the cruises. Table 1 displays a summary of acquisition parameters. See the digital FACS equipment logs (PDF, 9-13 KB each) for details about the acquisition equipment used. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG Y (Barry and others, 1975) format (rev. 0), except for the first 3,200 bytes of the card image header, which are stored in ASCII format instead of the standard EBCDIC format. The SEG Y files may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU) (Cohen and Stockwell, 2005). See the How To Download SEG Y Data page for download instructions. The printable profiles provided here are Graphics Interchange Format (GIF) images that were filtered and gained using SU software. Refer to the Software page for details about the processing and links to example SU processing scripts and USGS software for viewing the SEG Y files (Zihlman, 1992).
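The seismic line naming convention described above (yye##a: two-digit year, one-letter equipment code, two-digit track number, optional section letter) is mechanical enough to sketch as a parser. The helper below is ours, not part of the USGS archive, and it assumes post-2000 field activities when expanding the two-digit year:

```python
import re

# Parse a seismic line name of the form yye##a, e.g. "03b01a":
# year 2003, boomer ('b'), track 01, section 'a'.
LINE_NAME = re.compile(
    r"^(?P<yy>\d{2})(?P<equip>[a-z])(?P<track>\d{2})(?P<sect>[a-z]?)$"
)

def parse_line_name(name):
    m = LINE_NAME.match(name)
    if not m:
        raise ValueError(f"not a yye##a line name: {name!r}")
    return {
        "year": 2000 + int(m["yy"]),  # assumption: surveys after 2000
        "equipment": m["equip"],      # e.g. 'b' for boomer
        "track": int(m["track"]),
        "section": m["sect"] or None, # present only if a line was rerun/split
    }

print(parse_line_name("03b01a"))
```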
Ishihara, Yoshitomo; Nakamura, Mitsuhiro; Miyabe, Yuki; Mukumoto, Nobutaka; Matsuo, Yukinori; Sawada, Akira; Kokubo, Masaki; Mizowaki, Takashi; Hiraoka, Masahiro
2017-03-01
To develop a four-dimensional (4D) dose calculation system for real-time tumor tracking (RTTT) irradiation by the Vero4DRT. First, a 6-MV photon beam delivered by the Vero4DRT was simulated using EGSnrc. A moving phantom position was directly measured by a laser displacement gauge. The pan and tilt angles, monitor units, and the indexing time indicating the phantom position were also extracted from a log file. Next, phase space data at any angle were created from both the log file and particle data under the dynamic multileaf collimator. Irradiation both with and without RTTT, with the phantom moving, were simulated using several treatment field sizes. Each was compared with the corresponding measurement using films. Finally, dose calculation for each computed tomography dataset of 10 respiratory phases with the X-ray head rotated was performed to simulate the RTTT irradiation (4D plan) for lung, liver, and pancreatic cancer patients. Dose-volume histograms of the 4D plan were compared with those calculated on the single reference respiratory phase without the gimbal rotation [three-dimensional (3D) plan]. Differences between the simulated and measured doses were less than 3% for RTTT irradiation in most areas, except the high-dose gradient. For clinical cases, the target coverage in 4D plans was almost identical to that of the 3D plans. However, the doses to organs at risk in the 4D plans varied at intermediate- and low-dose levels. Our proposed system has acceptable accuracy for RTTT irradiation in the Vero4DRT and is capable of simulating clinical RTTT plans. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.
2006-01-01
In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve; Jones, Matt; Crozier, Paul
2006-01-01
Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munro, J.K. Jr.
1980-05-01
The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.
Design of housing file box of fire academy based on RFID
NASA Astrophysics Data System (ADS)
Li, Huaiyi
2018-04-01
This paper presents a design scheme of intelligent file box based on RFID. The advantages of RFID file box and traditional file box are compared and analyzed, and the feasibility of RFID file box design is analyzed based on the actual situation of our university. After introducing the shape and structure design of the intelligent file box, the paper discusses the working process of the file box, and explains in detail the internal communication principle of the RFID file box and the realization of the control system. The application of the RFID based file box will greatly improve the efficiency of our school's archives management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirashima, H; Miyabe, Y; Yokota, K
2016-06-15
Purpose: The Dynamic Wave Arc (DWA) technique, where the multi-leaf collimator (MLC) and gantry/ring move simultaneously in a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position from EPID images, the MLC position with intentional errors was assessed. Tests were designed considering three factors: (1) accuracy of the MLC position; (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s), and ring speed (0.5–2.5°/s); and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of the machine accuracy, the MLC position and gantry/ring angle were assessed using log files. Results: A 0.1 mm intentional error can be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs under different conditions of dose rate, gantry/ring speed, and MLC speed showed good agreement, with a root mean square (RMS) error of 0.76%. The RMS error between the detected and recorded data was 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs under variable conditions during DWA irradiation can be easily verified using EPID measurements and log file analysis. The proposed method is useful for routine verification.
This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teke, T; Milette, MP; Huang, V
2014-08-15
The interplay effect between tumor motion and radiation beam modulation during a VMAT treatment delivery alters the delivered dose distribution from the planned one. This work presents and validates a method to accurately calculate the dose distribution in 4D, taking into account the tumor motion, the field modulation, and the treatment starting phase. A QUASAR™ respiratory motion phantom was 4D scanned with a motion amplitude of 3 cm and a 3-second period. A static scan was also acquired with the lung insert, and the tumor contained in it, centered. A VMAT plan with a 6XFFF beam was created on the averaged CT and delivered on a Varian TrueBeam, and the trajectory log file was saved. From the trajectory log file, 10 VMAT plans (one for each breathing phase) and a developer-mode XML file were created. For the 10 VMAT plans, the tumor motion was modeled by moving the isocentre on the static scan, and the plans were re-calculated and summed in the treatment planning system. In the developer mode, the tumor motion was simulated by moving the couch dynamically during the treatment. Gafchromic films were placed in the static QUASAR phantom and irradiated using the developer mode. Different treatment starting phases were investigated (no phase shift, maximum inhalation, and maximum exhalation). Calculated and measured isodose lines and profiles are in very good agreement. For each starting phase, the dose distributions exhibit significant differences but are accurately calculated with the methodology presented in this work.
VizieR Online Data Catalog: NGC 2264, NGC 2547 and NGC 2516 stellar radii (Jackson+, 2016)
NASA Astrophysics Data System (ADS)
Jackson, R. J.; Jeffries, R. D.; Randich, S.; Bragaglia, A.; Carraro, G.; Costado, M. T.; Flaccomio, E.; Lanzafame; Lardo, C.; Monaco, L.; Morbidelli, L.; Smiljanic, R.; Zaggia, S.
2015-11-01
File Table1.dat contains photometric and spectroscopic data of GES Survey targets in the clusters NGC 2547, NGC 2516, and NGC 2264, downloaded from the Edinburgh GES archive (http://ges/roe.ac.uk/). Photometric data comprise the (Cousins) I magnitude and 2MASS J, H, and K magnitudes. Spectroscopic data comprise the signal-to-noise ratio (S/N) of the target spectrum, the radial velocity RV (in km/s), the projected equatorial velocity vsini (in km/s), the number of separate observations co-added to produce the target spectrum, and the log of the effective temperature (logTeff) of the template spectrum fitted to measure RV and vsini. The absolute precision in RV, pRV (in km/s), and the relative precision in vsini (pvsini) were estimated, as a function of logTeff, vsini, and S/N, using the prescription described in Jackson et al. (2015A&A...580A..75J, Cat. J/A+A/580/A75). File Table3.dat contains measured and calculated properties of cluster targets with resolved vsini and a reported rotation period. The cluster name, right ascension RA (deg), and declination Dec (deg) are given for targets with measured periods reported in the literature. Dynamic properties comprise the radial velocity RV (in km/s), the absolute precision in RV, pRV (km/s), the projected equatorial velocity vsini (in km/s), the relative precision in vsini (pvsini), and the rotational period (in days). Also shown are values of the absolute K magnitude MK, the log of luminosity, log L (in solar units), and the probability of cluster membership estimated using cluster data given in the text. Reported period values are taken from the literature. Estimated values of the projected radius Rsini (in Rsolar) and the uncertainty in projected radius e_Rsini (in Rsolar) are given for targets where vsini > 5 km/s and pvsini > 0.2. The final column shows a flag which is set to 1 for targets in cluster NGC 2264 where a (H-K) versus (J-H) colour-colour plot indicates possible infra-red excess.
(2 data files).
Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III
2017-12-01
Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on a single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may result in a less-than-optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. As users interact with search engines, sufficient information is already hidden in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process, and create training data from web logs in a real-time manner; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests in geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
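The two evaluation metrics named at the end of the abstract, precision at K and NDCG, follow directly from their standard definitions. A minimal sketch (the relevance grades in the example list are illustrative, not data from the dissertation):

```python
import math

def precision_at_k(rels, k):
    """Fraction of the top-k results that are relevant (grade > 0)."""
    return sum(1 for r in rels[:k] if r > 0) / k

def dcg_at_k(rels, k):
    """Discounted cumulative gain: higher grades earlier count more."""
    return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))

def ndcg_at_k(rels, k):
    """DCG normalized by the DCG of the ideal (sorted) ordering."""
    ideal = dcg_at_k(sorted(rels, reverse=True), k)
    return dcg_at_k(rels, k) / ideal if ideal > 0 else 0.0

ranked = [3, 2, 0, 1, 0]          # graded relevance of a ranked result list
print(precision_at_k(ranked, 5))  # 0.6
```

NDCG rewards placing highly relevant datasets near the top, which is why it pairs naturally with a learned ranking function trained on clickstream data.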
43 CFR 2743.3 - Leased disposal sites.
Code of Federal Regulations, 2011 CFR
2011-10-01
... review of all records and inspection reports on file with the Bureau of Land Management, State, and local... landfill concerning site management and a review of all reports and logs pertaining to the type and amount...
25 CFR 214.13 - Diligence; annual expenditures; mining records.
Code of Federal Regulations, 2011 CFR
2011-04-01
... within 90 days after an ore body of sufficient quantity is discovered, and shown by the logs or records.... Lessee shall, before commencing operations, file with the superintendent a plat and preliminary statement...
47 CFR 22.861 - Emission limitations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... below the transmitting power (P) by a factor of at least 43 + 10 log (P) dB. (b) Measurement procedure... maintain a copy of the contract in their station files and disclose it to prospective assignees or...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-20
... system; (2) a single lower reservoir dam constructed of earth fill materials with an internal dam drainage system; (3) concrete inlet-outlet structures at both upper reservoirs equipped with trash racks..., using the eComment system at http://www.ferc.gov/docs-filing/ecomment.asp . You must include your name...
Jarlath McEntee
2016-03-21
Workbooks showing Annualized Energy Production, Cost Breakdown Structure, and Levelized Cost of Electricity for the DOE Reference Tidal Project: 1) baseline TidGen Power System; 2) TidGen Power System with the application of Advanced Controls; 3) Advanced TidGen Power System with several enhancements. These files are provided as a zipped set; they are linked together and must be viewed in the same folder.
Li, Heng; Sahoo, Narayan; Poenisch, Falk; Suzuki, Kazumichi; Li, Yupeng; Li, Xiaoqiang; Zhang, Xiaodong; Lee, Andrew K.; Gillin, Michael T.; Zhu, X. Ronald
2013-01-01
Purpose: The purpose of this work was to assess the monitor unit (MU) values and position accuracy of spot scanning proton beams as recorded by the daily treatment logs of the treatment control system, and furthermore establish the feasibility of using the delivered spot positions and MU values to calculate and evaluate delivered doses to patients. Methods: To validate the accuracy of the recorded spot positions, the authors generated and executed a test treatment plan containing nine spot positions, to which the authors delivered ten MU each. The spot positions were measured with radiographic films and Matrixx 2D ion-chambers array placed at the isocenter plane and compared for displacements from the planned and recorded positions. Treatment logs for 14 patients were then used to determine the spot MU values and position accuracy of the scanning proton beam delivery system. Univariate analysis was used to detect any systematic error or large variation between patients, treatment dates, proton energies, gantry angles, and planned spot positions. The recorded patient spot positions and MU values were then used to replace the spot positions and MU values in the plan, and the treatment planning system was used to calculate the delivered doses to patients. The results were compared with the treatment plan. Results: Within a treatment session, spot positions were reproducible within ±0.2 mm. The spot positions measured by film agreed with the planned positions within ±1 mm and with the recorded positions within ±0.5 mm. The maximum day-to-day variation for any given spot position was within ±1 mm. For all 14 patients, with ∼1 500 000 spots recorded, the total MU accuracy was within 0.1% of the planned MU values, the mean (x, y) spot displacement from the planned value was (−0.03 mm, −0.01 mm), the maximum (x, y) displacement was (1.68 mm, 2.27 mm), and the (x, y) standard deviation was (0.26 mm, 0.42 mm). 
The maximum dose difference between calculated dose to the patient based on the plan and recorded data was within 2%. Conclusions: The authors have shown that the treatment log file in a spot scanning proton beam delivery system is precise enough to serve as a quality assurance tool to monitor variation in spot position and MU value, as well as the delivered dose uncertainty from the treatment delivery system. The analysis tool developed here could be useful for assessing spot position uncertainty and thus dose uncertainty for any patient receiving spot scanning proton beam therapy. PMID:23387726
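The kind of log-based check described above, comparing recorded spot positions against planned positions and summarizing per-axis displacements, can be sketched in a few lines. The coordinates below are made-up (x, y) pairs in millimetres, and the helper is our illustration, not the authors' analysis tool:

```python
import statistics

# Hypothetical planned vs. log-recorded spot positions, in mm.
planned  = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
recorded = [(0.02, -0.01), (5.01, 0.03), (-0.03, 5.00), (4.98, 5.02)]

def displacement_stats(planned, recorded):
    """Per-axis mean, standard deviation, and maximum absolute displacement."""
    dx = [r[0] - p[0] for p, r in zip(planned, recorded)]
    dy = [r[1] - p[1] for p, r in zip(planned, recorded)]
    return {
        "mean": (statistics.mean(dx), statistics.mean(dy)),
        "stdev": (statistics.stdev(dx), statistics.stdev(dy)),
        "max_abs": (max(map(abs, dx)), max(map(abs, dy))),
    }

stats = displacement_stats(planned, recorded)
print(stats["max_abs"])  # largest per-axis displacement
```

Run over a full treatment course, statistics like these are what allow a tolerance check (e.g., flagging any spot displaced beyond a clinical threshold) without re-measuring with film.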
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubois, P.F.
1989-05-16
This paper discusses the Basis system. Basis is a program development system for scientific programs. It has been developed over the last five years at Lawrence Livermore National Laboratory (LLNL), where it is now used in about twenty major programming efforts. The Basis System includes two major components, a program development system and a run-time package. The run-time package provides the Basis Language interpreter, through which the user does input, output, plotting, and control of the program's subroutines and functions. Variables in the scientific packages are known to this interpreter, so that the user may arbitrarily print, plot, and calculate with any major program variables. Also provided are facilities for dynamic memory management, terminal logs, error recovery, text-file I/O, and the attachment of non-Basis-developed packages.
Remote Environmental Monitoring and Diagnostics in the Perishables Supply Chain - Phase 1
2011-12-12
The table below displays the raw data from the tests. Each cell contains a number between 0 and 5 corresponding to the number of successful...along with the raw temperature data to the email addresses specified in the configuration file. As mentioned previously, for the CAEN...the Intelleflex system. The user also has the option to save the data log, which contains the raw temperature data, to a file on the Windows
D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Markley, D.; /Fermilab
1998-06-09
This Dzero Engineering note describes the method by which the 2 Tesla Superconducting Solenoid fast dump and slow dump data are accumulated, tracked, and stored. The 2 Tesla Solenoid has eleven data points that need to be tracked and then stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the Solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated values) file, which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.
A Novel Network Attack Audit System based on Multi-Agent Technology
NASA Astrophysics Data System (ADS)
Jianping, Wang; Min, Chen; Xianwen, Wu
A network attack audit system is proposed that includes a network attack audit Agent, a host audit Agent, and a management control center audit Agent. Improved multi-agent technology is applied in the network attack audit Agent, which has achieved satisfactory audit results. The system audits network attacks in depth, and as the functions of the network attack audit Agent improve, different attacks can be better analyzed and audited. In addition, the management control center Agent should manage and analyze audit results from the AA (or HA) and audit data on time. The history files of network packets and host log data should also be audited to find deeper violations that cannot be found in real time.
NASA Technical Reports Server (NTRS)
Brieda, Lubos
2015-01-01
This talk presents three tools developed recently for contamination analysis: (1) an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; (2) a Java RGA extractor, which can load multiple SRS .ana files and extract pressure vs. time data; and (3) a C++ contamination simulation code, a 3D particle-tracing code for modeling the transport of dust particulates and molecules. The simulation code uses residence time to determine whether molecules stick; particulates can be sampled from IEST-STD-1246 and accelerated by aerodynamic forces.
Improved performance in NASTRAN (R)
NASA Technical Reports Server (NTRS)
Chan, Gordon C.
1989-01-01
Three areas of improvement in the 1989 release of COSMIC/NASTRAN were incorporated recently that make the analysis program run faster on large problems. Actual log files and actual timings on a few test samples run on IBM, CDC, VAX, and CRAY computers were compiled. The speed improvement is proportional to the problem size and the number of continuation cards. Vectorizing certain operations in BANDIT makes it run twice as fast in some large problems using structural elements with many node points. BANDIT is a built-in NASTRAN processor that optimizes the structural matrix bandwidth. The VAX matrix packing routine BLDPK was modified so that it now packs a column of a matrix 3 to 9 times faster. The denser and bigger the matrix, the greater the speed improvement. This improvement makes a host of routines and modules that involve matrix operations run significantly faster, and saves disc space for dense matrices. A UNIX version, converted from 1988 COSMIC/NASTRAN, was tested successfully on a Silicon Graphics computer using the UNIX V operating system with Berkeley 4.3 extensions. The Utility Modules INPUTT5 and OUTPUT5 were expanded to handle table data as well as matrices. Both INPUTT5 and OUTPUT5 are general input/output modules that read and write FORTRAN files with or without format. More informative messages are echoed from the PARAMR, PARAMD, and SCALAR modules to ensure proper data values and data types are being handled. Two new Utility Modules, GINOFILE and DATABASE, were written for the 1989 release. Seven rigid elements were added to COSMIC/NASTRAN: CRROD, CRBAR, CRTRPLT, CRBE1, CRBE2, CRBE3, and CRSPLINE.
Zinge, Priyanka Ramdas; Patil, Jayaprakash
2017-01-01
The aim of this study was to evaluate and compare the effect of OneShape and Neolix rotary single-file systems and WaveOne and Reciproc reciprocating single-file systems on pericervical dentin (PCD) using cone-beam computed tomography (CBCT). A total of 40 freshly extracted mandibular premolars were collected and divided into two groups: Group A - Rotary (A1 - Neolix and A2 - OneShape) and Group B - Reciprocating (B1 - WaveOne and B2 - Reciproc). Preoperative scans of each were taken, followed by conventional access cavity preparation and working length determination with a size 10 K-file. Instrumentation of the canal was done according to the respective file system, and postinstrumentation CBCT scans of the teeth were obtained. Slices 90 μm thick were obtained 4 mm apical and coronal to the cementoenamel junction. The PCD thickness was calculated as the shortest distance from the canal outline to the closest adjacent root surface, measured on four surfaces (facial, lingual, mesial, and distal) for all groups in the two scans. There was no significant difference between the rotary and reciprocating single-file systems in their effect on PCD, but Group B2 showed the most significant loss of tooth structure on the mesial, lingual, and distal surfaces (P < 0.05). The Reciproc single-file system removed more PCD than the other experimental groups, whereas the Neolix single-file system had the least effect on PCD.
NVST Data Archiving System Based On FastBit NoSQL Database
NASA Astrophysics Data System (ADS)
Liu, Ying-bo; Wang, Feng; Ji, Kai-fan; Deng, Hui; Dai, Wei; Liang, Bo
2014-06-01
The New Vacuum Solar Telescope (NVST) is a 1-meter vacuum solar telescope that aims to observe the fine structures of active regions on the Sun. The main tasks of the NVST are high resolution imaging and spectral observations, including the measurements of the solar magnetic field. The NVST has collected more than 20 million FITS files since it began routine observations in 2012 and produces up to 120 thousand observational records in a day. Given the large number of files, effective archiving and retrieval of files becomes a critical and urgent problem. In this study, we implement a new data archiving system for the NVST based on the FastBit Not Only Structured Query Language (NoSQL) database. Compared to a relational database (e.g., MySQL), the FastBit database offers distinct advantages in indexing and querying performance. In a large-scale database of 40 million records, the multi-field combined query response time of the FastBit database is about 15 times faster and fully meets the requirements of the NVST. Our study offers a new approach for massive astronomical data archiving and should contribute to the design of data management systems for other astronomical telescopes.
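FastBit's query speed comes from bitmap indexing: each distinct value (or bin) of a column gets a bitmask over row identifiers, and a multi-field combined query reduces to bitwise ANDs of those masks. The sketch below is a toy pure-Python illustration of that idea, not FastBit's actual API or on-disk format; the column names and values are invented.

```python
from collections import defaultdict

class BitmapIndex:
    """Toy equality-encoded bitmap index over one column of a table."""
    def __init__(self, values):
        self.bitmaps = defaultdict(int)  # value -> bitmask over row ids
        for row, v in enumerate(values):
            self.bitmaps[v] |= 1 << row

    def eq(self, value):
        return self.bitmaps.get(value, 0)

def query_and(*masks):
    """Multi-field combined query: intersect the per-field bitmaps."""
    result = ~0
    for m in masks:
        result &= m
    return result

def rows(mask):
    """Decode a bitmask back into a sorted list of row ids."""
    out, row = [], 0
    while mask:
        if mask & 1:
            out.append(row)
        mask >>= 1
        row += 1
    return out

# Hypothetical FITS-header columns: instrument and observation type.
instrument = BitmapIndex(["HR", "HR", "MB", "HR", "MB"])
obstype = BitmapIndex(["dark", "flat", "flat", "flat", "dark"])
hits = rows(query_and(instrument.eq("HR"), obstype.eq("flat")))  # -> [1, 3]
```

The combined query never touches rows that fail any single predicate, which is why bitmap indexes shine on multi-field selections over large, mostly read-only archives.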
Kelso, Kyle W.; Flocks, James G.
2015-01-01
Selection of the core site locations was based on geophysical surveys conducted around the islands from 2008 to 2010. The surveys, using acoustic systems to image and interpret the near-surface stratigraphy, were conducted to investigate the geologic controls on island evolution. This data series serves as an archive of sediment data collected from August to September 2010, offshore of the Mississippi barrier islands. Data products, including descriptive core logs, core photographs, results of sediment grain-size analyses, sample location maps, and geographic information system (GIS) data files with accompanying formal Federal Geographic Data Committee (FGDC) metadata, can be downloaded from the data products and downloads page.
User's Guide for the Updated EST/BEST Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This User's Guide describes the structure of the IPACS input file that reflects the modularity of each module. The structured format helps the user locate specific input data and manually enter or edit it. The IPACS input file can have any user-specified filename, but must have a DAT extension. The input file may consist of up to six input data blocks; the data blocks must be separated by delimiters beginning with the $ character. If multiple sections are desired, they must be arranged in the order listed.
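For illustration only, a minimal input file in the layout described might look like the sketch below; the block names ($TITLE, $MATERIAL, $LOAD) and values are invented stand-ins, not actual IPACS data blocks.

```
$TITLE
  EXAMPLE IPACS CASE
$MATERIAL
  E  = 70.0E9
  NU = 0.33
$LOAD
  P  = 1500.0
```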
47 CFR 22.917 - Emission limitations for cellular equipment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... frequency ranges must be attenuated below the transmitting power (P) by a factor of at least 43 + 10 log(P... such contract shall maintain a copy of the contract in their station files and disclose it to...
Life cycle performances of log wood applied for soil bioengineering constructions
NASA Astrophysics Data System (ADS)
Kalny, Gerda; Strauss-Sieberth, Alexandra; Strauss, Alfred; Rauch, Hans Peter
2016-04-01
Nowadays there is a high demand for engineering solutions that consider not only technical aspects but also ecological and aesthetic values. Soil bioengineering is a construction technique that uses biological components for hydraulic and civil engineering solutions. Soil bioengineering solutions are based on the application of living plants and other auxiliary materials, including, among others, log wood. This kind of construction material supports the soil bioengineering system until the plants, as living construction material, take over the stability function. It is therefore important to know about the durability and degradation process of the wooden logs in order to retain the integral performance of a soil bioengineering system. These aspects are considered within the framework of the interdisciplinary research project "ELWIRA: Plants, wood, steel and concrete - life cycle performances as construction materials". Field investigations were therefore conducted on soil bioengineering construction material, specifically European larch wood logs, from different soil bioengineering structures at the river Wien. The drilling resistance, as a parameter for particular material characteristics, of selected logs was measured and analysed with a Rinntech Resistograph instrument at different positions on the wooden logs, covering three different surroundings: fully surrounded by air, with earth contact on one side, and near the water surface in wet-dry conditions. The ages of the logs used range from one year up to 20 years. Results show the progression of drilling resistance throughout the whole cross section as an indicator for assessing soil bioengineering construction material. Logs surrounded by air showed a higher drilling resistance than logs with earth contact and logs exposed to wet-dry conditions. The functional capability of the wooden logs was thus analysed and discussed in terms of different levels of degradation. The results contribute to a sustainable and resource-conserving handling of building materials in the frame of construction and maintenance works on soil bioengineering structures.
Moring, B.C.
1990-01-01
Well logs used for this map of the Winnemucca quadrangle are from the following sources: (1) logs of more than 1,000 water wells reported to the State of Nevada Division of Water Resources, which are on file with that agency in Reno and with the U.S. Geological Survey in Carson City; (2) 44 petroleum wells collected by the Nevada Bureau of Mines (Lintz, 1957; Schilling and Garside, 1968; Garside and Schilling, 1977; Garside and others, 1977, 1988); and (3) two geothermal wells reported in Zoback (1979) and Flynn and others (1982). Data from isostatic residual and Bouguer gravity maps by Wagini (1985) contributed to the interpretation of basin configuration. Gravity models of Dixie Valley (Schaefer, 1982, and Speed, 1976) and Grass Valley (Grannell and Noble, 1977) and seismic profiles of Grass and Pine Valleys (Potter and others, 1987) helped refine basin interpretations in those areas. The geologic base map of Paleozoic and Mesozoic igneous and sedimentary rocks, Tertiary volcanic and sedimentary rocks, and Cenozoic structures was simplified from Stewart and Carlson (1976b).
Unthank, Michael D.; Nelson, Hugh L.
2006-01-01
The hydrogeologic characteristics of the unconsolidated glacial outwash sand and gravel deposits that compose the northeast portion of the alluvial aquifer at Louisville, Kentucky, indicate a prolific water-bearing formation with approximately 7 billion gallons of ground-water storage and an estimated sustainable yield of over 280 million gallons per day. This abundance of ground water and the need to properly develop and manage this resource has prompted many past investigations (since 1956), which have produced reports, maps, and data files covering a variety of topics relative to the movement, availability, and use of ground water in this area. These data have been compiled into a single report to assist in future development and use of the ground-water resources. Available ground-water data for the alluvial aquifer at Louisville, Kentucky, from Beargrass Creek to Harrods Creek, were compiled from the U.S. Geological Survey National Water Information System and the Kentucky Groundwater Data Repository. Data contained in these databases include ground-water well-construction details and historical ground-water levels, drillers' logs, and water-quality information. Additional data and information were gathered from project files at the U.S. Geological Survey--Kentucky Water Science Center and files at the Louisville Water Company. Information contained in these files included data from area pumping tests describing aquifer characteristics and ground-water flow. Data describing current conditions of the ground-water system in the northeast portion of the alluvial aquifer also are included. Ground-water levels from a network of observation wells show recent trends in the flow system, and information from the Kentucky Division of Water-Groundwater Branch lists current permitted ground-water withdrawals in the area.
Computer-based learning of spelling skills in children with and without dyslexia.
Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jäncke, Lutz; Meyer, Martin
2011-12-01
Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about the letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based student model. We investigated the spelling behavior of children by means of learning curves based on log-file data from the previous and the enhanced software versions. First, we compared the learning progress of children with dyslexia working either with the previous software (n = 28) or the adapted version (n = 37). Second, we investigated the spelling behavior of children with dyslexia (n = 37) and matched children without dyslexia (n = 25). To gain deeper insight into which factors are relevant for acquiring spelling skills, we analyzed the influence of cognitive abilities, such as attention functions and verbal memory skills, on the learning behavior. All investigations of the learning process are based on learning curve analyses of the collected log-file data. The results showed that children with dyslexia benefit significantly from the additional phonological cue and the corresponding phoneme-based student model. Indeed, children with dyslexia improved their spelling skills to the same extent as children without dyslexia and were able to memorize phoneme-to-grapheme correspondences when given the correct support and adequate training. In addition, children with low attention functions benefit from the structured learning environment. Generally, our data showed that memory sources are supportive cognitive functions for acquiring spelling skills and for using the information cues of a multi-modal learning environment.
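Learning curves fitted to log-file data of this kind are commonly modeled as a power law of practice, error(t) ≈ a * t^(-b), which becomes linear after taking logs of both axes. The sketch below fits that form by least squares on log-log data; the model choice and the session data are illustrative assumptions, not the authors' phoneme-based student model.

```python
import math

def fit_power_law(errors):
    """Fit error_t = a * t**(-b) by least squares on log-log data.
    errors[k] is the error rate at trial t = k + 1; returns (a, b)."""
    xs = [math.log(t + 1) for t in range(len(errors))]
    ys = [math.log(e) for e in errors]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - slope * mx)
    return a, -slope  # b = -slope: a positive b means errors fall with practice

# Synthetic log-file data: error rate per training session (invented).
sessions = [0.40, 0.24, 0.18, 0.15, 0.13]
a, b = fit_power_law(sessions)  # b > 0 indicates learning
```

Comparing fitted b values between groups (or software versions) is one simple way to quantify differences in learning progress from raw log files.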
Forde, Arnell S.; Miselis, Jennifer L.; Flocks, James G.; Bernier, Julie C.; Wiese, Dana S.
2014-01-01
On July 5–19 (cruise 13BIM02) and August 22–September 1 (cruise 13BIM07), 2013, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on barrier island evolution and medium-term and interannual sediment transport along the oil spill mitigation sand berm constructed at the north end and offshore of the Chandeleur Islands, Louisiana. This investigation is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided. Refer to the Abbreviations page for explanations of acronyms and abbreviations used in this report.
Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.
2011-01-01
In June and July of 2009, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Cat Island, Mississippi, to Dauphin Island, Alabama, as part of a broader USGS study on Coastal Change and Transport (CCT). The surveys were funded through the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project as part of the Holocene Evolution of the Mississippi-Alabama Region Subtask (http://ngom.er.usgs.gov/task2_2/index.php). This report serves as an archive of unprocessed digital Chirp seismic profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Single-beam and Swath bathymetry data were also collected during these cruises and will be published as a separate archive. Gained (a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.
Small Aircraft Data Distribution System
NASA Technical Reports Server (NTRS)
Chazanoff, Seth L.; Dinardo, Steven J.
2012-01-01
The CARVE Small Aircraft Data Distribution System acquires the aircraft location and attitude data that is required by the various programs running on a distributed network. This system distributes the data it acquires to the data acquisition programs for inclusion in their data files. It uses UDP (User Datagram Protocol) to broadcast data over a LAN (Local Area Network) to any programs that might have a use for the data. The program is easily adaptable to acquire additional data and log that data to disk. The current version also drives displays using precision pitch and roll information to aid the pilot in maintaining a level-level attitude for radar/radiometer mapping beyond the degree available by flying visually or using a standard gyro-driven attitude indicator. The software is designed to acquire an array of data to help the mission manager make real-time decisions as to the effectiveness of the flight. This data is displayed for the mission manager and broadcast to the other experiments on the aircraft for inclusion in their data files. The program also drives real-time precision pitch and roll displays for the pilot and copilot to aid them in maintaining the desired attitude, when required, during data acquisition on mapping lines.
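The UDP distribution scheme described above can be sketched in a few lines; the four-double packet layout (time, pitch, roll, altitude), the field meanings, and the loopback demo are invented for illustration and are not the CARVE system's actual wire format.

```python
import socket
import struct

PACKET = struct.Struct("!dddd")  # time, pitch, roll, altitude (network byte order)

def make_receiver(host="127.0.0.1"):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind((host, 0))          # let the OS pick a free port
    s.settimeout(2.0)
    return s, s.getsockname()[1]

def send_attitude(sock, addr, t, pitch, roll, alt):
    """Send one attitude sample; each listening program logs it to its own files."""
    sock.sendto(PACKET.pack(t, pitch, roll, alt), addr)

def recv_attitude(sock):
    data, _ = sock.recvfrom(PACKET.size)
    return PACKET.unpack(data)

# Loopback demo; a real system would send to the LAN broadcast address
# so every data acquisition program on the network receives the datagram.
rx, port = make_receiver()
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_attitude(tx, ("127.0.0.1", port), 12.5, 1.2, -0.4, 3000.0)
sample = recv_attitude(rx)
tx.close()
rx.close()
```

Because UDP is connectionless, the sender needs no knowledge of how many consumers exist, which matches the "any programs that might have a use for the data" design described in the abstract.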
Paleomagnetic dating: Methods, MATLAB software, example
NASA Astrophysics Data System (ADS)
Hnatyshin, Danny; Kravchinsky, Vadim A.
2014-09-01
A MATLAB software tool has been developed to provide an easy to use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an Excel file. The data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding, and metamorphism, and lies in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed the timing of large scale thermal or chemical events to be determined. Paleomagnetic analysis of gold mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The paleomagnetic direction obtained from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major and semi-minor axes of the 95% confidence ellipse being 16.6° and 15.9°, respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 +32.0/-31.0 Ma, the youngest well defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and that it likely had a role in determining the present day state of mineralization seen at the deposit.
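The "translate to the nearest point on the APWP" step amounts to minimizing the great-circle distance between the measured paleopole and the reference path, then reading off that point's age. A sketch follows; the APWP points and ages below are invented for illustration and are not the Siberian reference path.

```python
import math

def angular_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in degrees between two poles given in degrees."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    cosd = (math.sin(p1) * math.sin(p2)
            + math.cos(p1) * math.cos(p2) * math.cos(l1 - l2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

def nearest_age(pole, apwp):
    """apwp: list of (age_Ma, lat, lon) reference poles; returns the age of
    the reference pole closest to the measured paleopole (lat, lon)."""
    lat, lon = pole
    return min(apwp, key=lambda p: angular_distance(lat, lon, p[1], p[2]))[0]

# Hypothetical apparent polar wander path (age in Ma, pole lat, pole lon).
apwp = [(200, 55.0, 140.0), (250, 60.0, 155.0), (300, 48.0, 170.0)]
age = nearest_age((61.3, 155.9), apwp)
```

A real implementation would interpolate between dated reference poles and propagate the confidence ellipse into an age uncertainty, as the MATLAB tool's asymmetric error bars suggest.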
1979-08-01
0 Blue staff modules will operate under a manual staff system only. The section begins with the fundamental structure of the design concept. This...engagements, etc. Hard wired elements, like the steel and concrete in a building under construction, represent the underlying structural framework of... structure of this file is illustrated in Figure 4-10. The file will consist of 300 records of approximately 300 bytes or characters each. The records
An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files
Chan, Anthony; Gropp, William; Lusk, Ewing
2008-01-01
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the trace files at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
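The central claim, retrieving an arbitrary time window in time nearly independent of total file size, can be approximated in memory with a start-time-sorted event list, binary search, and a prefix maximum of end times that lets the scan terminate early. This toy sketch illustrates the concept only; it is not the paper's actual hierarchical on-disk format, and the event data are invented.

```python
import bisect

class TraceIndex:
    """States are (start, end, name) tuples. Events are kept sorted by start
    time; prefix_max_end[i] is the largest end time among events[0..i], so a
    backward scan can stop as soon as no earlier event can overlap the query."""
    def __init__(self, events):
        self.events = sorted(events)
        self.starts = [e[0] for e in self.events]
        self.prefix_max_end = []
        m = float("-inf")
        for _, end, _ in self.events:
            m = max(m, end)
            self.prefix_max_end.append(m)

    def window(self, t0, t1):
        """All states overlapping [t0, t1]: binary search bounds the scan on
        the right, the prefix maximum cuts it off on the left."""
        hi = bisect.bisect_right(self.starts, t1)  # events starting after t1 are out
        out = []
        for i in range(hi - 1, -1, -1):
            if self.prefix_max_end[i] < t0:        # nothing earlier overlaps
                break
            s, e, name = self.events[i]
            if e >= t0:
                out.append((s, e, name))
        return sorted(out)

# Hypothetical trace: (start, end, state name).
idx = TraceIndex([(0, 2, "init"), (1, 5, "compute"), (6, 7, "io"), (8, 9, "done")])
hits = idx.window(4, 6)
```

The paper's hierarchical format pushes the same idea to disk, so a viewer reads only the blocks relevant to the displayed time window instead of the whole multi-gigabyte file.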
FEQinput—An editor for the full equations (FEQ) hydraulic modeling system
Ancalle, David S.; Ancalle, Pablo J.; Domanski, Marian M.
2017-10-30
The Full Equations Model (FEQ) is a computer program that solves the full, dynamic equations of motion for one-dimensional unsteady hydraulic flow in open channels and through control structures. As a result, hydrologists have used FEQ to design and operate flood-control structures, delineate inundation maps, and analyze peak-flow impacts. To aid in fighting floods, hydrologists are using the software to develop a system that uses flood-plain models to simulate real-time streamflow. Input files for FEQ are text files that contain large numbers of parameters, data, and instructions written in a format exclusive to FEQ. Although documentation exists that can aid in the creation and editing of these input files, new users face a steep learning curve in understanding the specific format and language of the files. FEQinput provides a set of tools to help a new user overcome that learning curve when creating and modifying input files for the FEQ hydraulic model and the related utility tool, Full Equations Utilities (FEQUTL).
FT-IR, FT-Raman spectra and DFT calculations of melaminium perchlorate monohydrate
NASA Astrophysics Data System (ADS)
Kanagathara, N.; Marchewka, M. K.; Drozd, M.; Renganathan, N. G.; Gunasekaran, S.; Anbalagan, G.
2013-08-01
Melaminium perchlorate monohydrate (MPM), an organic material, has been synthesized by the slow solvent evaporation method at room temperature. Powder X-ray diffraction analysis confirms that the MPM crystal belongs to the triclinic system with space group P-1. FT-IR and FT-Raman spectra were recorded at room temperature, and functional group assignments have been made for the melaminium cations and perchlorate anions. Vibrational spectra have also been discussed on the basis of quantum chemical density functional theory (DFT) calculations using Firefly (PC GAMESS) version 7.1.G. Vibrational frequencies were calculated, and the scaled values are compared with experimental values. The assignment of the bands has been made on the basis of the calculated potential energy distribution (PED). The Mulliken charges and HOMO-LUMO orbital energies are analyzed directly from the Firefly program log files and graphically illustrated. The HOMO-LUMO energy gap and other related molecular properties are also calculated. The theoretically constructed FT-IR and FT-Raman spectra of MPM coincide with the experimental ones. The chemical structure of the compound has been established by 1H and 13C NMR spectra. No detectable signal was observed during the powder test for second harmonic generation.
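Extracting HOMO-LUMO energies "directly from program log files", as done above, typically means scanning the text output for orbital eigenvalues and taking the gap across the occupation boundary. The snippet below parses a made-up log excerpt; the line format and numbers are invented stand-ins, not Firefly's actual output.

```python
def homo_lumo_gap(log_text, n_occupied):
    """Collect orbital energies (hartree) from lines shaped like
    'ORBITAL 3 ENERGY -0.3100' and return (HOMO, LUMO, gap)."""
    energies = []
    for line in log_text.splitlines():
        parts = line.split()
        if len(parts) == 4 and parts[0] == "ORBITAL" and parts[2] == "ENERGY":
            energies.append(float(parts[3]))
    energies.sort()
    homo = energies[n_occupied - 1]   # highest occupied molecular orbital
    lumo = energies[n_occupied]       # lowest unoccupied molecular orbital
    return homo, lumo, lumo - homo

# Invented log excerpt with three occupied orbitals.
fake_log = """\
ORBITAL 1 ENERGY -1.2000
ORBITAL 2 ENERGY -0.5500
ORBITAL 3 ENERGY -0.3100
ORBITAL 4 ENERGY 0.0800
"""
homo, lumo, gap = homo_lumo_gap(fake_log, 3)
```

A real parser would anchor on the program's actual eigenvalue section header and handle spin and symmetry labels, but the gap calculation itself is exactly this subtraction.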
Oracle Applications Patch Administration Tool (PAT) Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
2002-01-04
PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities include the following. Patch analysis and management: patch data maintenance (tracking which Oracle Application patches were applied to which database instance and machine); patch analysis (capture of text files such as readme.txt and driver files; form, report, PL/SQL package, SQL script, and JSP module comparison detail; parsing and loading the current applptch.txt for release 10.7, or loading patch data from the Oracle Application database patch tables for release 11i); display analysis (comparing the patch to be applied with the Appl_top code versions currently installed in the Oracle Application); and patch detail (module comparison detail; analysis and display of a single Oracle Application module patch). Patch management provides automatic queuing and execution of patches: administration covers parameter maintenance (settings for the directory structure of the Oracle Application appl_top) and validation data maintenance (machine names and instances to patch); operation covers patch data maintenance (scheduling a patch for later execution, running a patch for immediate execution, and reviewing the patch logs) and patch management reports.
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.
2004-01-01
In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.
Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.
2004-01-01
In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.
Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies
Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton
2016-01-01
Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store application software for instant messaging, namely Facebook and Skype on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files. PMID:26982207
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-22
... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...
Associative programming language and virtual associative access manager
NASA Technical Reports Server (NTRS)
Price, C.
1978-01-01
APL provides convenient associative data manipulation functions in a high-level language. Six statements were added to PL/1 via a preprocessor: CREATE, INSERT, FIND, FOR EACH, REMOVE, and DELETE. They allow complete control of all database operations. During execution, database management programs perform the functions required to support the APL language. VAAM is the database management system designed to support the APL language. APL/VAAM is used by CADANCE, an interactive graphic computer system. VAAM is designed to support heavily referenced files. Virtual memory files, which utilize the paging mechanism of the operating system, are used. VAAM supports a full network data structure. The two basic blocks in a VAAM file are entities and sets. Entities are the basic information elements and correspond to PL/1-based structures defined by the user. Sets contain the relationship information and are implemented as arrays.
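The entity/set model described (entities as user-defined records, sets as relationship containers) can be illustrated with a small sketch. The method names mirror the six APL statements, but the Python rendering, class names, and sample records are an invented analogy, not the PL/1 preprocessor syntax or VAAM's storage layout.

```python
class Database:
    """Toy analogue of APL/VAAM: entities are dict records, and sets hold
    the relationships between them (FOR EACH corresponds to iterating a set)."""
    def __init__(self):
        self.sets = {}

    def create(self, set_name):                 # CREATE
        self.sets[set_name] = []

    def insert(self, set_name, entity):         # INSERT
        self.sets[set_name].append(entity)

    def find(self, set_name, **criteria):       # FIND
        return [e for e in self.sets[set_name]
                if all(e.get(k) == v for k, v in criteria.items())]

    def remove(self, set_name, entity):         # REMOVE (from one set only)
        self.sets[set_name].remove(entity)

    def delete(self, entity):                   # DELETE (from every set)
        for members in self.sets.values():
            while entity in members:
                members.remove(entity)

db = Database()
db.create("lines")
db.insert("lines", {"id": 1, "layer": "geometry"})
db.insert("lines", {"id": 2, "layer": "annotation"})
hits = db.find("lines", layer="geometry")
```

The distinction between REMOVE (detach from one set) and DELETE (destroy the entity everywhere) is the essence of a network model: an entity may belong to many sets at once.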
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-20
... consist of the following: (1) An existing 32-foot-high, 3500-foot-long earth filled dam; (2) a reservoir... bypass channel; (4) a 65-foot-wide, 35- foot-long intake structure with a trash rack cleaning system; (5... system at http://www.ferc.gov/docs-filing/ecomment.asp . You must include your name and contact...
Park, Jong In; Park, Jong Min; Kim, Jung-In; Park, So-Yeon; Ye, Sung-Joon
2015-12-01
The aim of this study was to investigate the sensitivity of the gamma-index method according to various gamma criteria for volumetric modulated arc therapy (VMAT). Twenty head and neck (HN) and twenty prostate VMAT plans were retrospectively selected for this study. Both global and local 2D gamma evaluations were performed with criteria of 3%/3 mm, 2%/2 mm, 1%/2 mm and 2%/1 mm. In this study, the global and local gamma-index calculated the differences in doses relative to the maximum dose and the dose at the current measurement point, respectively. Using log files acquired during delivery, the differences in parameters at every control point between the VMAT plans and the log files were acquired. The differences in dose-volumetric parameters between reconstructed VMAT plans using the log files and the original VMAT plans were calculated. The Spearman's rank correlation coefficients (rs) were calculated between the passing rates and those differences. Considerable correlations with statistical significances were observed between global 1%/2 mm, local 1%/2 mm and local 2%/1 mm and the MLC position differences (rs = -0.712, -0.628 and -0.581). The numbers of rs values with statistical significance between the passing rates and the changes in dose-volumetric parameters were largest in global 2%/2 mm (n = 16), global 2%/1 mm (n = 15) and local 2%/1 mm (n = 13) criteria. Local gamma-index method with 2%/1 mm generally showed higher sensitivity to detect deviations between a VMAT plan and the delivery of the VMAT plan. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
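The gamma-index combines a dose-difference tolerance and a distance-to-agreement tolerance into a single pass/fail measure per point. A minimal global 1D sketch follows; clinical gamma evaluation works on 2D/3D dose grids with interpolation, and the dose profiles here are invented for illustration.

```python
def gamma_pass_rate(reference, measured, spacing_mm, dose_pct, dta_mm):
    """Global 1D gamma: dose differences are normalized to the maximum
    reference dose (the 'global' convention used in the study); returns
    the percentage of measured points with gamma <= 1."""
    tol_d = dose_pct / 100.0 * max(reference)
    passed = 0
    for i, dm in enumerate(measured):
        # gamma^2 = min over reference points of the combined criterion
        gamma2 = min(
            ((dm - dr) / tol_d) ** 2 + ((i - j) * spacing_mm / dta_mm) ** 2
            for j, dr in enumerate(reference)
        )
        if gamma2 <= 1.0:
            passed += 1
    return 100.0 * passed / len(measured)

ref = [10.0, 50.0, 100.0, 50.0, 10.0]   # hypothetical dose profile
meas = [10.5, 51.0, 99.0, 49.0, 10.0]   # small delivery deviations
rate = gamma_pass_rate(ref, meas, spacing_mm=1.0, dose_pct=3.0, dta_mm=3.0)
```

Tightening the criteria (e.g., 2%/1 mm instead of 3%/3 mm) shrinks both tolerances, which is why the stricter settings in the study were more sensitive to MLC position errors.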
Neuropsychological constraints to human data production on a global scale
NASA Astrophysics Data System (ADS)
Gros, C.; Kaczor, G.; Marković, D.
2012-01-01
Which are the factors underlying human information production on a global level? In order to gain an insight into this question we study a corpus of 252-633 million publicly available data files on the Internet corresponding to an overall storage volume of 284-675 terabytes. Analyzing the file size distribution for several distinct data types we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining quality.
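The two distribution families contrasted above can be fitted with standard estimators: a maximum-likelihood exponent for the power-law tail (the Clauset-Shalizi-Newman continuous-data estimator) and the mean/standard deviation of log-sizes for the log-normal case. The synthetic data below is only to exercise the estimators; it is not the paper's corpus.

```python
import math
import random

def powerlaw_alpha(sizes, xmin):
    """MLE exponent for a continuous power law p(x) ~ x^(-alpha), x >= xmin."""
    tail = [s for s in sizes if s >= xmin]
    return 1.0 + len(tail) / sum(math.log(s / xmin) for s in tail)

def lognormal_params(sizes):
    """Log-normal fit: mean and standard deviation of log-transformed sizes."""
    logs = [math.log(s) for s in sizes]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    return mu, math.sqrt(var)

# Synthetic "file sizes" drawn from a log-normal distribution.
random.seed(0)
sizes = [math.exp(random.gauss(10.0, 2.0)) for _ in range(5000)]
mu, sigma = lognormal_params(sizes)
```

In practice, distinguishing a power law from a log-normal on empirical file-size data also requires a goodness-of-fit comparison (e.g. likelihood ratios), since the two can look similar over a limited size range.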
Chirp subbottom profile data collected in 2015 from the northern Chandeleur Islands, Louisiana
Forde, Arnell S.; DeWitt, Nancy T.; Fredericks, Jake J.; Miselis, Jennifer L.
2018-01-30
As part of the Barrier Island Evolution Research project, scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore geophysical survey around the northern Chandeleur Islands, Louisiana, in September 2015. The objective of the project is to improve the understanding of barrier island geomorphic evolution, particularly storm-related depositional and erosional processes that shape the islands over annual to interannual time scales (1–5 years). Collecting geophysical data can help researchers identify relations between the geologic history of the islands and their present day morphology and sediment distribution. High-resolution geophysical data collected along this rapidly changing barrier island system can provide a unique time-series dataset to further the analyses and geomorphological interpretations of this and other coastal systems, improving our understanding of coastal response and evolution over medium-term time scales (months to years). Subbottom profile data were collected in September 2015 offshore of the northern Chandeleur Islands, during USGS Field Activity Number 2015-331-FA. Data products, including raw digital chirp subbottom data, processed subbottom profile images, survey trackline map, navigation files, geographic information system data files and formal Federal Geographic Data Committee metadata, and Field Activity Collection System and operation logs are available for download.
A teledentistry system for the second opinion.
Gambino, Orazio; Lima, Fausto; Pirrone, Roberto; Ardizzone, Edoardo; Campisi, Giuseppina; di Fede, Olga
2014-01-01
In this paper we present a teledentistry system aimed at the second-opinion task. It makes use of a particular camera, called an intra-oral or dental camera, to perform photo shooting and real-time video of the inner part of the mouth. The pictures acquired by the Operator with such a device are sent to the Oral Medicine Expert (OME) by means of a standard File Transfer Protocol (FTP) service, and the real-time video is channeled into a video stream by the VideoLAN client/server (VLC) application. The system is composed of HTML5 web pages generated by PHP and allows the second opinion to be performed both when the Operator and the OME are logged in and when one of them is offline.
CHARMM-GUI ligand reader and modeler for CHARMM force field generation of small molecules.
Kim, Seonghoon; Lee, Jumin; Jo, Sunhwan; Brooks, Charles L; Lee, Hui Sun; Im, Wonpil
2017-06-05
Reading ligand structures into any simulation program is often nontrivial and time consuming, especially when the force field parameters and/or structure files of the corresponding molecules are not available. To address this problem, we have developed Ligand Reader & Modeler in CHARMM-GUI. Users can upload ligand structure information in various forms (using PDB ID, ligand ID, SMILES, MOL/MOL2/SDF file, or PDB/mmCIF file), and the uploaded structure is displayed on a sketchpad for verification and further modification. Based on the displayed structure, Ligand Reader & Modeler generates the ligand force field parameters and necessary structure files by searching for the ligand in the CHARMM force field library or using the CHARMM general force field (CGenFF). In addition, users can define chemical substitution sites and draw substituents in each site on the sketchpad to generate a set of combinatorial structure files and corresponding force field parameters for high-throughput or alchemical free energy simulations. Finally, the output from Ligand Reader & Modeler can be used in other CHARMM-GUI modules to build a protein-ligand simulation system for all supported simulation programs, such as CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Ligand Reader & Modeler is available as a functional module of CHARMM-GUI at http://www.charmm-gui.org/input/ligandrm.
Peru, M; Peru, C; Mannocci, F; Sherriff, M; Buchanan, L S; Pitt Ford, T R
2006-02-01
The aim of this study was to evaluate root canals instrumented by dental students using the modified double-flared technique, nickel-titanium (NiTi) rotary System GT files and NiTi rotary ProTaper files by micro-computed tomography (MCT). A total of 36 root canals from 18 mesial roots of mandibular molar teeth were prepared; 12 canals were prepared with the modified double-flared technique, using K-flexofiles and Gates-Glidden burs; 12 canals were prepared using System GT and 12 using ProTaper rotary files. Each root was scanned using MCT preoperatively and postoperatively. At the coronal and mid-root sections, System GT and ProTaper files produced significantly less enlarged canal cross-sectional area, volume and perimeter than the modified double-flared technique (P < 0.05). In the mid-root sections there was significantly less thinning of the root structure towards the furcation with System GT and ProTaper (P < 0.05). The rotary techniques were both three times faster than the modified double-flared technique (P < 0.05). Qualitative evaluation of the preparations showed that both ProTaper and System GT were able to prepare root canals with little or no procedural error compared with the modified double-flared technique. Under the conditions of this study, inexperienced dental students were able to prepare curved root canals with rotary files with greater preservation of tooth structure, a lower risk of procedural errors, and much greater speed than with hand instruments.
20 CFR 655.201 - Temporary labor certification applications.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...
Information Retrieval Using Hadoop Big Data Analysis
NASA Astrophysics Data System (ADS)
Motwani, Deepak; Madan, Madan Lal
This paper concerns big data analysis: the process of probing huge amounts of information in an attempt to uncover hidden patterns. Through big data analytics, organizations in both the public and private sectors have made a strategic decision to turn big data into competitive advantage. The first step in extracting value from big data is a process that pulls information from multiple different sources, known as extract, transform, and load (ETL). The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and for summarizing documents from several locations. The work helps readers better understand basic Hadoop concepts and improves the user experience for research. We propose a Hadoop-based approach for analyzing log files to find concise, useful information while saving time. The proposed approach will then be applied to research papers in a specific domain to produce summarized content for further improvement and for generating new content.
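The log-summarization step described above follows Hadoop's map/shuffle/reduce pattern. The sketch below imitates that pattern in pure Python (no Hadoop cluster involved); the log line format `TIMESTAMP LEVEL message` is a hypothetical stand-in for whatever fields a real pipeline would extract.

```python
from collections import defaultdict

def map_phase(lines):
    """Mapper: emit (key, 1) for each log line's severity level.
    The 'TIMESTAMP LEVEL message' layout is assumed for illustration."""
    for line in lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2:
            yield parts[1], 1

def reduce_phase(pairs):
    """Reducer: sum counts per key, as Hadoop does after the shuffle."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

log = [
    "2024-01-01T00:00:01 INFO started",
    "2024-01-01T00:00:02 WARN disk slow",
    "2024-01-01T00:00:03 INFO request served",
]
summary = reduce_phase(map_phase(log))
```

On an actual Hadoop deployment the same mapper/reducer logic would run distributed over HDFS blocks; the value of the framework is that this two-function decomposition scales without changing the logic.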
Money-center structures in dynamic banking systems
NASA Astrophysics Data System (ADS)
Li, Shouwei; Zhang, Minghui
2016-10-01
In this paper, we propose a dynamic model for banking systems based on the description of balance sheets. It generates some features identified through empirical analysis. Through simulation analysis of the model, we find that banking systems have the feature of money-center structures, that bank asset distributions are power-law distributions, and that contract size distributions are log-normal distributions.
A series of PDB related databases for everyday needs.
Joosten, Robbie P; te Beek, Tim A H; Krieger, Elmar; Hekkelman, Maarten L; Hooft, Rob W W; Schneider, Reinhard; Sander, Chris; Vriend, Gert
2011-01-01
The Protein Data Bank (PDB) is the world-wide repository of macromolecular structure information. We present a series of databases that run parallel to the PDB. Each database holds one entry, if possible, for each PDB entry. DSSP holds the secondary structure of the proteins. PDBREPORT holds reports on the structure quality and lists errors. HSSP holds a multiple sequence alignment for all proteins. The PDBFINDER holds easy to parse summaries of the PDB file content, augmented with essentials from the other systems. PDB_REDO holds re-refined, and often improved, copies of all structures solved by X-ray. WHY_NOT summarizes why certain files could not be produced. All these systems are updated weekly. The data sets can be used for the analysis of properties of protein structures in areas ranging from structural genomics, to cancer biology and protein design.
Data Management System for the National Energy-Water System (NEWS) Assessment Framework
NASA Astrophysics Data System (ADS)
Corsi, F.; Prousevitch, A.; Glidden, S.; Piasecki, M.; Celicourt, P.; Miara, A.; Fekete, B. M.; Vorosmarty, C. J.; Macknick, J.; Cohen, S. M.
2015-12-01
Aiming to provide a comprehensive assessment of the water-energy nexus, the National Energy-Water System (NEWS) project requires the integration of data to support a modeling framework that links climate, hydrological, power production, transmission, and economic models. Large amounts of georeferenced data have to be streamed to the components of the inter-disciplinary model to explore future challenges and tradeoffs in US power production, based on climate scenarios, power plant locations and technologies, available water resources, ecosystem sustainability, and economic demand. We used open-source and in-house-built software components to build a system that addresses two major data challenges: (1) on-the-fly re-projection, re-gridding, interpolation, extrapolation, nodata patching, merging, and temporal and spatial aggregation of static and time-series datasets, in virtually any file format and file structure and for any geographic extent, directly at model run time; and (2) comprehensive data management based on metadata cataloguing and discovery in repositories utilizing the MAGIC Table (Manipulation and Geographic Inquiry Control database). This innovative concept allows models to access data on the fly by data ID, irrespective of file path, file structure, and file format, and regardless of its GIS specifications. In addition, a web-based information and computational system is being developed to control the I/O of spatially distributed Earth system, climate, hydrological, power grid, and economic data flow within the NEWS framework. The system allows scenario building, data exploration, visualization, querying, and manipulation of any loaded gridded, point, or vector polygon dataset. The system has demonstrated its potential for applications in other fields of Earth science modeling, education, and outreach. Over time, this implementation of the system will provide near real-time assessment of various current and future scenarios of the water-energy nexus.
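The "access data by ID, irrespective of path and format" idea behind the MAGIC Table can be sketched as a small catalogue object. All class, field, and dataset names below are hypothetical illustrations, not the project's actual schema.

```python
# Hypothetical sketch of a MAGIC-table-style catalogue: models request a
# dataset by ID only, and the catalogue resolves storage details internally.
class DataCatalogue:
    def __init__(self):
        self._table = {}

    def register(self, data_id, path, fmt, crs):
        """Record where a dataset lives and how it is encoded."""
        self._table[data_id] = {"path": path, "format": fmt, "crs": crs}

    def resolve(self, data_id):
        """Return storage details for a dataset, looked up by ID alone."""
        try:
            return self._table[data_id]
        except KeyError:
            raise KeyError(f"unknown dataset ID: {data_id}")

cat = DataCatalogue()
cat.register("precip_monthly", "/data/grids/precip.nc", "NetCDF", "EPSG:4326")
entry = cat.resolve("precip_monthly")
```

The design choice is indirection: models never hard-code paths or formats, so datasets can be moved, re-gridded, or re-encoded without touching model code.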
The Evolution of Globular Cluster Systems In Early-Type Galaxies
NASA Astrophysics Data System (ADS)
Grillmair, Carl
1999-07-01
We will measure structural parameters (core radii and concentrations) of globular clusters in three early-type galaxies using deep, four-point dithered observations. We have chosen globular cluster systems which have young, medium-age and old cluster populations, as indicated by cluster colors and luminosities. Our primary goal is to test the hypothesis that globular cluster luminosity functions evolve towards a "universal" form. Previous observations have shown that young cluster systems have exponential luminosity functions rather than the characteristic log-normal luminosity function of old cluster systems. We will test whether young systems exhibit a wider range of structural parameters than old systems, and whether and at what rate plausible disruption mechanisms will cause the luminosity function to evolve towards a log-normal form. A simple observational comparison of structural parameters between cluster populations of different ages, and between different sub-populations within the same galaxy, will also provide clues concerning the formation and destruction mechanisms of star clusters, the distinction between open and globular clusters, and the advisability of using globular cluster luminosity functions as distance indicators.
P1198: software for tracing decision behavior in lending to small businesses.
Andersson, P
2001-05-01
This paper describes a process-tracing software program specially designed to capture decision behavior in lending to small businesses. The source code was written in Lotus Notes. The software runs in a Web browser and consists of two interacting systems: a database and a user interface. The database includes three realistic loan applications. The user interface consists of different but interacting screens that enable the participant to operate the software. Log files register the decision behavior of the participant. An empirical example is presented in order to show the software's potential in providing insights into judgment and decision making. The implications of the software are discussed.
3D Numerical simulation of bed morphological responses to complex in-streamstructures
NASA Astrophysics Data System (ADS)
Xu, Y.; Liu, X.
2017-12-01
In-stream structures are widely used in stream restoration for both hydraulic and ecological purposes. The geometries of the structures are usually designed to be extremely complex and irregular, so as to provide nature-like physical habitat. The aim of this study is to develop a numerical model to accurately predict the bed-load transport and the morphological changes caused by complex in-stream structures. The model is developed on the OpenFOAM platform. In the hydrodynamics part, it utilizes different turbulence models to capture detailed turbulence information near the in-stream structures. An immersed boundary method (IBM) is efficiently implemented in the model to describe the movable bed and the rigid solid body of the in-stream structures. With IBM, the difficulty of mesh generation on complex geometry is greatly alleviated, and bed surface deformation can be coupled into the flow system. The morphodynamics model is first validated on simple structures, such as the scour morphology of a log-vane structure. It is then applied to a more complex structure, an engineered log jam (ELJ), which consists of multiple logs piled together. The numerical results, including turbulent flow information and bed morphological responses, are evaluated against experimental measurements under the exact same flow conditions.
Tracking scanning laser ophthalmoscope (TSLO)
NASA Astrophysics Data System (ADS)
Hammer, Daniel X.; Ferguson, R. Daniel; Magill, John C.; White, Michael A.; Elsner, Ann E.; Webb, Robert H.
2003-07-01
The effectiveness of image stabilization with a retinal tracker in a multi-function, compact scanning laser ophthalmoscope (TSLO) was demonstrated in initial human subject tests. The retinal tracking system uses a confocal reflectometer with a closed loop optical servo system to lock onto features in the fundus. The system is modular to allow configuration for many research and clinical applications, including hyperspectral imaging, multifocal electroretinography (MFERG), perimetry, quantification of macular and photo-pigmentation, imaging of neovascularization and other subretinal structures (drusen, hyper-, and hypo-pigmentation), and endogenous fluorescence imaging. Optical hardware features include dual wavelength imaging and detection, integrated monochromator, higher-order motion control, and a stimulus source. The system software consists of a real-time feedback control algorithm and a user interface. Software enhancements include automatic bias correction, asymmetric feature tracking, image averaging, automatic track re-lock, and acquisition and logging of uncompressed images and video files. Normal adult subjects were tested without mydriasis to optimize the tracking instrumentation and to characterize imaging performance. The retinal tracking system achieves a bandwidth of greater than 1 kHz, which permits tracking at rates that greatly exceed the maximum rate of motion of the human eye. The TSLO stabilized images in all test subjects during ordinary saccades up to 500 deg/sec with an inter-frame accuracy better than 0.05 deg. Feature lock was maintained for minutes despite subject eye blinking. Successful frame averaging allowed image acquisition with decreased noise in low-light applications. The retinal tracking system significantly enhances the imaging capabilities of the scanning laser ophthalmoscope.
Stress wave sorting of red maple logs for structural quality
Xiping Wang; Robert J. Ross; David W. Green; Brian Brashaw; Karl Englund; Michael Wolcott
2004-01-01
Existing log grading procedures in the United States make only visual assessments of log quality. These procedures do not incorporate estimates of the modulus of elasticity (MOE) of logs. It is questionable whether the visual grading procedures currently used for logs adequately assess the potential quality of structural products manufactured from them, especially...
76 FR 32144 - Marine Mammals; File No. 15543
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
..., dynamics, life history, social structure, genetic structure including paternity patterns, and human interactions. The sampling and tagging will support health assessment, auditory system, feeding, and ranging...
NASA Astrophysics Data System (ADS)
Gipson, John
2011-07-01
I describe the proposed data structure for storing, archiving and processing VLBI data. In this scheme, most VLBI data are stored in NetCDF files. NetCDF has the advantage that there are interfaces to most common computer languages, including Fortran, Fortran 90, C, C++, and Perl, and to the most common operating systems, including Linux, Windows and Mac. The data files for a particular session are organized by special ASCII "wrapper" files which contain pointers to the data files. This allows great flexibility in the processing and analysis of VLBI data, and also allows for extending the types of data used, e.g., source maps. I discuss the use of the new format in calc/solve and other VLBI analysis packages. I also discuss plans for transitioning to the new structure.
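The "wrapper" concept above, an ASCII file that maps logical data names to per-session NetCDF files, can be sketched with a tiny parser. The wrapper syntax and the file names below are invented for illustration; the real wrapper format used in VLBI analysis differs in detail.

```python
import io

# Illustrative wrapper text; '!' marks a comment, Begin/End delimit a block.
WRAPPER = """\
! session wrapper (hypothetical syntax for illustration)
Begin Observation
  GroupDelay  ObsEdit/GroupDelay.nc
  RefFreq     Observables/RefFreq.nc
End Observation
"""

def parse_wrapper(text):
    """Map logical data names to the NetCDF files that hold them."""
    pointers = {}
    for line in io.StringIO(text):
        line = line.strip()
        if not line or line.startswith("!") or line.startswith(("Begin", "End")):
            continue
        name, path = line.split()
        pointers[name] = path
    return pointers

pointers = parse_wrapper(WRAPPER)
```

The flexibility claimed in the abstract follows from this indirection: adding a new data type (say, source maps) means adding a NetCDF file and one pointer line, without changing existing files.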
VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)
NASA Astrophysics Data System (ADS)
Andrews, J. J.; Chaname, J.; Agueros, M. A.
2017-11-01
Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).
46 CFR 380.24 - Schedule of retention periods and description of records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...
46 CFR 380.24 - Schedule of retention periods and description of records.
Code of Federal Regulations, 2014 CFR
2014-10-01
... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...
46 CFR 380.24 - Schedule of retention periods and description of records.
Code of Federal Regulations, 2012 CFR
2012-10-01
... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...
46 CFR 380.24 - Schedule of retention periods and description of records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...
46 CFR 380.24 - Schedule of retention periods and description of records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doug Blankenship
Natural fracture data from wells 33-7, 33A-7, 52A-7, 52B-7 and 83-11 at West Flank. Fracture orientations were determined from image logs of these wells (see accompanying submissions). Data files contain depth, and apparent (in wellbore reference frame) and true (in geographic reference frame) azimuth and dip, respectively.
Pereira, Andre; Atri, Mostafa; Rogalla, Patrik; Huynh, Thien; O'Malley, Martin E
2015-11-01
The value of a teaching case repository in radiology training programs is immense. The allocation of resources for putting one together is a complex issue, given the factors that have to be coordinated: hardware, software, infrastructure, administration, and ethics. Costs may be significant and cost-effective solutions are desirable. We chose the Medical Imaging Resource Center (MIRC), offered by RSNA for free, to build our teaching file. For the hardware, we chose the Raspberry Pi, developed by the Raspberry Pi Foundation: a small control board designed as a low-cost computer for schools, also used in alternative projects such as robotics and environmental data collection. Its performance and reliability as a file server were unknown to us. For the operating system, we chose Raspbian, a variant of Debian Linux, along with Apache (web server), MySQL (database server) and PHP, which enhance the functionality of the server. A USB hub and an external hard drive completed the setup. Installation of software was smooth. The Raspberry Pi handled the task of hosting the teaching file repository for our division very well. Uptime was logged at 100%, and loading times were similar to other MIRC sites available online. We set up two servers (one for backup), each costing just below $200.00 including external storage and USB hub. It is feasible to run RSNA's MIRC off a low-cost control board (Raspberry Pi). Performance and reliability are comparable to full-size servers for the intended purpose of hosting a teaching file within an intranet environment.
Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Lepperød, Mikkel E.; Tennøe, Simen; Fyhn, Marianne; Hafting, Torkel; Malthe-Sørenssen, Anders
2018-01-01
Natural sciences generate an increasing amount of data in a wide range of formats developed by different research groups and commercial companies. At the same time there is a growing desire to share data along with publications in order to enable reproducible research. Open formats have publicly available specifications which facilitate data sharing and reproducible research. Hierarchical Data Format 5 (HDF5) is a popular open format widely used in neuroscience, often as a foundation for other, more specialized formats. However, drawbacks related to HDF5's complex specification have initiated a discussion for an improved replacement. We propose a novel alternative, the Experimental Directory Structure (Exdir), an open specification for data storage in experimental pipelines which amends drawbacks associated with HDF5 while retaining its advantages. HDF5 stores data and metadata in a hierarchy within a complex binary file which, among other things, is not human-readable, not optimal for version control systems, and lacks support for easy access to raw data from external applications. Exdir, on the other hand, uses file system directories to represent the hierarchy, with metadata stored in human-readable YAML files, datasets stored in binary NumPy files, and raw data stored directly in subdirectories. Furthermore, storing data in multiple files makes it easier to track for version control systems. Exdir is not a file format in itself, but a specification for organizing files in a directory structure. Exdir uses the same abstractions as HDF5 and is compatible with the HDF5 Abstract Data Model. Several research groups are already using data stored in a directory hierarchy as an alternative to HDF5, but no common standard exists. This complicates and limits the opportunity for data sharing and development of common tools for reading, writing, and analyzing data. 
Exdir facilitates improved data storage, data sharing, reproducible research, and novel insight from interdisciplinary collaboration. With the publication of Exdir, we invite the scientific community to join the development to create an open specification that will serve as many needs as possible and as a foundation for open access to and exchange of data. PMID:29706879
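The Exdir layout described above (directories as groups, human-readable YAML metadata, datasets in their own directories) can be sketched with the standard library alone. Note the hedges: real Exdir stores datasets as NumPy `.npy` files and has a formal specification; the plain-text dataset and the particular file names here (`session1`, `lfp`, `attributes.yaml` contents) are simplified illustrations.

```python
import os
import tempfile

def create_exdir_like(root):
    """Sketch of an Exdir-style hierarchy: nested directories represent
    groups, YAML files carry metadata, and each dataset lives in its own
    subdirectory. (A text file stands in for the NumPy .npy dataset file
    that real Exdir would write, to keep this sketch dependency-free.)"""
    os.makedirs(os.path.join(root, "session1", "lfp"))
    # Top-level metadata identifying the hierarchy.
    with open(os.path.join(root, "meta.yaml"), "w") as f:
        f.write("exdir:\n  version: 1\ntype: 'file'\n")
    # Human-readable attributes for the group, editable with any text editor.
    with open(os.path.join(root, "session1", "attributes.yaml"), "w") as f:
        f.write("experimenter: 'Doe'\nsampling_rate: 30000\n")
    # The dataset itself, stored as a separate file inside the hierarchy.
    with open(os.path.join(root, "session1", "lfp", "data.txt"), "w") as f:
        f.write("0.1 0.2 0.3\n")

root = os.path.join(tempfile.mkdtemp(), "experiment.exdir")
create_exdir_like(root)
```

Because every group, attribute file, and dataset is an ordinary filesystem object, version control systems and external tools can diff and access them directly, which is exactly the advantage over a single opaque HDF5 binary that the abstract argues for.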
An expert system for prediction of chemical toxicity
Hickey, James P.; Aldridge, Andrew J.; Passino-Reader, Dora R.; Frank, Anthony M.
1992-01-01
The National Fisheries Research Center-Great Lakes has developed an interactive computer program that uses the structure of an organic molecule to predict its acute toxicity to four aquatic species. The expert system software, written in the muLISP language, identifies the skeletal structures and substituent groups of an organic molecule from a user-supplied standard chemical notation known as a SMILES string, and then generates values for four solvatochromic parameters. Multiple regression equations relate these parameters to the toxicities (expressed as log10LC50s and log10EC50s, along with 95% confidence intervals) for four species. The system is demonstrated by prediction of toxicity for anilide-type pesticides to the fathead minnow (Pimephales promelas). This software is designed for use on an IBM-compatible personal computer by personnel with minimal toxicology background for rapid estimation of chemical toxicity. The system has numerous applications, with much potential for use in the pharmaceutical industry.
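The regression step above maps solvatochromic (Kamlet-Taft-style) parameters to toxicity through a linear model. The sketch below shows that form only; the coefficients are invented placeholders, not the Center's fitted regression values.

```python
def predict_log_lc50(v, pi_star, beta, alpha, coeffs):
    """Linear solvatochromic model of the general form
        log10 LC50 = c0 + c1*V + c2*pi* + c3*beta + c4*alpha
    where V is a molar-volume term, pi* a polarity/polarizability term,
    and beta/alpha hydrogen-bond acceptor/donor terms.
    `coeffs` holds (c0..c4); values here are purely illustrative."""
    c0, c1, c2, c3, c4 = coeffs
    return c0 + c1 * v + c2 * pi_star + c3 * beta + c4 * alpha

# Hypothetical parameters for a hypothetical compound.
toy = predict_log_lc50(2.0, 0.5, 0.1, 0.0, (1.0, -0.5, 0.3, 0.2, 0.0))
```

A real system of this kind would also report the 95% confidence interval of each prediction, as the abstract notes.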
VizieR Online Data Catalog: New atmospheric parameters of MILES cool stars (Sharma+, 2016)
NASA Astrophysics Data System (ADS)
Sharma, K.; Prugniel, P.; Singh, H. P.
2015-11-01
MILES V2 spectral interpolator. The FITS file is an improved version of the MILES interpolator previously presented in PVK. It contains the coefficients of the interpolator, which allow one to compute an interpolated spectrum given an effective temperature, log of surface gravity and metallicity (Teff, logg, and [Fe/H]). The file consists of three extensions containing the three temperature regimes described in the paper:
Extension 0: warm, Teff 4000-9000K
Extension 1: hot, Teff >7000K
Extension 2: cold, Teff <4550K
The three functions are linearly interpolated in the overlapping Teff regions. Each extension contains a 2D image-type array, whose first axis is the wavelength described by a WCS (air wavelength, starting at 3536Å, step=0.9Å). This FITS file can be used by ULySS v1.3 or higher. (5 data files).
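The linear interpolation between temperature regimes in their overlap can be sketched as below. The overlap bounds (7000-9000K for warm/hot) are read off the extension table above; how the actual ULySS code weights the two regimes may differ in detail, so treat this as an illustration of the blending idea only.

```python
def blend_regimes(teff, warm_val, hot_val, lo=7000.0, hi=9000.0):
    """Linearly blend the outputs of two temperature-regime interpolators
    over their overlap [lo, hi]: pure 'warm' below lo, pure 'hot' above hi,
    and a smooth linear mix in between."""
    if teff <= lo:
        return warm_val
    if teff >= hi:
        return hot_val
    w = (teff - lo) / (hi - lo)  # 0 at lo, 1 at hi
    return (1.0 - w) * warm_val + w * hot_val
```

This keeps the interpolated spectrum continuous as Teff crosses a regime boundary, which is the point of having overlapping extensions rather than hard cutoffs.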
Dzurová, Lenka; Forneris, Federico; Savino, Simone; Galuszka, Petr; Vrabka, Josef; Frébort, Ivo
2015-08-01
The recently discovered cytokinin (CK)-specific phosphoribohydrolase "Lonely Guy" (LOG) is a key enzyme of CK biosynthesis, converting inactive CK nucleotides into biologically active free bases. We have determined the crystal structures of LOG from Claviceps purpurea (cpLOG) and its complex with the enzymatic product phosphoribose. The structures reveal a dimeric arrangement of Rossmann folds, with the ligands bound to large pockets at the interface between cpLOG monomers. Structural comparisons highlight the homology of cpLOG to putative lysine decarboxylases. Extended sequence analysis enabled identification of a distinguishing LOG sequence signature. Taken together, our data suggest phosphoribohydrolase activity for several proteins of unknown function.
Wagner, Bjoern; Fischer, Holger; Kansy, Manfred; Seelig, Anna; Assmus, Frauke
2015-02-20
Here we present a miniaturized assay, referred to as the Carrier-Mediated Distribution System (CAMDIS), for fast and reliable measurement of octanol/water distribution coefficients, log D(oct). By introducing a filter support for octanol, phase separation from water is facilitated and the tendency toward emulsion formation at the interface is reduced. A guideline for the best practice of CAMDIS is given, describing a strategy to manage drug adsorption at the filter-supported octanol/buffer interface. We validated the assay on a set of 52 structurally diverse drugs with known shake-flask log D(oct) values. Excellent agreement with literature data (r(2) = 0.996, standard error of estimate, SEE = 0.111), high reproducibility (standard deviation, SD < 0.1 log D(oct) units), minimal sample consumption (10 μL of 100 μM DMSO stock solution) and a broad analytical range (log D(oct) range = -0.5 to 4.2) make CAMDIS a valuable tool for the high-throughput assessment of log D(oct). Copyright © 2014 Elsevier B.V. All rights reserved.
General Chemistry Division. Quarterly report, July--September 1978
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrar, J.E.
1978-11-17
Status of the following studies is given: nonaqueous titrimetry; molar absorbance of 1,3,5,-triamine-2,4,6,-trinitrobenzene in dimethylsulfoxide, potentiometric microdetermination of pentaerythritol tetranitrate (PETN) in PETN-containing composites; potentiometric semimicrodetermination of some tetrazoles with silver nitrate; applications of a mode-locked krypton ion laser; time-resolved spectroscopy; photoelectrochemistry; evaluation of a prototype atomic emission source system; laser spectroscopy of neptunium; high-performance liquid chromatography of polyphenyl ether; acquisition of a portable, computerized mass spectrometer; improved inlet for quantitative mass spectrometry; a computer data system for the UTI gas analyzers; analysis of perfluorobutene-2; examination of iridium coatings; source of high-intensity, polarized x rays for fluorescence analysis; mass spectrometer for the coal gasification field test; materials protection measurement guides; the LOG system of sample file control; and methylation of platinum compounds by methylcobalamin. (LK)
VizieR Online Data Catalog: Distances to RRab stars from WISE and Gaia (Sesar+, 2017)
NASA Astrophysics Data System (ADS)
Sesar, B.; Fouesneau, M.; Price-Whelan, A. M.; Bailer-Jones, C. A. L.; Gould, A.; Rix, H.-W.
2017-10-01
To constrain the period-luminosity-metallicity (PLZ) relations for RR Lyrae stars in WISE W1 and W2 bands, we use TGAS trigonometric parallaxes (ϖ), spectroscopic metallicities ([Fe/H]; Fernley+ 1998, J/A+A/330/515), log-periods (logP, base 10), and apparent magnitudes (m; Klein+ 2014, J/MNRAS/440/L96) for 102 RRab stars within ~2.5kpc from the Sun. The E(B-V) reddening at a star's position is obtained from the Schlegel+ (1998ApJ...500..525S) dust map. (1 data file).
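The basic geometric step behind such a calibration, turning a trigonometric parallax and an apparent magnitude into an absolute magnitude, can be sketched as below. This ignores extinction and the probabilistic treatment of noisy parallaxes that the authors actually employ; it is the idealized noiseless relation only.

```python
import math

# Absolute magnitude from apparent magnitude m and parallax in mas:
# M = m + 5*log10(parallax_mas) - 10
# (equivalent to M = m - 5*log10(d/10pc) with d_pc = 1000/parallax_mas)
def absolute_magnitude(m, parallax_mas):
    return m + 5.0 * math.log10(parallax_mas) - 10.0

# A star at 1 kpc (parallax = 1 mas) has M = m - 10
print(absolute_magnitude(10.0, 1.0))  # → 0.0
```

A PLZ fit then regresses such absolute magnitudes (per band) against logP and [Fe/H] over the calibration sample.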
XML Flight/Ground Data Dictionary Management
NASA Technical Reports Server (NTRS)
Wright, Jesse; Wiklow, Colette
2007-01-01
A computer program generates Extensible Markup Language (XML) files that effect coupling between the command- and telemetry-handling software running aboard a spacecraft and the corresponding software running in ground support systems. The XML files are produced by use of information from the flight software and from flight-system engineering. The XML files are converted to legacy ground-system data formats for command and telemetry, transformed into Web-based and printed documentation, and used in developing new ground-system data-handling software. Previously, the information about telemetry and command was scattered in various paper documents that were not synchronized. The process of searching and reading the documents was time-consuming and introduced errors. In contrast, the XML files contain all of the information in one place. XML structures can evolve in such a manner as to enable the addition, to the XML files, of the metadata necessary to track the changes and the associated documentation. The use of this software has reduced the extent of manual operations in developing a ground data system, thereby saving considerable time and removing errors that previously arose in the translation and transcription of software information from the flight to the ground system.
Senathirajah, Yalini; Kaufman, David; Bakken, Suzanne
2016-01-01
Challenges in the design of electronic health records (EHRs) include designing usable systems that must meet the complex, rapidly changing, and high-stakes information needs of clinicians. The ability to move and assemble elements together on the same page has significant human-computer interaction (HCI) and efficiency advantages, and can mitigate the problems of negotiating multiple fixed screens and the associated cognitive burdens. We compare MedWISE, a novel EHR that supports user-composable displays, with a conventional EHR in terms of the number of repeat views of data elements during patient case appraisal. The study used a mixed-methods examination of clinical data viewing in four patient cases, comparing an experimental user-composable EHR with a conventional EHR for case appraisal. Eleven clinicians used the user-composable EHR in a case appraisal task in the laboratory setting. This was compared with log file analysis of the same patient cases in the conventional EHR. We investigated the number of repeat views of the same clinical information during a session and across these two contexts, and compared them using Fisher's exact test. There was a significant difference (p<.0001) in the proportion of cases with repeat data element viewing between the user-composable EHR (14.6 percent) and the conventional EHR (72.6 percent). Users of conventional EHRs repeatedly viewed the same information elements in the same session, as revealed by log files. Our findings are consistent with the hypothesis that conventional systems require the user to view many screens and remember information between screens, causing the user to forget information and to have to access it a second time. Other mechanisms (such as reduction in navigation over a population of users due to interface sharing, and information selection) may also contribute to increased efficiency in the experimental system.
Systems that allow a composable approach that enables the user to gather together on the same screen any desired information elements may confer cognitive support benefits that can increase productive use of systems by reducing fragmented information. By reducing cognitive overload, it can also enhance the user experience.
The evolution of the FIGARO data reduction system
NASA Technical Reports Server (NTRS)
Shortridge, K.
1992-01-01
The Figaro data reduction system originated at Caltech around 1983. It was based on concepts being developed in the U.K. by the Starlink organization, particularly the use of hierarchical self-defining data structures and the abstraction of most user-interaction into a set of 'parameter system' routines. Since 1984 it has continued to be developed at AAO, in collaboration with Starlink and Caltech. It was adopted as Starlink's main spectroscopic data reduction package, although it is by no means limited to spectra; it has operations for images and data cubes and even a few (very specialized) for four-dimensional data hypercubes. It continues to be used at Caltech and will be used at the Keck. It is also in use at a variety of other organizations around the world. Figaro was originally a system for VMS Vaxes. Recently it was ported (at Caltech) to run on SUNs, and work is underway at the University of New South Wales on a DECstation version. It is hoped to coordinate all this work into a unified release, but coordination of the development of a system by organizations covering three continents poses a number of interesting administrative problems. The hierarchical data structures used by Figaro allow it to handle a variety of types of data, and to add new items to data structures. Error and data quality information was added to the basic file format used, error information being particularly useful for infrared data. Cooperating sets of programs can add specific sub-structures to data files to carry information that they understand (polarimetry data containing multiple data arrays, for example), without this affecting the way other programs handle the files. Complex instrument-specific ancillary information can be added to data files written at a telescope and can be used by programs that understand the instrumental details in order to produce properly calibrated data files.
Once this preliminary data processing was done the resulting files contain 'ordinary' spectra or images that can be processed by programs that are not instrument-specific. The structures holding the instrumental information can then be discarded from the files. Much effort has gone into trying to make it easy to write Figaro programs; data access subroutines are now available to handle access to all the conventional items found in Figaro files (main data arrays, error information, quality information etc), and programs that only need to access such items can be very simple indeed. A large number of Figaro users do indeed write their own Figaro applications using these routines. The fact that Figaro programs are written as callable subroutines getting information from the user through a small set of parameter routines means that they can be invoked in numerous ways; they are normally linked and run as individual programs (called by a small main routine that is generated automatically), but are also available linked to run under the ADAM data acquisition system and there is an interface that lets them be called as part of a user-written Fortran program. The long-term future of Figaro probably depends to a large extent on how successfully it manages the transition from being a VMS-only system to being a multi-platform system.
Log on to the Future: One School's Success Story.
ERIC Educational Resources Information Center
Hovenic, Ginger
This paper describes Clear View Elementary School's (California) successful experience with integrating technology into the curriculum. Since its inception seven years ago, the school has acquired 250 computers, networked them all on two central file servers, and computerized the library and trained all staff members to be proficient facilitators…
40 CFR 146.14 - Information to be considered by the Director.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., logging procedures, deviation checks, and a drilling, testing, and coring program; and (16) A certificate... information listed below which are current and accurate in the file. For a newly drilled Class I well, the..., construction, date drilled, location, depth, record of plugging and/or completion, and any additional...
ERIC Educational Resources Information Center
Descy, Don E.
1993-01-01
This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)
A Query Analysis of Consumer Health Information Retrieval
Hong, Yi; de la Cruz, Norberto; Barnas, Gary; Early, Eileen; Gillis, Rick
2002-01-01
The log files of the MCW HealthLink web site were analyzed to study users' needs for consumer health information and to gain a better understanding of the health topics users search for, the paths users typically take to find consumer health information, and ways to improve search effectiveness.
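This kind of query-log analysis can be sketched in miniature: pull search terms out of request query strings and count the most frequent topics. The log format and the `q` parameter name below are hypothetical, not those of the MCW HealthLink logs.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Count search terms from web-server log lines. Assumes (hypothetically)
# that searches appear as GET requests carrying a "q" query parameter.
def top_search_terms(log_lines, n=3):
    counts = Counter()
    for line in log_lines:
        try:
            # crude common-log-format split: the request is the quoted field
            request = line.split('"')[1]       # e.g. 'GET /search?q=asthma HTTP/1.0'
            path = request.split()[1]
        except IndexError:
            continue                           # malformed line, skip it
        for term in parse_qs(urlparse(path).query).get("q", []):
            counts[term.lower()] += 1
    return counts.most_common(n)

logs = [
    '1.2.3.4 - - [01/Jan/2002] "GET /search?q=asthma HTTP/1.0" 200 512',
    '1.2.3.5 - - [01/Jan/2002] "GET /search?q=diabetes HTTP/1.0" 200 512',
    '1.2.3.6 - - [01/Jan/2002] "GET /search?q=asthma HTTP/1.0" 200 512',
]
print(top_search_terms(logs))  # → [('asthma', 2), ('diabetes', 1)]
```

Path analysis (the sequences of pages users traverse before reaching a health topic) would extend this by grouping the same log lines per client session and ordering them by timestamp.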
The Internet and Technical Services: A Point Break Approach.
ERIC Educational Resources Information Center
McCombs, Gillian M.
1994-01-01
Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…
Library Web Proxy Use Survey Results.
ERIC Educational Resources Information Center
Murray, Peter E.
2001-01-01
Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)
NASA Technical Reports Server (NTRS)
Noll, Carey E.; Pearlman, Michael Reisman; Torrence, Mark H.
2013-01-01
Network stations provided system configuration documentation upon joining the ILRS. This information, found in the various site and system log files available on the ILRS website, is essential to the ILRS analysis centers, combination centers, and general user community. Therefore, it is imperative that the station personnel inform the ILRS community in a timely fashion when changes to the system occur. This poster provides some information about the various documentation that must be maintained. The ILRS network consists of over fifty global sites actively ranging to over sixty satellites as well as five lunar reflectors. Information about these stations is available on the ILRS website (http://ilrs.gsfc.nasa.gov/network/stations/index.html). The ILRS Analysis Centers must have current information about the stations and their system configuration in order to use their data in generation of derived products. However, not all information available on the ILRS website is as up-to-date as necessary for correct analysis of their data.
Kushniruk, Andre W; Borycki, Elizabeth M
2015-01-01
Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.
DSSTOX MASTER STRUCTURE-INDEX FILE: SDF FILE AND ...
The DSSTox Master Structure-Index File serves to consolidate, manage, and ensure quality and uniformity of the chemical and substance information spanning all DSSTox Structure Data Files, including those in development but not yet published separately on this website.
Exhaust heated hydrogen and oxygen producing catalytic converter for combustion engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schreiber, E.T.
1977-07-26
A steam generator is provided in operative association with a source of water and the exhaust system of a combustion engine including an air induction system provided with primary fuel inlet structure and supplemental fuel inlet structure. The steam generator derives its heat for converting water into steam from the exhaust system of the combustion engine and the steam generator includes a steam outlet communicated with and opening into one end of an elongated tubular housing disposed in good heat transfer relation with the exhaust system of the combustion engine and having a gas outlet at its other end communicated with the supplemental fuel inlet of the induction system. The tubular housing has iron filings disposed therein and is in such heat transfer relation with the exhaust system of the combustion engine so as to elevate the temperature of steam passing therethrough and to heat the iron filings to the extent that passage of the heated steam over the heated filings will result in hydrogen and oxygen gas being produced in the tubular housing for subsequent passage to the supplemental fuel inlet of the combustion engine induction system.
Streamlining CASTOR to manage the LHC data torrent
NASA Astrophysics Data System (ADS)
Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.
2014-06-01
This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.
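The log-aggregation idea behind this monitoring infrastructure can be sketched as a toy map/reduce count of messages per component and level. The real system runs on Hadoop/HBase over a messaging transport; the log format and component names below are invented for illustration.

```python
from collections import defaultdict

# Toy MapReduce-style aggregation: count log messages per (component, level).
# The log format "component LEVEL message..." is hypothetical.
def map_phase(line):
    component, level, _msg = line.split(" ", 2)
    yield (component, level), 1

def reduce_phase(pairs):
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = [
    "stager ERROR disk pool full",
    "stager INFO migration done",
    "tapeserver INFO mount ok",
    "stager ERROR disk pool full",
]
pairs = (kv for line in logs for kv in map_phase(line))
print(reduce_phase(pairs))
```

At ~1 kHz of log messages, the point of pushing this into a MapReduce framework is that the map phase parallelizes across log partitions while the reduce phase merges partial counts, enabling near-real-time aggregation and visualization.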
CaseLog: semantic network interface to a student computer-based patient record system.
Cimino, C.; Goldman, E. K.; Curtis, J. A.; Reichgott, M. J.
1993-01-01
We have developed a computer program called CaseLog, which serves as an exemplary, computer-based patient record (CPR) system. The program allows for the introduction of the students to issues unique to patient record systems. These include record security, unique patient identifiers, and the use of controlled vocabularies. A particularly challenging aspect of the development of this program was allowing for student entry of controlled vocabulary terms. There were four goals we wished to achieve: students should be able to find the terms they are looking for; once a term has been found, it should be easy to find contextually related terms; it should be easy to determine that a sought-for term is not in the vocabulary; and the structure of the vocabulary should be dynamically altered by contextual information to allow its use for a variety of purposes. We chose a semantic network for our vocabulary structure. Within the processing power of the equipment we were working with, we achieved our goals. This paper will describe the development of the vocabulary, the design of the CaseLog program, and the feedback from student users of the program. PMID:8130581
Production data in media systems and press front ends: capture, formats and database methods
NASA Astrophysics Data System (ADS)
Karttunen, Simo
1997-02-01
The nature, purpose and data presentation features of media jobs are analyzed in relation to the content, document, process and resource management in media production. Formats are the natural way of presenting, collecting and storing information, contents, document components and final documents. The state of the art and the trends in the media formats and production data are reviewed. The types and the amount of production data are listed, e.g. events, schedules, product descriptions, reports, visual support, quality, process states and color data. The data exchange must be vendor-neutral. Adequate infrastructure and system architecture are defined for production and media data. The roles of open servers and intranets are evaluated and their potential roles as future solutions are anticipated. The press front end is the part of print media production where large files dominate. The new output alternatives, i.e. film recorders, direct plate output (CTP and CTP-on-press) and digital, plateless printing lines need new workflow tools and very efficient file and format management. The paper analyzes the capture, formatting and storing of job files and respective production data, such as the event logs of the processes. Intranets, browsers, Java applets and open web servers will be used to capture production data, especially where intranets are used anyhow, or where several companies are networked to plan, design and use documents and printed products. User aspects of installing intranets are stressed, since there are numerous more traditional and more dedicated networking solutions on the market.
Purple L1 Milestone Review Panel GPFS Functionality and Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loewe, W E
2006-12-01
The GPFS deliverable for the Purple system requires the functionality and performance necessary for ASC I/O needs. The functionality includes POSIX and MPIIO compatibility, and multi-TB file capability across the entire machine. The bandwidth performance required is 122.15 GB/s, as necessary for productive and defensive I/O requirements, and the metadata performance requirement is 5,000 file stats per second. To determine success for this deliverable, several tools are employed. For functionality testing of POSIX, 10TB-files, and high-node-count capability, the parallel file system bandwidth performance test IOR is used. IOR is an MPI-coordinated application that can write and then read to a single shared file or to an individual file per process and check the data integrity of the file(s). The MPIIO functionality is tested with the MPIIO test suite from the MPICH library. Bandwidth performance is tested using IOR for the required 122.15 GB/s sustained write. All IOR tests are performed with data checking enabled. Metadata performance is tested after ''aging'' the file system with 80% data block usage and 20% inode usage. The fdtree metadata test is expected to create/remove a large directory/file structure in under 20 minutes, akin to interactive metadata usage. Multiple (10) instances of ''ls -lR'', each performing over 100K stats, are run concurrently in different large directories to demonstrate 5,000 stats/sec.
Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System
NASA Technical Reports Server (NTRS)
Shah, Ashwin
2003-01-01
This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, the interactive input generator for NESSUS, IPACS, and COBSTRAN computer codes have been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.
Knowledge Representation: A Brief Review.
ERIC Educational Resources Information Center
Vickery, B. C.
1986-01-01
Reviews different structures and techniques of knowledge representation: structure of database records and files, data structures in computer programming, syntactic and semantic structure of natural language, knowledge representation in artificial intelligence, and models of human memory. A prototype expert system that makes use of some of these…
Building Specialized Multilingual Lexical Graphs Using Community Resources
NASA Astrophysics Data System (ADS)
Daoud, Mohammad; Boitet, Christian; Kageura, Kyo; Kitamoto, Asanobu; Mangeot, Mathieu; Daoud, Daoud
We are describing methods for compiling domain-dedicated multilingual terminological data from various resources. We focus on collecting data from online community users as a main source; therefore, our approach depends on acquiring contributions from volunteers (explicit approach) and on analyzing users' behaviors to extract interesting patterns and facts (implicit approach). As a generic repository that can handle the collected multilingual terminological data, we describe the concept of dedicated Multilingual Preterminological Graphs (MPGs), and some automatic approaches for constructing them by analyzing the behavior of online community users. A Multilingual Preterminological Graph is a special lexical resource that contains a massive number of terms related to a special domain. We call it preterminological because it is a raw material that can be used to build a standardized terminological repository. Building such a graph is difficult using traditional approaches, as it needs huge efforts by domain specialists and terminologists. In our approach, we build such a graph by analyzing the access log files of the website of the community, and by finding the important terms that have been used to search in that website, and their associations with each other. We aim at making this graph a seed repository to which multilingual volunteers can contribute. We are experimenting with this approach in the Digital Silk Road Project. We have used its access log files since its beginning in 2003, and obtained an initial graph of around 116,000 terms. As an application, we used this graph to obtain a preterminological multilingual database that is serving a CLIR system for the DSR project.
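The graph-construction step can be sketched as a term co-occurrence graph built from logged search queries: nodes are terms, and edge weights count how often two terms appear in the same query. This weighting scheme is a simple illustration, not the authors' actual MPG algorithm.

```python
from collections import defaultdict
from itertools import combinations

# Build a simple term co-occurrence graph from search queries extracted
# from access logs. Illustrative stand-in for MPG construction only.
def build_graph(queries):
    edges = defaultdict(int)
    for query in queries:
        terms = sorted(set(query.lower().split()))
        for a, b in combinations(terms, 2):
            edges[(a, b)] += 1          # edge weight = co-occurrence count
    return dict(edges)

queries = ["silk road", "silk road map", "road map"]
print(build_graph(queries))
```

Frequently co-occurring term pairs suggest domain associations; volunteers could then attach translations to the strongest nodes and edges, seeding the multilingual repository.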
Design and development of an interactive medical teleconsultation system over the World Wide Web.
Bai, J; Zhang, Y; Dai, B
1998-06-01
The objective of the medical teleconsultation system presented in this paper is to demonstrate the use of the World Wide Web (WWW) for telemedicine and interactive medical information exchange. The system, which is developed based on Java, could provide several basic Java tools to fulfill the requirements of medical applications, including a file manager, data tool, bulletin board, and digital audio tool. The digital audio tool uses point-to-point structure to enable two physicians to communicate directly through voice. The others use multipoint structure. The file manager manages the medical images stored in the WWW information server, which come from a hospital database. The data tool supports cooperative operations on the medical data between the participating physicians. The bulletin board enables the users to discuss special cases by writing text on the board, send their personal or group diagnostic reports on the cases, and reorganize the reports and store them in its report file for later use. The system provides a hardware-independent platform for physicians to interact with one another as well as to access medical information over the WWW.
SITE TECHNOLOGY CAPSULE: GIS\\KEY ENVIRONMENTAL DATA MANAGEMENT SYSTEM
GIS/Key™ is a comprehensive environmental database management system that integrates site data and graphics, enabling the user to create geologic cross-sections; boring logs; potentiometric, isopleth, and structure maps; summary tables; and hydrographs. GIS/Key™ is menu-driven an...
Dynamic Non-Hierarchical File Systems for Exascale Storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Darrell E.; Miller, Ethan L
This constitutes the final report for “Dynamic Non-Hierarchical File Systems for Exascale Storage”. The ultimate goal of this project was to improve data management in scientific computing and high-end computing (HEC) applications, and to achieve this goal we proposed: to develop the first, HEC-targeted, file system featuring rich metadata and provenance collection, extreme scalability, and future storage hardware integration as core design goals, and to evaluate and develop a flexible non-hierarchical file system interface suitable for providing more powerful and intuitive data management interfaces to HEC and scientific computing users. Data management is swiftly becoming a serious problem in the scientific community – while copious amounts of data are good for obtaining results, finding the right data is often daunting and sometimes impossible. Scientists participating in a Department of Energy workshop noted that most of their time was spent “...finding, processing, organizing, and moving data and it’s going to get much worse”. Scientists should not be forced to become data mining experts in order to retrieve the data they want, nor should they be expected to remember the naming convention they used several years ago for a set of experiments they now wish to revisit. Ideally, locating the data you need would be as easy as browsing the web. Unfortunately, existing data management approaches are usually based on hierarchical naming, a 40 year-old technology designed to manage thousands of files, not exabytes of data. Today’s systems do not take advantage of the rich array of metadata that current high-end computing (HEC) file systems can gather, including content-based metadata and provenance information. As a result, current metadata search approaches are typically ad hoc and often work by providing a parallel management system to the “main” file system, as is done in Linux (the locate utility), personal computers, and enterprise search appliances.
These search applications are often optimized for a single file system, making it difficult to move files and their metadata between file systems. Users have tried to solve this problem in several ways, including the use of separate databases to index file properties, the encoding of file properties into file names, and separately gathering and managing provenance data, but none of these approaches has worked well, either due to limited usefulness or scalability, or both. Our research addressed several key issues: High-performance, real-time metadata harvesting: extracting important attributes from files dynamically and immediately updating indexes used to improve search; Transparent, automatic, and secure provenance capture: recording the data inputs and processing steps used in the production of each file in the system; Scalable indexing: indexes that are optimized for integration with the file system; Dynamic file system structure: our approach provides dynamic directories similar to those in semantic file systems, but these are the native organization rather than a feature grafted onto a conventional system. In addition to these goals, our research effort included evaluating the impact of new storage technologies on the file system design and performance. In particular, the indexing and metadata harvesting functions can potentially benefit from the performance improvements promised by new storage class memories.
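The scalable-indexing idea can be sketched as an inverted index over harvested file attributes, so that attribute queries avoid a full namespace walk. The class, paths, and attribute names below are hypothetical, invented for illustration.

```python
from collections import defaultdict

# Minimal sketch of a file-metadata inverted index: each harvested
# (attribute, value) pair maps back to the set of files carrying it,
# so conjunctive attribute queries become set intersections.
class MetadataIndex:
    def __init__(self):
        self.index = defaultdict(set)   # (attr, value) -> {paths}

    def harvest(self, path, attrs):
        """Record a file's harvested attributes in the index."""
        for item in attrs.items():
            self.index[item].add(path)

    def query(self, **attrs):
        """Return paths matching all given attribute=value constraints."""
        sets = [self.index[item] for item in attrs.items()]
        return set.intersection(*sets) if sets else set()

idx = MetadataIndex()
idx.harvest("/data/run1.h5", {"owner": "alice", "type": "hdf5"})
idx.harvest("/data/run2.h5", {"owner": "bob", "type": "hdf5"})
print(sorted(idx.query(type="hdf5", owner="alice")))  # → ['/data/run1.h5']
```

A production design would keep such indexes inside the file system and update them as metadata is harvested in real time, rather than maintaining them as a separate parallel database.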
Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.
Visualization of usability and functionality of a professional website through web-mining.
Jones, Josette F; Mahoui, Malika; Gopa, Venkata Devi Pragna
2007-10-11
Functional interface design requires understanding of the information system structure and the user. Web logs record user interactions with the interface, and thus provide some insight into user search behavior and the efficiency of the search process. The present study uses a data-mining approach with techniques such as association rules, clustering, and classification to visualize the usability and functionality of a digital library through in-depth analyses of web logs.
Non-volatile main memory management methods based on a file system.
Oikawa, Shuichi
2014-01-01
There are upcoming non-volatile (NV) memory technologies that provide byte addressability and high performance; PCM, MRAM, and STT-RAM are such examples. Such NV memory can be used as storage because of its data persistency without power supply, while it can be used as main memory because of its high performance, which matches that of DRAM. A number of studies have investigated its use for main memory and storage; they were, however, conducted independently. This paper presents methods that enable the integration of main memory and file system management for NV memory. Such integration allows NV memory to be utilized simultaneously as both main memory and storage. The presented methods use a file system as their basis for NV memory management. We implemented the proposed methods in the Linux kernel and performed the evaluation on the QEMU system emulator. The evaluation results show that 1) the proposed methods can perform comparably to the existing DRAM memory allocator and significantly better than page swapping, 2) their performance is affected by the internal data structures of a file system, and 3) the data structures appropriate for traditional hard disk drives do not always work effectively for byte-addressable NV memory. We also evaluated the effects caused by the longer access latency of NV memory by cycle-accurate full-system simulation. The results show that the effect on page allocation cost is limited if the increase in latency is moderate.
Critiquing 'pore connectivity' as a basis for in situ flow in geothermal systems
NASA Astrophysics Data System (ADS)
Kenedi, C. L.; Leary, P.; Malin, P.
2013-12-01
Geothermal system in situ flow systematics derived from detailed examination of grain-scale structures, fabrics, mineral alteration, and pore connectivity may be extremely misleading if/when extrapolated to reservoir-scale flow structure. In oil/gas field clastic reservoir operations, it is standard to assume that small scale studies of flow fabric - notably the Kozeny-Carman and Archie's Law treatments at the grain-scale and well-log/well-bore sampling of formations/reservoirs at the cm-m scale - are adequate to define the reservoir-scale flow properties. In the case of clastic reservoirs, however, a wide range of reservoir-scale data wholly discredits this extrapolation: Well-log data show that grain-scale fracture density fluctuation power scales inversely with spatial frequency k, S(k) ~ 1/k^β, 1.0 < β < 1.2, 1 cycle/km < k < 1 cycle/cm; the scaling is a 'universal' feature of well-logs (neutron porosity, sonic velocity, chemical abundance, mass density, resistivity, in many forms of clastic rock and instances of shale bodies, for both horizontal and vertical wells). Grain-scale fracture density correlates with in situ porosity; spatial fluctuations of porosity φ in well-core correlate with spatial fluctuations in the logarithm of well-core permeability, δφ ~ δlog(κ), with typical correlation coefficient ~ 85%; a similar relation is observed in consolidating sediments/clays, indicating a generic coupling between fluid pressure and solid deformation at pore sites. In situ macroscopic flow systems are lognormally distributed according to κ ~ κ0 exp(α(φ-φ0)), with α >> 1 an empirical parameter for the degree of in situ fracture connectivity; the lognormal distribution applies to well-productivities in US oil fields and NZ geothermal fields, 'frack productivity' in oil/gas shale body reservoirs, ore grade distributions, and trace element abundances.
Although presently available evidence for these properties in geothermal reservoirs is limited, there are indications that geothermal system flow essentially obeys the same 'universal' in situ flow rules as does clastic rock: Well-log data from Los Azufres, MX, show power-law scaling S(k) ~ 1/k^β, 1.2 < β < 1.4, for the spatial frequency range 2 cycles/km to 0.5 cycle/m (higher β-values are likely due to the relatively fresh nature of geothermal systems); well-core at Bulalo (PH) and Ohaaki (NZ) shows statistically significant spatial correlation, δφ ~ δlog(κ); well productivities at Ohaaki/Ngawha (NZ) and in geothermal systems elsewhere are lognormally distributed; and K/Th/U abundances are lognormally distributed in Los Azufres well-logs. We therefore caution that small-scale evidence for in situ flow fabric in geothermal systems that is interpreted in terms of 'pore connectivity' may in fact not reflect how small-scale chemical processes are integrated into a large-scale geothermal flow structure. Rather, such small scale studies should (perhaps) be considered in terms of the above flow rules. These flow rules are easily incorporated into standard flow simulation codes, in particular the OPM (Open Porous Media) open-source industry-standard flow code. Geochemical transport data relevant to geothermal systems can thus be expected to be well modeled by OPM or equivalent (e.g., INL/LANL) codes.
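The lognormal flow rule quoted in this abstract, κ ~ κ0 exp(α(φ-φ0)), implies that normally distributed porosity fluctuations map to lognormally distributed permeability. A minimal sketch of that mapping follows; the parameter values (κ0, α, φ0, and the porosity spread) are illustrative assumptions, not values from the abstract:

```python
import math
import random

def permeability(phi, kappa0=1.0, alpha=25.0, phi0=0.15):
    """kappa ~ kappa0 * exp(alpha * (phi - phi0)); alpha >> 1 encodes
    the degree of in situ fracture connectivity (illustrative values)."""
    return kappa0 * math.exp(alpha * (phi - phi0))

random.seed(0)
# Normally distributed porosity fluctuations about phi0 ...
phis = [random.gauss(0.15, 0.02) for _ in range(10_000)]
# ... map through the exponential rule to a lognormal permeability field.
kappas = [permeability(p) for p in phis]
log_kappas = [math.log(k) for k in kappas]
mean_log = sum(log_kappas) / len(log_kappas)
print(f"mean log-permeability ~ {mean_log:.3f}")  # near 0 by construction
```

Because log κ is linear in φ, fitting a normal distribution to log κ (rather than to κ itself) is the natural way to test this rule against well-productivity data.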
Forensic Investigation of Cooperative Storage Cloud Service: Symform as a Case Study.
Teing, Yee-Yang; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Dargahi, Tooska; Conti, Mauro
2017-05-01
Researchers envisioned Storage as a Service (StaaS) as an effective solution to the distributed management of digital data. Cooperative storage cloud forensic is relatively new and is an under-explored area of research. Using Symform as a case study, we seek to determine the data remnants from the use of cooperative cloud storage services. In particular, we consider both mobile devices and personal computers running various popular operating systems, namely Windows 8.1, Mac OS X Mavericks 10.9.5, Ubuntu 14.04.1 LTS, iOS 7.1.2, and Android KitKat 4.4.4. Potential artefacts recovered during the research include data relating to the installation and uninstallation of the cloud applications, log-in to and log-out from Symform account using the client application, file synchronization as well as their time stamp information. This research contributes to an in-depth understanding of the types of terrestrial artifacts that are likely to remain after the use of cooperative storage cloud on client devices. © 2016 American Academy of Forensic Sciences.
Production, prices, employment, and trade in Northwest forest industries, third quarter 1996.
Debra D. Warren
1997-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2000.
Debra D. Warren
2002-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2002.
Debra D. Warren
2004-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2005.
Debra D. Warren
2007-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2006.
Debra D. Warren
2008-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2004.
Debra D. Warren
2006-01-01
Provides current information on lumber and plywood production and prices; employment in forest industries; international trade in logs, lumber, and plywood; volumes and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Voting with Their Seats: Computer Laboratory Design and the Casual User
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…
Using Learning Styles and Viewing Styles in Streaming Video
ERIC Educational Resources Information Center
de Boer, Jelle; Kommers, Piet A. M.; de Brock, Bert
2011-01-01
Improving the effectiveness of learning when students observe video lectures becomes urgent with the rising advent of (web-based) video materials. Vital questions are how students differ in their learning preferences and what patterns in viewing video can be detected in log files. Our experiments inventory students' viewing patterns while watching…
Motivational Aspects of Learning Genetics with Interactive Multimedia
ERIC Educational Resources Information Center
Tsui, Chi-Yan; Treagust, David F.
2004-01-01
A BioLogica trial conducted by the Concord Consortium in six U.S. schools used an interpretive approach to examine student motivation in learning genetics. Multiple data sources, such as online tests, computer data log files, and classroom observations, were used; the results are reported in terms of interviewees' perceptions, class-wide online…
16. Photocopy of photograph (4 x 5 inch reduction of ...
16. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-5/8 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, NORTHEAST CORNER, INTERPRETIVE LOG TO LEFT. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA
Production, prices, employment, and trade in Northwest forest industries, all quarters 1998.
Debra D. Warren
2000-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, fourth quarter 1996.
Debra D. Warren
1997-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters of 2007.
Debra D. Warren
2008-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2003.
Debra D. Warren
2005-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Production, prices, employment, and trade in Northwest forest industries, all quarters 2008
Debra Warren
2009-01-01
Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at
Data Retention Policy | High-Performance Computing | NREL
HPC Data Retention Policy. File storage areas on Peregrine and Gyrfalcon are either user-centric or project-centric, and are periodically purged to reclaim storage. We can make special arrangements for permanent storage, if needed. The user-centric retention period is 3 months after the last project ends. During this retention period, the user may log in to
Patterns in Elementary School Students' Strategic Actions in Varying Learning Situations
ERIC Educational Resources Information Center
Malmberg, Jonna; Järvenoja, Hanna; Järvelä, Sanna
2013-01-01
This study uses log file traces to examine differences between high-and low-achieving students' strategic actions in varying learning situations. In addition, this study illustrates, in detail, what strategic and self-regulated learning constitutes in practice. The study investigates the learning patterns that emerge in learning situations…
Online Persistence in Higher Education Web-Supported Courses
ERIC Educational Resources Information Center
Hershkovitz, Arnon; Nachmias, Rafi
2011-01-01
This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…
78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-16
... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...
DeWitt, Nancy T.; Flocks, James G.; Pfeiffer, William R.; Wiese, Dana S.
2010-01-01
In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys east of Cat Island, Mississippi (fig. 1). The efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geological stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and provide protection for the historical Fort Massachusetts. For more information refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar (SSS) data. Data products herein include gridded and interpolated surfaces, surface images, and x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report, or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity.
For example, 10CCT01 tells us the data were collected in 2010 for the Coastal Change and Transport (CCT) study and the data were collected during the first field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay Catamaran. Side scan sonar and interferometric swath bathymetry data were collected simultaneously along the tracklines. The side scan sonar towfish was towed off the port side just slightly behind the vessel, close to the seafloor. The interferometric swath transducer was sled-mounted on a rail attached between the catamaran hulls. During the survey the sled is secured into position. Navigation was acquired with a CodaOctopus Octopus F190 Precision Attitude and Positioning System and differentially corrected with OmniSTAR. See the digital FACS equipment log for details about the acquisition equipment used. Both raw datasets were stored digitally and processed using CARIS HIPS and SIPS software at the USGS St. Petersburg Coastal and Marine Science Center. For more information on processing refer to the Equipment and Processing page. Post-processing of the swath dataset revealed a motion artifact that is attributed to movement of the pole that the swath transducers are attached to in relation to the boat. The survey took place in the winter months, in which strong winds and rough waves contributed to a reduction in data quality. The rough seas contributed to both the movement of the pole and the very high noise base seen in the raw amplitude data of the side scan sonar. Chirp data were also collected during this survey and are archived separately.
Rapid Diagnostics of Onboard Sequences
NASA Technical Reports Server (NTRS)
Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.
2012-01-01
Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides an auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF.
The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a RESTful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline, where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
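The audit scheme this abstract describes, indexing each generated SCMF by a hash that later appears in command EVRs, can be sketched with an in-memory store standing in for the MySQL database. The hash algorithm, key length, and field names below are illustrative assumptions, not the actual MSL implementation:

```python
import hashlib

# Illustrative in-memory stand-in for the MySQL-backed sequence store.
_store: dict[str, dict] = {}

def log_scmf(scmf_bytes: bytes, rml_source: str, dictionary: str) -> str:
    """Index a generated SCMF by a content hash, as the sequencing server
    does; the same hash is what later shows up in command EVRs."""
    key = hashlib.sha256(scmf_bytes).hexdigest()[:16]
    _store[key] = {"scmf": scmf_bytes, "rml": rml_source, "dict": dictionary}
    return key

def lookup(key: str) -> dict:
    """Given the hash from an EVR, recover the binary SCMF and its RML
    input (the real system exposes this through a RESTful web interface)."""
    return _store[key]

h = log_scmf(b"\x01\x02", "<sequence/>", "msl_dict_v1")
assert lookup(h)["rml"] == "<sequence/>"
```

Content hashing makes the mapping from EVR to sequence version unambiguous: two different compilations of the "same" named sequence produce different keys.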
Feng, Yingang
2017-01-01
The use of NMR methods to determine the three-dimensional structures of carbohydrates and glycoproteins is still challenging, in part because of the lack of standard protocols. In order to increase the convenience of structure determination, the topology and parameter files for carbohydrates in the program Crystallography & NMR System (CNS) were investigated and new files were developed to be compatible with the standard simulated annealing protocols for proteins and nucleic acids. Recalculating the published structures of protein-carbohydrate complexes and glycosylated proteins demonstrates that the results are comparable to the published structures which employed more complex procedures for structure calculation. Integrating the new carbohydrate parameters into the standard structure calculation protocol will facilitate three-dimensional structural study of carbohydrates and glycosylated proteins by NMR spectroscopy.
PMID: 29232406
FT-IR, FT-Raman spectra and DFT calculations of melaminium perchlorate monohydrate.
Kanagathara, N; Marchewka, M K; Drozd, M; Renganathan, N G; Gunasekaran, S; Anbalagan, G
2013-08-01
Melaminium perchlorate monohydrate (MPM), an organic material, has been synthesized by the slow solvent evaporation method at room temperature. Powder X-ray diffraction analysis confirms that the MPM crystal belongs to the triclinic system with space group P-1. FT-IR and FT-Raman spectra were recorded at room temperature. Functional group assignments have been made for the melaminium cations and perchlorate anions. Vibrational spectra have also been discussed on the basis of quantum chemical density functional theory (DFT) calculations using Firefly (PC GAMESS) version 7.1 G. Vibrational frequencies are calculated and the scaled values are compared with experimental values. The assignment of the bands has been made on the basis of the calculated PED. The Mulliken charges and HOMO-LUMO orbital energies are taken directly from the Firefly program log files and graphically illustrated. The HOMO-LUMO energy gap and other related molecular properties are also calculated. The theoretically constructed FT-IR and FT-Raman spectra of MPM coincide with the experimental ones. The chemical structure of the compound has been established by (1)H and (13)C NMR spectra. No detectable signal was observed during the powder test for second harmonic generation. Copyright © 2013 Elsevier B.V. All rights reserved.
VizieR Online Data Catalog: Bessel (1825) calculation for geodesic measurements (Karney+, 2010)
NASA Astrophysics Data System (ADS)
Karney, C. F. F.; Deakin, R. E.
2010-06-01
The solution of the geodesic problem for an oblate ellipsoid is developed in terms of series. Tables are provided to simplify the computation. Included here are the tables that accompanied Bessel's paper (with corrections). The tables were crafted by Bessel to minimize the labor of hand calculations. To this end, he adjusted the intervals in the tables, the number of terms included in the series, and the number of significant digits given so that the final results are accurate to about 8 places. For that reason, the most useful form of the tables is the PDF file, which provides the tables in a layout close to the original. Also provided is the LaTeX source file for the PDF file. Finally, the data have been put into a format so that they can be read easily by computer programs. All the logarithms are in base 10 (common logarithms). The characteristic and the mantissa should be read separately (indicated as x.c and x.m in the file description). Thus the first entry in the table, -4.4, should be parsed as "-4" (the characteristic) and ".4" (the mantissa); the anti-log for this entry is 10^(-4+0.4) = 2.5e-4. The "Delta" columns give the first difference of the preceding column, i.e., the difference between the preceding column in the next row and the preceding column in the current row. In the printed tables these are expressed as "units in the last place" and the differences are of the rounded representations in the preceding columns (to minimize interpolation errors). In table1.dat these are given scaled to match the format used for the preceding column, as indicated by the units given for these columns. The unit log(") (in the description within square brackets [arcsec]) means the logarithm of a quantity expressed in arcseconds. (3 data files).
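The parsing rule described above (read characteristic and mantissa separately, then take the anti-log) can be sketched as follows; the function name is illustrative, not part of the catalog:

```python
def antilog_base10(characteristic: float, mantissa: float) -> float:
    """Recover a quantity from a split common-logarithm table entry.

    The table stores e.g. -4.4 as characteristic -4 and mantissa .4;
    the quantity itself is 10**(characteristic + mantissa).
    """
    return 10.0 ** (characteristic + mantissa)

# First entry in the table, -4.4, parsed as characteristic -4, mantissa 0.4:
value = antilog_base10(-4, 0.4)  # 10^(-3.6), i.e. about 2.5e-4
```

Note that the characteristic and mantissa are simply added; the convention mirrors printed log tables, where a negative characteristic sits next to a positive mantissa.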
GIS/Key™ ENVIRONMENTAL DATA MANAGEMENT SYSTEM - INNOVATIVE TECHNOLOGY EVALUATION REPORT
GIS/Key™ is a comprehensive environmental database management system that integrates site data and graphics, enabling the user to create geologic cross-sections; boring logs; potentiometric, isopleth, and structure maps; summary tables; and hydrographs. GIS/Key™ is menu-driven an...
Arthur, J.K.; Taylor, R.E.
1986-01-01
As part of the Gulf Coast Regional Aquifer System Analysis (GC RASA) study, data from 184 geophysical well logs were used to define the geohydrologic framework of the Mississippi embayment aquifer system in Mississippi for flow model simulation. Five major aquifers of Eocene and Paleocene age were defined within this aquifer system in Mississippi. A computer data storage system was established to assimilate the information obtained from the geophysical logs. Computer programs were developed to manipulate the data to construct geologic sections and structure maps. Data from the storage system will be input to a five-layer, three-dimensional, finite-difference digital computer model that is used to simulate the flow dynamics in the five major aquifers of the Mississippi embayment aquifer system.
Durand, C.T.; Edwards, L.E.; Malinconico, M.L.; Powars, D.S.
2009-01-01
During 2005-2006, the International Continental Scientific Drilling Program and the U.S. Geological Survey drilled three continuous core holes into the Chesapeake Bay impact structure to a total depth of 1766.3 m. A collection of supplemental materials that presents a record of the core recovery and measurement data for the Eyreville cores is available on CD-ROM at the end of this volume and in the GSA Data Repository. The supplemental materials on the CD-ROM include digital photographs of each core box from the three core holes, tables of the three coring-run logs, as recorded on site, and a set of depth-conversion programs. In this chapter, the contents, purposes, and basic applications of the supplemental materials are briefly described. With this information, users can quickly decide if the materials will apply to their specific research needs. ?? 2009 The Geological Society of America.
Measurement, Modeling, and Analysis of a Large-scale Blog Server Workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeon, Myeongjae; Hwang, Jeaho; Kim, Youngjae
2010-01-01
Despite the growing popularity of Online Social Networks (OSNs), the workload characteristics of OSN servers, such as those hosting blog services, are not well understood. Understanding workload characteristics is important for optimizing and improving the performance of current systems and software based on observed trends. Thus, in this paper, we characterize the system workload of the largest blog hosting servers in South Korea, Tistory. In addition to understanding the system workload of the blog hosting server, we have developed synthesized workloads and obtained the following major findings: (i) the transfer size of non-multimedia files and blog articles can be modeled by a truncated Pareto distribution and a log-normal distribution respectively, and (ii) users' accesses to blog articles do not show temporal locality, but they are strongly biased toward those posted along with images or audio.
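The two fitted distributions named in the findings, a truncated Pareto for non-multimedia transfer sizes and a log-normal for blog-article sizes, can be sampled for workload synthesis as sketched below. All parameter values are illustrative, not the fitted Tistory values:

```python
import math
import random

def truncated_pareto(alpha, lo, hi, u):
    """Inverse-CDF sample of a Pareto(alpha) truncated to [lo, hi],
    given a uniform draw u in [0, 1)."""
    a = lo ** -alpha
    b = hi ** -alpha
    return (a - u * (a - b)) ** (-1.0 / alpha)

def lognormal(mu, sigma, rng):
    """Log-normal sample: the exponential of a normal draw."""
    return math.exp(rng.gauss(mu, sigma))

rng = random.Random(42)
# Synthetic non-multimedia transfer sizes (bytes), truncated Pareto:
file_sizes = [truncated_pareto(1.2, 1e3, 1e7, rng.random()) for _ in range(5)]
# Synthetic blog-article sizes (bytes), log-normal:
article_sizes = [lognormal(8.0, 1.0, rng) for _ in range(5)]
```

The inverse-CDF form guarantees every synthetic file size stays inside the [lo, hi] truncation bounds, which plain Pareto sampling plus rejection would only achieve at extra cost.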
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-06-22
The Linac Coherent Light Source (LCLS) is required to deliver a high quality electron beam for producing coherent X-rays. As a result, high resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use the Experimental Physics and Industrial Control System (EPICS). This paper discusses the transition to providing similar as well as enhanced functionalities, compared with those offered by Matlab, to test the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.
Murmer, a message generator and reporter for Unix, VMS, and VxWorks
NASA Astrophysics Data System (ADS)
Oleynik, G.; Appleton, B.; Moore, C.; Sergey, G.; Udumula, L.
1994-02-01
Murmer is a Unix based message generation, reporting, display, and logging system that we have developed for use in data acquisition systems at Fermilab. Murmer is a tool for the production and management of message reporting. Its usefulness ranges from software product development and maintenance to system level shakedown and diagnostics. Murmer provides a VMS MESSAGE-like function code generation utility, a client routine package for sending these codes over the network to a central server, and a server which translates the codes into meaningful visual information, writes the information to a logfile, and displays it on B&W or color X windows. Because Murmer stores message information in keyed access files, it can provide advanced features such as popping up help when a displayed message is clicked on with the mouse and executing 'action' shell scripts when selected messages are received by the server.
NASA Astrophysics Data System (ADS)
Jurado, Maria Jose; Schleicher, Anja
2014-05-01
The objective of our research is a detailed characterization of structures on the basis of LWD oriented images and logs, and clay mineralogy of cuttings from Hole C0002F of the Nankai Trough accretionary prism. Our results show an integrated interpretation of structures derived from borehole images, petrophysical characterization on LWD logs and cuttings mineralogy. The geometry of the structure intersected at Hole C0002F has been characterized by the interpretation of oriented borehole resistivity images acquired during IODP Expedition 338. The characterization of structural features, faults and fracture zones is based on a detailed post-cruise interpretation of bedding and fractures on borehole images and also on the analysis of Logging While Drilling (LWD) log response (gamma radioactivity, resistivity and sonic logs). The interpretation and complete characterization of structures (fractures, fracture zones, fault zones, folds) was achieved after detailed shore-based reprocessing of resistivity images, which allowed us to enhance bedding and fracture imaging for geometry and orientation interpretation. To characterize distinctive petrophysical properties, the LWD log response was compared with compositional changes derived from cuttings analyses. Cuttings analyses were used to calibrate and to characterize log response and to verify interpretations in terms of changes in composition and texture at fractures and fault zones defined on borehole images. Cuttings were taken routinely every 5 m during Expedition 338, indicating a clay-dominated lithology of silty claystone with interbeds of weakly consolidated, fine sandstones. The main mineralogical components are clay minerals, quartz, feldspar and calcite. Selected cuttings were taken from areas of interest as defined on LWD logs and images. The clay mineralogy was investigated on the <2 micron clay-size fraction, with special focus on smectite and illite minerals.
Based on X-ray diffraction analysis measured at room temperature and a relative humidity of ~30%, we compared the shape and size of illite and smectite, as well as their water content and their polytypes. The comparison of cuttings mineralogy with logging while drilling (LWD) data allowed us to characterize structural, petrophysical and mineralogical properties at fracture and fault zones. We also analyzed the relationship between deformation structures and compositional and mineralogical changes. We established a correlation between observed results on clay mineralogy and log responses in relation to the structures and trends characterized on logging data. In general, the log data provide a good correlation with the actual mineralogy and the relative abundance of clay. In particular we analyzed trends characterized by smectite water layers as an indication of compaction. These trends were correlated with log response (on sonic velocity) within Unit IV. Our results show the integration of logging data and cuttings analyses as a valuable tool for characterization of petrophysical and mineralogical changes of the structures of the Nankai accretionary prism.
Profex: a graphical user interface for the Rietveld refinement program BGMN.
Doebelin, Nicola; Kleeberg, Reinhard
2015-10-01
Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.
Profex: a graphical user interface for the Rietveld refinement program BGMN
Doebelin, Nicola; Kleeberg, Reinhard
2015-01-01
Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems. PMID:26500466
Grid-wide neuroimaging data federation in the context of the NeuroLOG project
Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem
2010-01-01
Grid technologies are appealing to deal with the challenges raised by computational neurosciences and support multi-centric brain studies. However, core grid middleware hardly copes with the complex neuroimaging data representation and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot be simply superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431
Interpretation of well logs in a carbonate aquifer
MacCary, L.M.
1978-01-01
This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs.
With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh-water zones.
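As a concrete example of deriving porosity from one of these log types, the standard density-porosity relation (a textbook petrophysics formula, not specific to this report) is a one-line calculation. The default matrix and fluid densities below are the conventional limestone and fresh-water values.

```python
def density_porosity(rho_bulk, rho_matrix=2.71, rho_fluid=1.0):
    """Porosity from a density log:
        phi = (rho_ma - rho_b) / (rho_ma - rho_f)
    Defaults assume a limestone matrix (2.71 g/cm^3) and
    fresh-water pore fluid (1.0 g/cm^3)."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

# e.g. a bulk density reading of 2.37 g/cm^3 in clean limestone
phi = density_porosity(2.37)  # roughly 0.20, i.e. ~20% porosity
```

For mixed limestone-dolomite intervals, as in the report, the same relation is applied with a matrix density interpolated from the limestone-dolomite proportions inferred from the neutron and acoustic logs.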
Linking log quality with product performance
D. W. Green; Robert Ross
1997-01-01
In the United States, log grading procedures use visual assessment of defects, in relation to the log scaling diameter, to estimate the yield of lumber that may be expected from the log. This procedure was satisfactory when structural grades were based only on defect size and location. In recent years, however, structural products have increasingly been graded using a...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stathakis, S; Defoor, D; Linden, P
Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The hyperterminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the hyperterminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square (RMS) errors and cumulative MLC travel distance per motor. An in-house Matlab code was used to analyze all dynalog files, record RMS errors and calculate the distance each MLC traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 involved an MLC leaf motor change, 39 T-nut replacements, and 84 MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 accounted for 73% of all interventions, while the same leaves in the B bank accounted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500 m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: An MLC dynalog file analysis software tool was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC leaf travels, the RMS error and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and, with proper and timely intervention, reduce the downtime of the linac during clinical hours.
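The core of the analysis described, per-leaf RMS position error and cumulative travel, can be sketched as below. This is a simplified illustration over (leaf, planned, actual) position samples; the real Varian dynalog format has many more fields, and the authors' Matlab implementation differs.

```python
from collections import defaultdict

def analyze_dynalog(rows):
    """Given an iterable of (leaf_id, planned_mm, actual_mm) samples in
    time order, return per-leaf RMS position error and cumulative travel.
    Simplified stand-in for parsing real dynalog records."""
    sq_errors = defaultdict(list)
    travel = defaultdict(float)
    last_pos = {}
    for leaf, planned, actual in rows:
        sq_errors[leaf].append((planned - actual) ** 2)
        if leaf in last_pos:
            travel[leaf] += abs(actual - last_pos[leaf])
        last_pos[leaf] = actual
    rms = {leaf: (sum(e) / len(e)) ** 0.5 for leaf, e in sq_errors.items()}
    return rms, dict(travel)

# Hypothetical samples for a single central leaf
samples = [("A27", 10.0, 10.2), ("A27", 12.0, 11.9), ("A27", 8.0, 8.1)]
rms, travel = analyze_dynalog(samples)
```

Accumulating travel per leaf is what makes a wear metric like "motor change every ~1500 m" computable, and a rising RMS trend for a leaf can then be flagged as a predictor of impending motor failure.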
Correlations between chromatographic parameters and bioactivity predictors of potential herbicides.
Janicka, Małgorzata
2014-08-01
Different liquid chromatography techniques, including reversed-phase liquid chromatography on Purosphere RP-18e, IAM.PC.DD2 and Cosmosil Cholester columns and micellar liquid chromatography with a Purosphere RP-8e column and using buffered sodium dodecyl sulfate-acetonitrile as the mobile phase, were applied to study the lipophilic properties of 15 newly synthesized phenoxyacetic and carbamic acid derivatives, which are potential herbicides. The chromatographic lipophilicity descriptors used were extrapolated log k parameters (log kw and log km) and log k values. Partitioning lipophilicity descriptors, i.e., log P coefficients in an n-octanol-water system, were computed from the molecular structures of the tested compounds. Bioactivity descriptors, including partition coefficients in a water-plant cuticle system and a water-human serum albumin system and coefficients for human skin partition and permeation, were calculated in silico by ACD/ADME software using the linear solvation energy relationship of Abraham. Principal component analysis was applied to describe similarities between various chromatographic and partitioning lipophilicities. Highly significant, predictive linear relationships were found between chromatographic parameters and bioactivity descriptors.
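The extrapolation of log kw can be illustrated with the widely used linear retention model log k = log kw - S*phi, where phi is the organic modifier fraction: fitting measured log k values against phi and taking the intercept at phi = 0 gives the retention in pure water. The retention data below are hypothetical, not values from the paper.

```python
import numpy as np

def extrapolate_log_kw(phi, log_k):
    """Fit the linear model log k = log kw - S * phi and return
    (log_kw, S): the extrapolated pure-water retention and the slope."""
    slope, intercept = np.polyfit(phi, log_k, 1)
    return intercept, -slope

# Hypothetical isocratic measurements at several acetonitrile fractions
phi = np.array([0.3, 0.4, 0.5, 0.6])
log_k = np.array([1.10, 0.75, 0.40, 0.05])

log_kw, S = extrapolate_log_kw(phi, log_k)
```

The same least-squares procedure applies to micellar data when log k is regressed against micelle concentration to obtain log km.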
Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool
NASA Astrophysics Data System (ADS)
Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong
2016-06-01
The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool that is able to investigate strata in a relatively large volume of space around the borehole. The BAAR is designed based on the idea of modularization and has a very complex structure, so it became necessary to develop a dedicated test-bench system to debug each module of the BAAR. With the help of the test-bench system introduced in this paper, testing and calibration of the BAAR can be easily achieved. The test-bench system is designed based on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human-machine interface and processes the uploaded data. The software running on the host computer is developed based on VC++. The embedded controlling board uses an ARM7 microcontroller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed based on the operating system uClinux. The bus interface board, data acquisition board and telemetry communication board are designed based on field programmable gate arrays (FPGAs) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on the BAAR. By analyzing the test results, an unqualified channel of the electronic receiving cabin was discovered. The test-bench system can be used to quickly determine the working condition of the sub-modules of the BAAR, and it is of great significance in improving production efficiency and accelerating industrial production of the logging tool.