Sample records for log file aggregation

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
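
    The aggregation-plus-write-buffering idea described above can be illustrated with a small sketch (not the paper's implementation, which lives inside the I/O forwarding layer): many producers append small trace records, and a buffer flushes them to the backing store in a few large writes. The class name, buffer size, and output file below are illustrative assumptions.

    ```python
    import io
    import threading

    class BufferedLogAggregator:
        """Collect many small trace records and write them out in large chunks."""
        def __init__(self, sink, flush_bytes=1 << 20):
            self._sink = sink                  # any binary file-like object
            self._buffer = io.BytesIO()
            self._flush_bytes = flush_bytes
            self._lock = threading.Lock()      # several threads may append

        def append(self, record: bytes):
            with self._lock:
                self._buffer.write(record)
                if self._buffer.tell() >= self._flush_bytes:
                    self._flush_locked()

        def _flush_locked(self):
            self._sink.write(self._buffer.getvalue())
            self._buffer = io.BytesIO()

        def close(self):
            with self._lock:
                self._flush_locked()

    with open("trace.bin", "wb") as out:
        aggregator = BufferedLogAggregator(out, flush_bytes=4096)
        for i in range(1000):
            aggregator.append(f"event {i}\n".encode())
        aggregator.close()
    ```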

  2. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)
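
    The common log file mentioned above has a fixed field layout, so basic usage counts can be pulled out with a short script. The sketch below is an illustration only, not taken from the cited article; the file name access.log and the choice to count status codes and requested paths are assumptions, and referrer or agent logs would need a combined-log pattern instead.

    ```python
    import re
    from collections import Counter

    # Common Log Format: host ident authuser [date] "request" status bytes
    CLF_PATTERN = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
        r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
    )

    def summarize_access_log(path):
        """Count requests per status code and per requested path."""
        statuses, pages = Counter(), Counter()
        with open(path) as handle:
            for line in handle:
                match = CLF_PATTERN.match(line)
                if not match:
                    continue  # skip malformed lines rather than failing
                statuses[match.group("status")] += 1
                parts = match.group("request").split()
                if len(parts) >= 2:
                    pages[parts[1]] += 1
        return statuses, pages

    if __name__ == "__main__":
        status_counts, page_counts = summarize_access_log("access.log")
        print(status_counts.most_common(5))
        print(page_counts.most_common(5))
    ```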

  3. Small file aggregation in a parallel computing system

    DOEpatents

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy; Grider, Gary; Zhang, Jingwang

    2014-09-02

    Techniques are provided for small file aggregation in a parallel computing system. An exemplary method for storing a plurality of files generated by a plurality of processes in a parallel computing system comprises aggregating the plurality of files into a single aggregated file; and generating metadata for the single aggregated file. The metadata comprises an offset and a length of each of the plurality of files in the single aggregated file. The metadata can be used to unpack one or more of the files from the single aggregated file.
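
    The offset/length bookkeeping described in the abstract can be sketched in a few lines. This is only an illustration of the general idea, not the patented method; the JSON index format and the function names are assumptions.

    ```python
    import json
    import os

    def aggregate_files(paths, bundle_path, index_path):
        """Concatenate many small files into one bundle and record, for each
        input file, its offset and length inside the bundle."""
        index = {}
        with open(bundle_path, "wb") as bundle:
            for path in paths:
                with open(path, "rb") as src:
                    data = src.read()
                index[os.path.basename(path)] = {
                    "offset": bundle.tell(),
                    "length": len(data),
                }
                bundle.write(data)
        with open(index_path, "w") as meta:
            json.dump(index, meta)

    def unpack_file(bundle_path, index_path, name):
        """Recover one original file's bytes from the aggregated bundle."""
        with open(index_path) as meta:
            entry = json.load(meta)[name]
        with open(bundle_path, "rb") as bundle:
            bundle.seek(entry["offset"])
            return bundle.read(entry["length"])
    ```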

  4. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). Version 3.5, Quick Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  5. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  6. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta’s Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans – significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2mm dose resolution is recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.

  7. Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi

    2018-04-01

    The log file-based method cannot display dosimetric changes due to linac component miscalibration because log files are insensitive to such miscalibration. The purpose of this study was to quantify the dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases were included in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm into the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, regarding the planning target volume (PTV), the change from TPS dose to miscalibration-simulated log file dose in Dmean was 0.9 Gy and that for tumor control probability was 1.4%. As for organs at risk (OARs), the change in Dmean was <0.7 Gy and normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for PTV showed statistically significant differences in the changes evaluated by Dmean and radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For PTV and OARs, the log file-based estimate of patient dose for double-arc VMAT has accuracy comparable to that obtained for single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  8. Clinical impact of dosimetric changes for volumetric modulated arc therapy in log file-based patient dose calculations.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2017-10-01

    A log file-based method cannot detect dosimetric changes due to linac component miscalibration because log files are insensitive to miscalibration. Herein, clinical impacts of dosimetric changes on a log file-based method were determined. Five head-and-neck and five prostate plans were used. Miscalibration-simulated log files were generated by inducing a linac component miscalibration into the log file. Miscalibration magnitudes for leaf, gantry, and collimator at the general tolerance level were ±0.5mm, ±1°, and ±1°, respectively, and at a tighter tolerance level achievable on current linacs were ±0.3mm, ±0.5°, and ±0.5°, respectively. Re-calculations were performed on patient anatomy using log file data. Changes in tumor control probability/normal tissue complication probability from treatment planning system dose to re-calculated dose at the general tolerance level were 1.8% on the planning target volume (PTV) and 2.4% on organs at risk (OARs) in both plan types. These changes at the tighter tolerance level improved to 1.0% on PTV and 1.5% on OARs, a statistically significant difference. We determined the clinical impacts of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration and found that a tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  9. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
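
    A rough sketch of the file-to-object conversion step described above, with an in-memory dictionary standing in for the cloud object store; a real deployment would use an S3-style client, and the middleware in the patent is PLFS, which is not reproduced here. The key scheme and class names are assumptions.

    ```python
    import hashlib
    import os

    class InMemoryObjectStore:
        """Stand-in for a cloud object store; a real deployment would use an
        S3-style client rather than a dictionary."""
        def __init__(self):
            self._objects = {}

        def put(self, key, data):
            self._objects[key] = data

        def get(self, key):
            return self._objects[key]

    def archive_checkpoints(checkpoint_dir, store, job_id):
        """Turn each per-process checkpoint file into one object whose key
        encodes the job, the file name, and a short content hash."""
        keys = []
        for name in sorted(os.listdir(checkpoint_dir)):
            path = os.path.join(checkpoint_dir, name)
            if not os.path.isfile(path):
                continue
            with open(path, "rb") as handle:
                data = handle.read()
            key = f"{job_id}/{name}/{hashlib.sha1(data).hexdigest()[:8]}"
            store.put(key, data)
            keys.append(key)
        return keys
    ```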

  10. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
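
    The two metrics the authors single out, user sessions per hour and page penetration, can be computed from a generic transaction log once it is reduced to (timestamp, user, page) events. The sketch below is not the paper's method; the 30-minute session gap and the event layout are assumptions.

    ```python
    from collections import defaultdict
    from datetime import timedelta

    def usage_metrics(events, gap=timedelta(minutes=30)):
        """events: iterable of (timestamp, user_id, page) tuples, sorted by time.
        A new session starts whenever a user has been idle longer than `gap`;
        sessions are bucketed by the hour in which they began."""
        last_seen = {}
        sessions_per_hour = defaultdict(int)
        page_users = defaultdict(set)
        all_users = set()
        for ts, user, page in events:
            all_users.add(user)
            page_users[page].add(user)
            if user not in last_seen or ts - last_seen[user] > gap:
                hour = ts.replace(minute=0, second=0, microsecond=0)
                sessions_per_hour[hour] += 1
            last_seen[user] = ts
        # "penetration" of a page = share of users who ever requested it
        penetration = {p: len(u) / len(all_users) for p, u in page_users.items()}
        return dict(sessions_per_hour), penetration
    ```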

  11. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  12. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  13. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Mason, B; Kirsner, S

    2015-06-15

    Purpose: Ion chamber and film (ICAF) is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown as an alternative for measurement based QA. In this study, we delivered VMAT plans with and without errors to determine if ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture didn't move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria to evaluate the plans were: ion chamber less than 5%, and 90% of film pixels passing the 3mm/3% gamma analysis (GA). For the log file analysis, 90% of voxels must pass the 3mm/3% 3D GA and the beam parameters must match what was in the plan. Results: Two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and what was planned. The 8 plans that didn't meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and both methods were able to detect larger delivery errors.

  14. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights-no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.

  15. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    PubMed

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

    To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed the detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, the clinical accuracy of dynamic tracking was evaluated by calculating complementary cine EPID and log file tracking errors. A strong correlation was found between log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). We found sensitivity in the gimbals log files to detect a systematic tracking error up to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling of gimbals and gantry rotation during dynamic arc dynamic tracking. A submillimetric agreement between clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file with x-ray verification images and complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.

  16. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor were analyzed to build a logistic regression model that predicted success and failure of students'…
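
    A minimal sketch of the modelling step described above: fit a logistic regression to features derived from log files and check held-out accuracy. The features (hints requested, time spent, incorrect attempts) and the synthetic data are placeholders, not the Andes log schema or the study's actual model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Hypothetical features extracted from tutor log files: number of hints
    # requested, seconds spent, and number of incorrect attempts per problem.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    # Synthetic labels: 1 = solved, 0 = not solved (placeholder data only).
    y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))
    ```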

  17. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End to end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/−0.10mm: 57% within +/−0.01mm; 89% within 0.05mm. Mean leaf position deviation is 0.02mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.

  18. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
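
    The core comparison in such log-file QA, planned versus delivered leaf positions with a tolerance check, can be sketched as below. Parsing the vendor Dynalog format itself is not shown; the array shapes, the 1 mm tolerance, and the function name are assumptions for illustration.

    ```python
    import numpy as np

    def flag_mlc_discrepancies(planned, delivered, tolerance_mm=1.0):
        """planned, delivered: arrays of shape (control_points, leaves) holding
        MLC leaf positions in mm.  Returns the worst deviation and the indices
        of any (control point, leaf) pairs exceeding the tolerance."""
        planned = np.asarray(planned, dtype=float)
        delivered = np.asarray(delivered, dtype=float)
        deviation = np.abs(delivered - planned)
        offenders = np.argwhere(deviation > tolerance_mm)
        return deviation.max(), offenders

    # toy example: one leaf is 1.3 mm off at the second control point
    plan = np.zeros((3, 4))
    log = plan.copy()
    log[1, 2] = 1.3
    worst, bad = flag_mlc_discrepancies(plan, log)
    print(worst, bad.tolist())
    ```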

  19. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  20. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Ho, M; Chen, C

    Purpose: The use of log files to perform patient specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, until direct measurements can be made. This approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (RayStation). While the dual Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA on an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter and will be investigated further.

  2. An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets

    NASA Astrophysics Data System (ADS)

    Özkaya, Sait Ismail

    1996-02-01

    An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line-by-line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy disk format for storage and transfer of log data as text files. LAS was proposed by the Canadian Well Logging Society. The present EXCEL macro decodes different sections of a LAS file, separates, and places the fields into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
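
    The section-by-section decoding the macro performs can be sketched in Python as well. This simplified reader is an illustration under stated assumptions, not the published macro: it collects header lines per section and parses the ~ASCII data block, while wrapped data lines and vendor quirks are ignored.

    ```python
    def read_las(path):
        """Very small LAS 2.0 reader: header lines are grouped by section
        letter (V, W, C, ...) and the ~A data block is parsed into rows of
        floats."""
        sections, data, current = {}, [], None
        with open(path) as handle:
            for raw in handle:
                line = raw.strip()
                if not line or line.startswith("#"):
                    continue
                if line.startswith("~"):
                    current = line[1].upper() if len(line) > 1 else "?"
                    sections.setdefault(current, [])
                    continue
                if current == "A":
                    data.append([float(v) for v in line.split()])
                elif current is not None:
                    sections[current].append(line)
        return sections, data
    ```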

  3. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions as well as perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing-routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4-arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment.

  4. SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, H; Jacobson, T; Gu, X

    2015-06-15

    Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.
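
    The two checks used here, a point-dose percent error and a 3%/3 mm planar gamma, can be sketched as follows. This is a brute-force, globally normalised illustration only (clinical tools interpolate and restrict the search radius); the grid spacing argument and function names are assumptions.

    ```python
    import numpy as np

    def percent_error(measured, expected):
        """Point-dose check used alongside the planar comparison."""
        return 100.0 * (measured - expected) / expected

    def gamma_2d(reference, evaluated, spacing_mm, dose_pct=3.0, dta_mm=3.0):
        """Brute-force global 2D gamma: for every evaluated pixel, search the
        whole reference grid for the minimum combined dose/distance metric."""
        ref = np.asarray(reference, dtype=float)
        ev = np.asarray(evaluated, dtype=float)
        dd = dose_pct / 100.0 * ref.max()          # global dose criterion
        ys, xs = np.indices(ref.shape)
        gamma = np.empty_like(ev)
        for (i, j), dose in np.ndenumerate(ev):
            dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
            dose2 = (ref - dose) ** 2
            gamma[i, j] = np.sqrt((dist2 / dta_mm ** 2 + dose2 / dd ** 2).min())
        return gamma

    # passing rate: (gamma_2d(plan_dose, log_dose, spacing_mm=2.0) <= 1).mean()
    ```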

  5. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.

  6. 'Best' Practices for Aggregating Subset Results from Archived Datasets

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Perez, J.

    2013-12-01

    In response to the exponential growth in science data analysis and visualization capabilities, Data Centers have been developing new delivery mechanisms to package and deliver large volumes of aggregated subsets of archived data. New standards are evolving to help data providers and application programmers deal with the growing needs of the science community. These standards evolve from the best practices gleaned from new products and capabilities. The NASA Atmospheric Sciences Data Center (ASDC) has developed and deployed production provider-specific search and subset web applications for the CALIPSO, CERES, TES, and MOPITT missions. This presentation explores several use cases that leverage aggregated subset results and examines the standards and formats ASDC developers applied to the delivered files as well as the implementation strategies for subsetting and processing the aggregated products. The following topics will be addressed:
    - Applications of NetCDF CF conventions to aggregated level 2 satellite subsets
    - Data-Provider-Specific format requirements vs. generalized standards
    - Organization of the file structure of aggregated NetCDF subset output
    - Global Attributes of individual subsetted files vs. aggregated results
    - Specific applications and framework used for subsetting and delivering derivative data files

  7. Linking log files with dosimetric accuracy--A multi-institutional study on quality assurance of volumetric modulated arc therapy.

    PubMed

    Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar

    2015-12-01

    To systematically evaluate machine specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1SD) was 0.3±0.2 mm for all linacs. Large MLC maximum errors up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated in general accurate plan reproducibility with γ(mean)(±1 SD)=0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations that corresponded well with the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log-Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the erred leaf was measured on EPID images and log files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  9. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administrating a large scale, multi protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (increasing by 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on demand report generation. The main challenges for such a system are: to cope with CASTOR's log format diversity and its information scattered among several log files, the need for long term information archival, the strict reliability requirements and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. Having flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation and malfunction detection, and by managers for report generation.

  10. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
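
    The decision thresholds quoted above (percent of expected power below roughly 200% and rate of power increase below roughly 1.25) lend themselves to a simple screening function. The sketch below is purely illustrative: the units and exact definition of the rate are not given in the abstract, so a per-sample slope is assumed, and the function name and inputs are invented for the example.

    ```python
    def likely_tpa_success(power_watts, expected_power_watts,
                           pct_power_limit=200.0, rate_limit=1.25):
        """Toy screening rule following the thresholds quoted in the abstract:
        thrombolysis was usually successful when percent of expected power
        stayed below ~200% and the rate of power increase stayed below ~1.25."""
        pct_expected = [100.0 * p / expected_power_watts for p in power_watts]
        # crude rate estimate: mean increase per logged sample over the window
        rate = (power_watts[-1] - power_watts[0]) / max(len(power_watts) - 1, 1)
        return max(pct_expected) < pct_power_limit and rate < rate_limit

    print(likely_tpa_success([5.0, 5.4, 5.9, 6.3], expected_power_watts=5.0))
    ```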

  11. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, a requirement for more elaborate tracking techniques to monitor components’ integrity is paramount. ElektaLog files are generated every 40 milliseconds, which can be analyzed to track subtle changes and provide another aspect of quality assurance. This allows for constant monitoring of fraction consistency in addition to machine reliability. With this in mind, it was the aim of the study to evaluate if ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%–2mm criteria for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized to allow for reliable analysis of system accuracy and performance.

  12. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage for achieving highly available metadata service, as well as better performance improvement in metadata processing. As the client file system backs up certain sent metadata requests in its memory, the overhead for handling these backup requests is much smaller than that brought by the metadata server, while it adopts logging or journaling to yield highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and render a better I/O data throughput, in contrast to conventional metadata management schemes, that is, logging or journaling on MDS. Besides, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients, when the metadata server has crashed or gone into nonoperational state exceptionally.
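
    A toy sketch of the mechanism described above: the client keeps a backup of metadata requests the server has already applied, so the server itself keeps no journal, and after a crash the state is rebuilt by replaying the client-side backups. Class names, the request tuple format, and the two operations shown are assumptions for illustration.

    ```python
    class ClientRequestBackup:
        """Client-side cache of metadata requests that the server has already
        applied; kept so the server does not have to journal them itself."""
        def __init__(self):
            self._log = []

        def record(self, request):
            self._log.append(request)

        def replay_into(self, mds):
            for request in self._log:
                mds.apply(request)

    class MetadataServer:
        """Minimal in-memory MDS: path -> attribute dict, no journal of its own."""
        def __init__(self):
            self.table = {}

        def apply(self, request):
            op, path, attrs = request
            if op == "create":
                self.table[path] = attrs
            elif op == "remove":
                self.table.pop(path, None)

    # normal operation: client sends a request, then backs it up locally
    backup, mds = ClientRequestBackup(), MetadataServer()
    req = ("create", "/data/run01", {"mode": 0o644})
    mds.apply(req)
    backup.record(req)

    # after a simulated MDS crash, state is rebuilt from the client backups
    recovered = MetadataServer()
    backup.replay_into(recovered)
    print(recovered.table)
    ```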

  13. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage for achieving highly available metadata service, as well as better performance improvement in metadata processing. As the client file system backs up certain sent metadata requests in its memory, the overhead for handling these backup requests is much smaller than that brought by the metadata server, while it adopts logging or journaling to yield highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and render a better I/O data throughput, in contrast to conventional metadata management schemes, that is, logging or journaling on MDS. Besides, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients, when the metadata server has crashed or gone into nonoperational state exceptionally. PMID:24892093

  14. Geophysical log database for the Floridan aquifer system and southeastern Coastal Plain aquifer system in Florida and parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.

    2013-04-04

    A database of borehole geophysical logs and other types of data files were compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.

  15. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns and also access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites. They can help system administrators in improving system performance. Web logs provide invaluable help in creating adaptive web sites and also in network traffic analysis. This paper presents the design and implementation of a Web usage mining agent for digging into the web log files.

  16. Index map of cross sections through parts of the Appalachian basin (Kentucky, New York, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia): Chapter E.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.

  17. Monte Carlo based, patient-specific RapidArc QA using Linac log files.

    PubMed

    Teke, Tony; Bergman, Alanah M; Kwa, William; Gill, Bradford; Duzenli, Cheryl; Popescu, I Antoniu

    2010-01-01

    A Monte Carlo (MC) based QA process to validate the dynamic beam delivery accuracy for Varian RapidArc (Varian Medical Systems, Palo Alto, CA) using Linac delivery log files (DynaLog) is presented. Using DynaLog file analysis and MC simulations, the goal of this article is to (a) confirm that adequate sampling is used in the RapidArc optimization algorithm (177 static gantry angles) and (b) to assess the physical machine performance [gantry angle and monitor unit (MU) delivery accuracy]. Ten clinically acceptable RapidArc treatment plans were generated for various tumor sites and delivered to a water-equivalent cylindrical phantom on the treatment unit. Three Monte Carlo simulations were performed to calculate dose to the CT phantom image set: (a) One using a series of static gantry angles defined by 177 control points with treatment planning system (TPS) MLC control files (planning files), (b) one using continuous gantry rotation with TPS generated MLC control files, and (c) one using continuous gantry rotation with actual Linac delivery log files. Monte Carlo simulated dose distributions are compared to both ionization chamber point measurements and with RapidArc TPS calculated doses. The 3D dose distributions were compared using a 3D gamma-factor analysis, employing a 3%/3 mm distance-to-agreement criterion. The dose difference between MC simulations, TPS, and ionization chamber point measurements was less than 2.1%. For all plans, the MC calculated 3D dose distributions agreed well with the TPS calculated doses (gamma-factor values were less than 1 for more than 95% of the points considered). Machine performance QA was supplemented with an extensive DynaLog file analysis. The DynaLog file analysis showed that leaf position errors were less than 1 mm for 94% of the time and there were no leaf errors greater than 2.5 mm. The mean standard deviations in MU and gantry angle were 0.052 MU and 0.355 degrees, respectively, for the ten cases analyzed. The accuracy and flexibility of the Monte Carlo based RapidArc QA system were demonstrated. Good machine performance and accurate dose distribution delivery of RapidArc plans were observed. The sampling used in the TPS optimization algorithm was found to be adequate.

  18. A clinically observed discrepancy between image-based and log-based MLC positions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal

    2016-06-15

    Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf to be consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured and log-files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log-file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual position by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.

  19. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

    The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files simulating leaf miscalibration were generated for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5 mm in opposite directions; systematic leaf shifts: ±1.0 mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The modified and unmodified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5 mm, the residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between unmodified and modified log file doses per leaf gap, were 1.32±0.27% and 0.82±0.17 Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14 Gy, and 0.45±0.08 Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determined the residual dose estimation errors for VMAT delivery using log file-based patient dose calculation as a function of MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
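    As a rough illustration of how such leaf-miscalibration scenarios can be generated, the sketch below applies either a systematic gap error (banks moved in opposite directions) or a systematic shift (banks moved together) to arrays of leaf positions; the array layout and function name are hypothetical and do not reproduce the actual log-file format used in the study.

```python
import numpy as np

def apply_leaf_error(bank_a, bank_b, magnitude_mm, mode="gap"):
    """Return modified copies of two opposing MLC leaf banks (positions in mm).

    mode="gap"  : banks move in opposite directions (systematic leaf gap error)
    mode="shift": banks move in the same direction (systematic leaf shift)
    """
    a = np.asarray(bank_a, float).copy()
    b = np.asarray(bank_b, float).copy()
    if mode == "gap":
        a += magnitude_mm          # opposing moves open (or close) every leaf gap
        b -= magnitude_mm
    elif mode == "shift":
        a += magnitude_mm          # both banks translate together
        b += magnitude_mm
    else:
        raise ValueError("mode must be 'gap' or 'shift'")
    return a, b

# Example: simulate a +0.5 mm systematic gap error on dummy leaf positions.
mod_a, mod_b = apply_leaf_error([10.0, 12.0, 15.0], [-10.0, -12.0, -15.0], 0.5, mode="gap")
```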

  20. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
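    A minimal sketch of the indexing idea described above, with hypothetical field and function names (the real PLFS index format is not reproduced here): each written data portion is appended to a packed log, and an index entry records its logical offset, physical offset, and length so that the holes can be reintroduced when the sparse file is read.

```python
from dataclasses import dataclass

@dataclass
class IndexEntry:
    logical_offset: int   # where the data lives in the (sparse) logical file
    physical_offset: int  # where the data was appended in the packed log
    length: int

def pack(portions):
    """portions: list of (logical_offset, bytes). Returns (packed_log, index)."""
    log, index = bytearray(), []
    for logical_offset, data in portions:
        index.append(IndexEntry(logical_offset, len(log), len(data)))
        log.extend(data)            # data is stored contiguously, without holes
    return bytes(log), index

def restore(log, index, logical_size):
    """Rebuild the sparse file; unwritten regions come back as zero bytes."""
    out = bytearray(logical_size)   # holes are zero-filled
    for e in index:
        out[e.logical_offset:e.logical_offset + e.length] = \
            log[e.physical_offset:e.physical_offset + e.length]
    return bytes(out)

packed, idx = pack([(0, b"head"), (1_000_000, b"tail")])
sparse = restore(packed, idx, 1_000_004)
```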

  1. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  2. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  3. Constructing compact and effective graphs for recommender systems via node and edge aggregations

    DOE PAGES

    Lee, Sangkeun; Kahng, Minsuk; Lee, Sang-goo

    2014-12-10

    Exploiting graphs for recommender systems has great potential to flexibly incorporate heterogeneous information for producing better recommendation results. As our baseline approach, we first introduce a naive graph-based recommendation method, which operates on a heterogeneous log-metadata graph constructed from user log and content metadata databases. Although the naive graph-based recommendation method is simple, it allows us to take advantage of heterogeneous information and shows promising flexibility and recommendation accuracy. However, it often leads to extensive processing time due to the sheer size of the graphs constructed from entire user log and content metadata databases. In this paper, we propose node and edge aggregation approaches to constructing compact and effective graphs, called Factor-Item bipartite graphs, by aggregating nodes and edges of a log-metadata graph. Furthermore, experimental results using real-world datasets indicate that our approach can significantly reduce the size of the graphs exploited for recommender systems without sacrificing recommendation quality.

  4. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N

    2016-06-15

    Purpose: To analyze the performance of the multileaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, to construct the relative fluence maps, and to perform gamma analysis comparing the planned and executed MLC movement. Methods: We developed a program to extract and analyze the data from dynamic log files (DynaLog files) generated during sliding-window IMRT deliveries. The program extracts the planned and executed (actual, or delivered) MLC movement and calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm using the EPID. Results: For the 3 IMRT patient treatments, the maximum difference between the planned and executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3 mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives results similar to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before treatment and for every fraction, and significantly reduces the IMRT QA workload. The authors would like to thank King Fahd University of Petroleum and Minerals for its support.
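    In outline, the fluence-map reconstruction described above can be approximated by accumulating the MU delivered while each leaf pair is open over a discretised grid. The sketch below assumes a simplified, hypothetical per-control-point representation (per-leaf-pair open intervals plus a fractional MU weight) rather than the actual DynaLog record layout.

```python
import numpy as np

def fluence_map(control_points, n_pairs, width_mm, bin_mm=1.0):
    """Accumulate a relative fluence map from a list of control points.

    Each control point is (left_edges, right_edges, mu_fraction), where the
    edge arrays give, per leaf pair, the open interval (in mm) across the field.
    """
    n_bins = int(width_mm / bin_mm)
    x = (np.arange(n_bins) + 0.5) * bin_mm - width_mm / 2.0   # bin centres
    fmap = np.zeros((n_pairs, n_bins))
    for left, right, mu in control_points:
        for row in range(n_pairs):
            open_bins = (x > left[row]) & (x < right[row])
            fmap[row, open_bins] += mu        # MU accumulates wherever the pair is open
    return fmap / fmap.max() if fmap.max() > 0 else fmap      # normalise to relative fluence

# Usage: build one map from planned control points and one from delivered
# (log-file) control points, then compare the two maps with a gamma analysis.
```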

  5. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
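    As an illustration of why the LAS format is convenient to work with digitally, the short sketch below pulls curve names from the ~Curve section and numeric rows from the ~ASCII section of a simple LAS 2.0 file; it ignores wrapped lines and other details that dedicated LAS readers handle, and the file name in the comment is only a placeholder.

```python
def read_las_curves(path):
    """Return (curve_names, rows) from a simple, unwrapped LAS 2.0 file."""
    curve_names, rows, section = [], [], None
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("~"):                 # section header, e.g. ~Curve, ~ASCII
                section = line[1].upper()
                continue
            if section == "C":                       # ~Curve block: "GR  .GAPI : gamma ray"
                curve_names.append(line.split(".")[0].strip())
            elif section == "A":                     # ~ASCII block: whitespace-separated values
                rows.append([float(v) for v in line.split()])
    return curve_names, rows

# names, data = read_las_curves("well_log.las")   # hypothetical file name
```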

  6. SU-E-T-325: The New Evaluation Method of the VMAT Plan Delivery Using Varian DynaLog Files and Modulation Complexity Score (MCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tateoka, K; Graduate School of Medicine, Sapporo Medical University, Sapporo, JP; Fujimomo, K

    2014-06-01

    Purpose: The aim of this study is to evaluate the use of Varian DynaLog files to verify VMAT plan delivery and to relate delivery accuracy to the modulation complexity score (MCS). Methods: Delivery accuracy was quantified by multileaf collimator (MLC) position errors, gantry angle errors, and fluence delivery accuracy for volumetric modulated arc therapy (VMAT). The relationship between machine performance and plan complexity was also investigated using the MCS. Planned and actual MLC positions, gantry angles, and delivered fractions of monitor units were extracted from Varian DynaLog files, taken from the record-and-verify system's MLC control files. Planned and delivered beam data were compared to determine leaf position and gantry angle errors. Analysis was also performed on planned and actual fluence maps reconstructed from the DynaLog files. This analysis was performed for all treatment fractions of 5 prostate VMAT plans; the DynaLog file analysis was carried out with in-house software written in Visual C++. Results: The root mean square of the leaf position and gantry angle errors was about 0.12 and 0.15, respectively. The gamma pass rate of the planned and actual fluence maps at the 3%/3 mm criterion was about 99.21%. Leaf position errors were not directly related to plan complexity as determined by the MCS, whereas gantry angle errors were. Conclusion: This study shows that Varian DynaLog files can be used to diagnose VMAT delivery errors that cannot be detected with phantom-based quality assurance. Furthermore, the MCS can be used to evaluate delivery accuracy for patients receiving VMAT. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
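    The root-mean-square error statistics reported above follow directly from the planned and actual traces once they have been extracted from the DynaLog files; a minimal sketch, assuming the planned and actual values are already available as arrays:

```python
import numpy as np

def rms_error(planned, actual):
    """Root mean square of the difference between planned and actual traces."""
    d = np.asarray(actual, dtype=float) - np.asarray(planned, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

# e.g. rms_error(planned_leaf_positions, actual_leaf_positions)
#      rms_error(planned_gantry_angles, actual_gantry_angles)
```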

  7. 20 CFR 401.85 - Exempt systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... subsection (k)(2) of the Privacy Act: (A) The General Criminal Investigation Files, SSA; (B) The Criminal Investigations File, SSA; and, (C) The Program Integrity Case Files, SSA. (D) Civil and Administrative Investigative Files of the Inspector General, SSA/OIG. (E) Complaint Files and Log. SSA/OGC. (iii) Pursuant to...

  8. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of using linear accelerator trajectory log files for Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software package (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step-and-shoot (SS), sliding window (SW), and RapidArc (RA), was used as the treatment planning system (TPS). Clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive; a 0.2 mm systematic error produced a 0.7% dose deviation on average. Random MLC errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, and other delivery information, should be more effective for independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  9. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2010-10-01 2010-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  10. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2011-10-01 2011-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  11. Replication in the Harp File System

    DTIC Science & Technology

    1981-07-01

    Shrira, Michael Williams. July 1991. © Massachusetts Institute of Technology. (To appear in the Proceedings of the Thirteenth ACM Symposium on Operating...S., Spector, A. Z., and Thompson, D. S. Distributed Logging for Transaction Processing. ACM Special Interest Group on Management of Data 1987 Annual ...System. USENIX Conference Proceedings, June 1990, pp. 63-71. 15. Hagmann, R. Reimplementing the Cedar File System Using Logging and Group Commit

  12. Estimating the carbon in coarse woody debris with perpendicular distance sampling. Chapter 6

    Treesearch

    Harry T. Valentine; Jeffrey H. Gove; Mark J. Ducey; Timothy G. Gregoire; Michael S. Williams

    2008-01-01

    Perpendicular distance sampling (PDS) is a design for sampling the population of pieces of coarse woody debris (logs) in a forested tract. In application, logs are selected at sample points with probability proportional to volume. Consequently, aggregate log volume per unit land area can be estimated from tallies of logs at sample points. In this chapter we provide...

  13. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  14. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  15. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.

  16. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  17. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) the transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) user access to blog articles does not show temporal locality, but is strongly biased towards articles posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
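    A hedged sketch of the kind of distribution fitting used in such a characterization, with synthetic data standing in for the real traces: a log-normal fit of article transfer sizes using scipy. The truncated-Pareto fit for non-multimedia files would require a custom likelihood and is not shown.

```python
import numpy as np
from scipy import stats

# Synthetic transfer sizes (bytes) standing in for the blog-article trace.
rng = np.random.default_rng(0)
transfer_sizes = rng.lognormal(mean=9.0, sigma=1.2, size=10_000)

# Fit a log-normal with the location fixed at zero (sizes are strictly positive).
shape, loc, scale = stats.lognorm.fit(transfer_sizes, floc=0)
print(f"sigma = {shape:.3f}, median transfer size = {scale:.1f} bytes")
```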

  18. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as the reference, and all records for subsequent days were compared against the reference. In-house MATLAB software was used for the comparisons. Each MLC log file was converted to a fluence map (FM), and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with a signal of 10% or less of the maximum value were excluded from the comparisons. Results: The γ pass rate between each reference FM and the subsequent daily fraction FMs averaged 99.1% (range, 98.2% to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and, at the same time, reduce the time for processing the images. We found that the comparison of images at the highest resolution (768×1024) yielded, on average, a lower γ (99.1%) than the low-resolution images (192×256; γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as the completion of each daily treatment. Such a tool can be valuable for assessing the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
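    For reference, a brute-force 2%/2 mm gamma comparison between two fluence maps, excluding points below 10% of the maximum as in the analysis above, can be sketched as follows; production implementations use much faster search strategies, and this is not the in-house MATLAB code described in the abstract.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm=1.0, dd=0.02, dta_mm=2.0, cutoff=0.10):
    """Fraction of reference points with gamma <= 1 (global dose-difference normalisation)."""
    ref, ev = np.asarray(ref, float), np.asarray(ev, float)
    norm = ref.max()
    ys, xs = np.indices(ref.shape)
    search = int(np.ceil(2 * dta_mm / spacing_mm))        # pixels searched around each point
    passed = total = 0
    for i, j in zip(*np.where(ref >= cutoff * norm)):     # skip the low-signal region
        total += 1
        i0, i1 = max(0, i - search), min(ref.shape[0], i + search + 1)
        j0, j1 = max(0, j - search), min(ref.shape[1], j + search + 1)
        dist2 = ((ys[i0:i1, j0:j1] - i) ** 2 + (xs[i0:i1, j0:j1] - j) ** 2) * spacing_mm ** 2
        dose2 = ((ev[i0:i1, j0:j1] - ref[i, j]) / (dd * norm)) ** 2
        if np.min(dist2 / dta_mm ** 2 + dose2) <= 1.0:    # squared gamma <= 1
            passed += 1
    return passed / total if total else 1.0
```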

  19. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.

  20. Geologic cross section E-E' through the Appalachian basin from the Findlay arch, Wood County, Ohio, to the Valley and Ridge province, Pendleton County, West Virginia: Chapter E.4.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.

  1. INSPIRE and SPIRES Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns are identified and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.

  2. 78 FR 40474 - Sustaining Power Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  3. 78 FR 34371 - Longfellow Wind, LLC: Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  4. The new idea of transporting tailings-logs in tailings slurry pipeline and the innovation of technology of mining waste-fill method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin Yu; Wang Fuji; Tao Yan

    2000-07-01

    This paper introduces a new idea for transporting mine tailings-logs in a mine tailings-slurry pipeline and a new technology for cemented mine filling using tailings-logs with tailings-slurry. The hydraulic principles, the compaction of tailings-logs, and the mechanical function of the fill body of tailings-logs cemented by tailings-slurry are discussed.

  5. Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.

    PubMed

    Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene

    2012-04-21

    Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion could result in unintuitive dose changes. We have developed a treatment reconstruction simulation computer code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans, a concave target and an integrated boost, were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%/2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.

  6. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from (1) digitized suspended-sediment-concentration traces, (2) linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and (3) nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
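    The transport-relation option mentioned above amounts to a linear regression in log space. A compact sketch with synthetic data (here regressing log concentration on log streamflow; the regression direction and bias-correction details vary and are not taken from SEDCALC itself):

```python
import numpy as np

# Synthetic instantaneous streamflow (cfs) and suspended-sediment concentration (mg/L).
rng = np.random.default_rng(1)
q = rng.uniform(50, 5000, 200)
c = 0.02 * q ** 1.3 * rng.lognormal(0, 0.2, 200)

# Fit log10(C) = a + b * log10(Q), then apply the rating to the observed flows.
b, a = np.polyfit(np.log10(q), np.log10(c), 1)      # polyfit returns slope first for degree 1
predicted_c = 10 ** (a + b * np.log10(q))           # back-transform carries a known retransformation bias
```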

  7. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  8. 78 FR 54888 - Guzman Power Markets, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... the eFiling link to log on and submit the intervention or protests. Persons unable to file... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for...

  9. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  10. 78 FR 28835 - Salton Sea Power Generation Company; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  11. 77 FR 55817 - Delek Crude Logistics, LLC; Notice of Petition for Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests... number. eFiling is encouraged. More detailed information relating to filing requirements, interventions...'') grant a temporary waiver of the filing and reporting requirements of sections 6 and 201 of the...

  12. Geologic cross section D-D' through the Appalachian basin from the Findlay arch, Sandusky County, Ohio, to the Valley and Ridge province, Hardy County, West Virginia: Chapter E.4.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.

  13. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
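    A minimal sketch of the per-chunk checksum idea, using SHA-256 in place of whatever checksum the described system actually employs and a plain dict standing in for the storage node: the writer stores a digest alongside each chunk, and the reader recomputes and compares it.

```python
import hashlib

def write_chunk(store, key, chunk: bytes):
    """Store a chunk together with its checksum (store is any dict-like object)."""
    store[key] = (chunk, hashlib.sha256(chunk).hexdigest())

def read_chunk(store, key) -> bytes:
    """Return the chunk after verifying its recorded checksum."""
    chunk, recorded = store[key]
    if hashlib.sha256(chunk).hexdigest() != recorded:
        raise IOError(f"checksum mismatch for chunk {key!r}")
    return chunk

store = {}
write_chunk(store, "obj/part.0", b"some shard of the shared object")
data = read_chunk(store, "obj/part.0")   # raises if the stored bytes were corrupted
```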

  14. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  15. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  16. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. To instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and is thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file; this is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe; a typical use is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
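    The sketch below conveys the flavour of timestamped, ULM-style key=value event logging that the toolkit describes; the function name and field names are illustrative only and are not the real NetLogger API.

```python
import io
import sys
import time

def log_event(stream, event, **fields):
    """Append one timestamped, ULM-style 'KEY=value' event record to a stream."""
    record = [f"DATE={time.strftime('%Y%m%d%H%M%S', time.gmtime())}", f"EVNT={event}"]
    record += [f"{key.upper()}={value}" for key, value in fields.items()]
    stream.write(" ".join(record) + "\n")

# Example: log the start and end of a hypothetical transfer to an in-memory stream.
buf = io.StringIO()
log_event(buf, "transfer.start", host="nodeA", nbytes=1048576)
log_event(buf, "transfer.end", host="nodeA", status=0)
sys.stdout.write(buf.getvalue())
```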

  17. Analysis of the request patterns to the NSSDC on-line archive

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1994-01-01

    NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth of the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of the service, log files recording all accesses to the archive have been maintained. In this report, we present an analysis of the NDADS log files. We analyze the log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
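    A toy version of the per-file request tally such an analysis typically starts from; the whitespace-separated log layout and file name are hypothetical, not the actual NDADS log format.

```python
from collections import Counter

def top_requested_files(log_path, n=10):
    """Count requests per file in a log whose last whitespace-separated column names the file."""
    counts = Counter()
    with open(log_path) as f:
        for line in f:
            fields = line.split()
            if fields:
                counts[fields[-1]] += 1
    return counts.most_common(n)

# for name, hits in top_requested_files("ndads_requests.log"):   # placeholder path
#     print(name, hits)
```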

  18. Aggregation Trade Offs in Family Based Recommendations

    NASA Astrophysics Data System (ADS)

    Berkovsky, Shlomo; Freyne, Jill; Coombe, Mac

    Personalized information access tools are frequently based on collaborative filtering recommendation algorithms. Collaborative filtering recommender systems typically suffer from a data sparsity problem, where systems do not have sufficient user data to generate accurate and reliable predictions. Prior research suggested using group-based user data in the collaborative filtering recommendation process to generate group-based predictions and partially resolve the sparsity problem. Although group recommendations are less accurate than personalized recommendations, they are more accurate than general non-personalized recommendations, which are the natural fall back when personalized recommendations cannot be generated. In this work we present initial results of a study that exploits the browsing logs of real families of users gathered in an eHealth portal. The browsing logs allowed us to experimentally compare the accuracy of two group-based recommendation strategies: aggregated group models and aggregated predictions. Our results showed that aggregating individual models into group models resulted in more accurate predictions than aggregating individual predictions into group predictions.
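    The two strategies compared in the study can be sketched, in highly simplified form, over a small rating matrix: either average the members' rating vectors into one group profile and predict from it (aggregated group model), or predict for each member and average the predictions (aggregated predictions). The toy cosine-similarity predictor and the data are illustrative only.

```python
import numpy as np

def predict(profile, item_vectors):
    """Toy predictor: cosine similarity between a profile and each item vector."""
    norms = np.linalg.norm(item_vectors, axis=1) * np.linalg.norm(profile) + 1e-12
    return item_vectors @ profile / norms

family = np.array([[5, 0, 3, 1],      # per-member rating vectors (rows)
                   [4, 1, 0, 2],
                   [1, 5, 4, 0]], float)
items = np.eye(4)                      # stand-in item feature vectors

group_model_scores = predict(family.mean(axis=0), items)                        # aggregate the models
aggregated_predictions = np.mean([predict(u, items) for u in family], axis=0)   # aggregate the predictions
```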

  19. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
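    The block-size rule quoted above (total data divided by the number of processes) is straightforward to sketch; the data exchange itself would use MPI or a burst buffer in practice and is only indicated by the returned layout.

```python
def block_layout(total_bytes: int, n_procs: int):
    """Return (block_size, per-rank (offset, length)), with any remainder on the last rank."""
    block = total_bytes // n_procs                      # dynamically determined block size
    layout = []
    for rank in range(n_procs):
        length = block if rank < n_procs - 1 else total_bytes - block * (n_procs - 1)
        layout.append((rank * block, length))
    return block, layout

block, layout = block_layout(total_bytes=10_000_000, n_procs=8)
```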

  20. 78 FR 70299 - Capacity Markets Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  1. 78 FR 59923 - Buffalo Dunes Wind Project, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  2. 78 FR 28833 - Lighthouse Energy Group, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  3. 78 FR 29366 - Wheelabrator Baltimore, LP; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  4. 77 FR 64978 - Sunbury Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  5. 78 FR 62300 - Burgess Biopower LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  6. 78 FR 75561 - South Bay Energy Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  7. 78 FR 28833 - Ebensburg Power Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  8. 78 FR 72673 - Yellow Jacket Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  9. 78 FR 44557 - Guttman Energy Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 78 FR 68052 - Covanta Haverhill Association, LP; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  11. 78 FR 49506 - Source Power & Gas LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 77 FR 64980 - Noble Americas Energy Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE...://www.ferc.gov . To facilitate electronic service, persons with Internet access who will eFile a... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests...

  13. 78 FR 46939 - DWP Energy Holdings, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 78 FR 28833 - CE Leathers Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 78 FR 59014 - Lakeswind Power Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  16. 78 FR 75560 - Green Current Solutions, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 77 FR 64980 - Collegiate Clean Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 77 FR 64977 - Frontier Utilities New York LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  19. 78 FR 62299 - West Deptford Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  20. 78 FR 52913 - Allegany Generating Station LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  1. SedMob: A mobile application for creating sedimentary logs in the field

    NASA Astrophysics Data System (ADS)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source, mobile software package for creating sedimentary logs, targeted for use in tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in the log as well as export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog: a free multiplatform package for drawing graphic logs that runs on PC computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.

  2. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  3. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    Help > About)  Environment details ( operating system )  metronome.log file, located in your MAT 7.1.0 installation folder  Any log file that...requirements to run the Model Analyst’s Toolkit:  Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed...application icon on your desktop  Create a Quick Launch icon – Creates a MAT application icon on the taskbar for operating systems released

  4. Users' information-seeking behavior on a medical library Website

    PubMed Central

    Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco

    2002-01-01

    The Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources in 1997. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in the visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contained information on all user accesses to the Website and provided a great opportunity to learn more about the behavior of visitors to the Website. The majority of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to the reports mentioned above, the authors wanted to be able to perform interactive exploration and ad hoc analysis and discover trends in a user-friendly way. Because of that, we developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179
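
    As an illustration of the kind of aggregation such tools perform, the sketch below parses Common Log Format lines and rolls up hit counts and transferred bytes by day and requested page. The log layout and file name are generic assumptions, not the CMK server's actual configuration.

      # Sketch: aggregate a Common Log Format access log by date and requested path.
      # Assumes the standard CLF layout: host ident user [date] "request" status bytes
      import re
      from collections import Counter, defaultdict

      CLF = re.compile(r'\S+ \S+ \S+ \[(\d{2}/\w{3}/\d{4}):[^\]]+\] "(?:\S+) (\S+) [^"]*" \d{3} (\d+|-)')

      def aggregate(path="access.log"):
          hits, bytes_by_day = Counter(), defaultdict(int)
          with open(path) as f:
              for line in f:
                  m = CLF.match(line)
                  if not m:
                      continue
                  day, page, size = m.groups()
                  hits[(day, page)] += 1
                  bytes_by_day[day] += 0 if size == "-" else int(size)
          return hits, bytes_by_day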

  5. 18 CFR 270.304 - Tight formation gas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...

  6. SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Clasie, B; Lu, H

    Purpose: In pencil beam scanning (PBS), the delivered spot MU, position and size are slightly different at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree gantry angle spread were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and at the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MU in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to an excellent agreement in dose calculations, with an average γ pass rate for all fields of approximately 99.7%. When each calculation is compared to the measurement, a high correlation in γ was also found. Conclusion: Using machine log files, we verified that deviations in PBS beam delivery at different gantry angles from the planned spot positions and MU are sufficiently small. This study brings us one step closer to simplifying our patient-specific QA.
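
    A hedged sketch of the spot-by-spot comparison described above, assuming the per-spot positions have already been extracted from the log files into arrays; the array shapes, units, and toy values are assumptions for illustration only.

      # Sketch: mean and standard deviation of per-spot position differences
      # between two deliveries (e.g., zero vs. prescribed gantry angle).
      import numpy as np

      def spot_stats(pos_ref, pos_test):
          """pos_ref, pos_test: (N, 2) arrays of spot x/y positions in mm."""
          d = pos_test - pos_ref
          return d.mean(axis=0), d.std(axis=0)   # per-axis mean and SD

      pos_ref = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0]])
      pos_test = pos_ref + np.random.normal(0.0, 0.6, pos_ref.shape)
      print(spot_stats(pos_ref, pos_test))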

  7. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes, depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry, so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
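
    The grouping step described above can be sketched as follows. The per-pulse field names (kvp, angle_deg, mas) and the bin widths are assumptions used only to illustrate how exposure pulses with similar parameters might be binned before a single Monte Carlo run stands in for the whole group.

      # Sketch: bin exposure-pulse records with similar parameters.
      from collections import defaultdict

      def group_pulses(pulses, kvp_bin=5.0, angle_bin=5.0):
          groups = defaultdict(list)
          for p in pulses:   # each p: one x-ray pulse from the log (assumed layout)
              key = (round(p["kvp"] / kvp_bin), round(p["angle_deg"] / angle_bin))
              groups[key].append(p)
          # one representative per group, carrying the summed mAs of its members
          return [{**members[0], "mas": sum(m["mas"] for m in members)}
                  for members in groups.values()]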

  8. 78 FR 28834 - Salton Sea Power L.L.C.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  9. 78 FR 28835 - Del Ranch Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 78 FR 28835 - Patua Project LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  11. 78 FR 75561 - Great Bay Energy V, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 77 FR 64981 - Homer City Generation, L.P.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 77 FR 69819 - Cirrus Wind 1, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 77 FR 64979 - Great Bay Energy IV, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 77 FR 53195 - H.A. Wagner LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  16. 78 FR 59923 - Mammoth Three LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 78 FR 61945 - Tuscola Wind II, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-07

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 77 FR 69819 - QC Power Strategies Fund LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  19. 78 FR 75561 - Astral Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  20. A Scalable Monitoring for the CMS Filter Farm Based on Elasticsearch

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J.M.; et al.

    2015-12-23

    A flexible monitoring system has been designed for the CMS File-based Filter Farm making use of modern data mining and analytics components. All the metadata and monitoring information concerning data flow and execution of the HLT are generated locally in the form of small documents using the JSON encoding. These documents are indexed into a hierarchy of elasticsearch (es) clusters along with process and system log information. Elasticsearch is a search server based on Apache Lucene. It provides a distributed, multitenant-capable search and aggregation engine. Since es is schema-free, any new information can be added seamlessly and the unstructured information can be queried in non-predetermined ways. The leaf es clusters consist of the very same nodes that form the Filter Farm, thus providing natural horizontal scaling. A separate "central" es cluster is used to collect and index aggregated information. The fine-grained information, all the way to individual processes, remains available in the leaf clusters. The central es cluster provides quasi-real-time high-level monitoring information to any kind of client. Historical data can be retrieved to analyse past problems or correlate them with external information. We discuss the design and performance of this system in the context of the CMS DAQ commissioning for LHC Run 2.
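
    As a minimal illustration of the indexing step, the snippet below posts one small JSON monitoring document to an elasticsearch index over its REST API. The host, index name, and document fields are placeholders, not the actual CMS schema.

      # Sketch: index a JSON monitoring document into elasticsearch (hypothetical fields).
      import json
      import urllib.request

      doc = {"host": "fu-example-01", "stream": "PhysicsA", "events": 1234, "rate_hz": 87.5}
      req = urllib.request.Request(
          "http://localhost:9200/filterfarm-monitoring/_doc",
          data=json.dumps(doc).encode(),
          headers={"Content-Type": "application/json"},
          method="POST",
      )
      print(urllib.request.urlopen(req).read().decode())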

  1. 26 CFR 1.614-5 - Special rules as to aggregating nonoperating mineral interests.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... section. (b) Manner and scope of election—(1) Time for filing application for permission to aggregate separate nonoperating mineral interests under paragraph (a) of this section. The application for permission... and returns under permission. The application for permission to aggregate nonoperating mineral...

  2. Census of Population and Housing, 1980: Summary Tape File 1F, School Districts. Technical Documentation.

    ERIC Educational Resources Information Center

    Bureau of the Census (DOC), Washington, DC. Data User Services Div.

    This report provides technical documentation associated with a 1980 Census of Population and Housing Summary Tape File 1F--the School Districts File. The file contains complete-count data of population and housing aggregated by school district. Population items tabulated include age, race (provisional data), sex, marital status, Spanish origin…

  3. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produces a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  4. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  5. 78 FR 28834 - Elmore Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  6. 78 FR 49507 - OriGen Energy LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... securities and assumptions of liability. Any person desiring to intervene or to protest should file with the... with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log...

  7. 78 FR 49507 - ORNI 47 LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  8. 77 FR 64981 - BITHENERGY, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  9. 78 FR 40473 - eBay Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  10. 78 FR 28832 - CalEnergy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  11. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and at www.fdsys.gov. ...

  12. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and on GPO Access. ...

  13. Online data handling and storage at the CMS experiment

    NASA Astrophysics Data System (ADS)

    Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.

    2015-12-01

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources produced with an aggregate rate of ∼2GB/s. An estimated bandwidth of 7GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
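
    To illustrate the merging idea, the sketch below combines small per-source JSON bookkeeping documents by summing their numeric counters. The file-name pattern and field contents are assumptions, not the actual CMS metadata format.

      # Sketch: merge per-source JSON bookkeeping documents by summing numeric fields.
      import glob
      import json
      from collections import Counter

      def merge_metadata(pattern="run_ls0001_*.jsn"):
          totals = Counter()
          for path in glob.glob(pattern):
              with open(path) as f:
                  for key, value in json.load(f).items():
                      if isinstance(value, (int, float)):
                          totals[key] += value
          return dict(totals)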

  14. Online Data Handling and Storage at the CMS Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andre, J. M.; et al.

    2015-12-23

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources produced with an aggregate rate of ~2GB/s. An estimated bandwidth of 7GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.

  15. Integrated Autonomous Network Management (IANM) Multi-Topology Route Manager and Analyzer

    DTIC Science & Technology

    2008-02-01

    ... zebra, tmg, mtrcli, xinetd (tftp), mysql, configuration file (mtrrm.conf), configuration file (mtrrmAggregator.properties), tftp files (/tftpboot), NetFlow PDUs ... configuration upload/download (snmp, telnet), OSPFv2, user interface (tmg). Figure 6-2, Internal software organization: Figure 6-2 illustrates the main

  16. Modifications to the accuracy assessment analysis routine MLTCRP to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    Modifications are described that were made to the analysis program MLTCRP in the accuracy assessment software system to produce a disk output file. The output files produced by this modified program are used to aggregate data for regions greater than a single segment.

  17. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
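
    A rough Python counterpart to the filtering this module provides, assuming the usual semicolon-separated PBS accounting layout ("timestamp;record_type;id;key=value ..."); treat that layout as an assumption rather than a guaranteed format.

      # Sketch: filter PBS accounting-log records by record type and date prefix.
      def parse_pbs_log(path, record_type=None, date_prefix=None):
          records = []
          with open(path) as f:
              for line in f:
                  try:
                      stamp, rtype, rid, attrs = line.rstrip("\n").split(";", 3)
                  except ValueError:
                      continue   # skip malformed lines
                  if record_type and rtype != record_type:
                      continue
                  if date_prefix and not stamp.startswith(date_prefix):
                      continue
                  fields = dict(kv.split("=", 1) for kv in attrs.split() if "=" in kv)
                  records.append({"time": stamp, "type": rtype, "id": rid, **fields})
          return records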

  18. Expansion of the roadway reference log : KYSPR-99-201.

    DOT National Transportation Integrated Search

    2000-05-01

    The objectives of this study were to: 1) expand the current route log to include milepoints for all intersections on state maintained roads and 2) recommend a procedure for establishing milepoints and maintaining the file with up-to-date information....

  19. 78 FR 52524 - Sunoco Pipeline LP; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... described in their petition. Any person desiring to intervene or to protest in this proceedings must file in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 78 FR 62349 - Sunoco Pipeline L.P.; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-18

    ... to log on and submit the intervention or protests. Persons unable to file electronically should... petition. Any person desiring to intervene or to protest in this proceeding must file in accordance with..., persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor...

  1. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  2. 78 FR 77155 - Grant Program To Assess, Evaluate, and Promote Development of Tribal Energy and Mineral Resources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...

  3. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes the data and exports it to Elasticsearch. ES is responsible for centralized data storage. Data accumulated in ES can be viewed using Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks, the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
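
    A minimal sketch of the final indexing step, bulk-loading already-parsed log records through the Elasticsearch _bulk endpoint; the index name, host, and record fields are placeholders and not part of the PanDA configuration.

      # Sketch: bulk-index parsed log records into Elasticsearch (hypothetical index/fields).
      import json
      import urllib.request

      def bulk_index(records, index="panda-logs", host="http://localhost:9200"):
          lines = []
          for rec in records:
              lines.append(json.dumps({"index": {"_index": index}}))
              lines.append(json.dumps(rec))
          body = ("\n".join(lines) + "\n").encode()   # _bulk requires a trailing newline
          req = urllib.request.Request(host + "/_bulk", data=body,
                                       headers={"Content-Type": "application/x-ndjson"},
                                       method="POST")
          return json.loads(urllib.request.urlopen(req).read())

      if __name__ == "__main__":
          bulk_index([{"level": "INFO", "component": "JobDispatcher", "msg": "job assigned"}])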

  4. Comments Regarding the Binary Power Law for Heterogeneity of Disease Incidence

    USDA-ARS?s Scientific Manuscript database

    The binary power law (BPL) has been successfully used to characterize heterogeneity (overdispersion, or small-scale aggregation) of disease incidence for many plant pathosystems. With the BPL, the log of the observed variance is a linear function of the log of the theoretical variance for a binomial...
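
    For context, the binary power law is usually written on the log scale roughly as below; this is the standard textbook form, not a quotation from the record:

      \log(V_{\mathrm{obs}}) = \log(A) + b\,\log(V_{\mathrm{bin}}),
      \qquad V_{\mathrm{bin}} = \frac{p(1-p)}{n}

    where p is the mean incidence, n the number of plants per sampling unit, and A and b are estimated parameters; A = b = 1 corresponds to a random (binomial) pattern, while larger values indicate aggregation.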

  5. 20 CFR 658.414 - Referral of non-JS-related complaints.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... applicable, were referred on the complaint log specified in § 658.410(c)(1). The JS official shall also prepare and keep the file specified in § 658.410(c)(3) for the complaints filed pursuant to paragraph (a...

  6. 78 FR 49506 - E.ON Global Commodities North America LLC; Supplemental Notice That Initial Market-Based Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  7. 78 FR 63977 - Enable Bakken Crude Services, LLC; Notice of Request For Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... person desiring to intervene or to protest in this proceedings must file in accordance with Rules 211 and... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  8. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  9. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  10. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real- Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, the fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from the TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. For this patient, log files of the three-dimensional coordinates of the fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E_log). The fiducial marker used as an internal surrogate was automatically extracted from the cine EPID images by in-house software based on a template-matching algorithm. The differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating using the RTRT showed mean ± S.D. of the 95th percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and of E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining the TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.

  11. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from log files. Results showed that the dose rate varied among the six available dose rates. Gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from log files. A quality assurance procedure should be carried out for the VMAT-related parameters.

  12. 75 FR 15479 - Self-Regulatory Organizations; The National Securities Clearing Corporation; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ... SECURITIES AND EXCHANGE COMMISSION [Release No. 34-61762; File No. SR-NSCC-2010-02] Self-Regulatory Organizations; The National Securities Clearing Corporation; Notice of Filing and Immediate Effectiveness of Proposed Rule Change to Aggregate Obligations in Certain Securities Transactions Designated for Settlement on a Trade-for-Trade Basis Marc...

  13. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points to it. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes. The analyzer gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available and its advantages versus standalone software are: i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools like twitter bootstrap, jQuery, flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee at Varian Medical Systems, Palo Alto.

  14. The Feasibility of Using Cluster Analysis to Examine Log Data from Educational Video Games. CRESST Report 790

    ERIC Educational Resources Information Center

    Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.

    2011-01-01

    Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…

  15. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  16. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  17. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  18. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  19. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  20. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  1. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  2. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  3. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).
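
    A simplified sketch of the planned-versus-delivered comparison underlying such log-file analyses; the assumed layout (paired planned/actual leaf-position columns per snapshot) is illustrative only and does not reproduce the actual Varian DynaLog format.

      # Sketch: per-leaf RMS deviation between planned and actual MLC positions.
      import numpy as np

      def leaf_rms_error(snapshots):
          """snapshots: (N, 2*L) array; columns alternate planned, actual per leaf."""
          planned = snapshots[:, 0::2]
          actual = snapshots[:, 1::2]
          return np.sqrt(np.mean((actual - planned) ** 2, axis=0))

      demo = np.array([[10.0, 10.2, -10.0, -9.9],
                       [12.0, 11.8, -12.0, -12.1]])
      print(leaf_rms_error(demo))   # per-leaf RMS error in the same units as the input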

  4. Ultraviolet (UV)-reflective paint with ultraviolet germicidal irradiation (UVGI) improves decontamination of nosocomial bacteria on hospital room surfaces.

    PubMed

    Jelden, Katelyn C; Gibbs, Shawn G; Smith, Philip W; Hewlett, Angela L; Iwen, Peter C; Schmid, Kendra K; Lowe, John J

    2017-06-01

    An ultraviolet germicidal irradiation (UVGI) generator (the TORCH, ClorDiSys Solutions, Inc.) was used to compare the disinfection of surface coupons (plastic from a bedrail, stainless steel, and chrome-plated light switch cover) in a hospital room with walls coated with ultraviolet (UV)-reflective paint (Lumacept) or standard paint. Each surface coupon was inoculated with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus faecalis (VRE), placed at 6 different sites within a hospital room coated with UV-reflective paint or standard paint, and treated by 10 min UVC exposure (UVC dose of 0-688 mJ/cm² between sites with standard paint and 0-553 mJ/cm² with UV-reflective paint) in 8 total trials. Aggregated MRSA concentrations on plastic bedrail surface coupons were reduced on average by 3.0 log10 (1.8 log10 Geometric Standard Deviation [GSD]) with standard paint and 4.3 log10 (1.3 log10 GSD) with UV-reflective paint (p = 0.0005) with no significant reduction differences between paints on stainless steel and chrome. Average VRE concentrations were reduced by ≥4.9 log10 (<1.2 log10 GSD) on all surface types with UV-reflective paint and ≤4.1 log10 (<1.7 log10 GSD) with standard paint (p < 0.05). At 5 aggregated sites directly exposed to UVC light, MRSA concentrations on average were reduced by 5.2 log10 (1.4 log10 GSD) with standard paint and 5.1 log10 (1.2 log10 GSD) with UV-reflective paint (p = 0.017) and VRE by 4.4 log10 (1.4 log10 GSD) with standard paint and 5.3 log10 (1.1 log10 GSD) with UV-reflective paint (p < 0.0001). At one indirectly exposed site on the opposite side of the hospital bed from the UVGI generator, MRSA concentrations on average were reduced by 1.3 log10 (1.7 log10 GSD) with standard paint and 4.7 log10 (1.3 log10 GSD) with UV-reflective paint (p < 0.0001) and VRE by 1.2 log10 (1.5 log10 GSD) with standard paint and 4.6 log10 (1.1 log10 GSD) with UV-reflective paint (p < 0.0001). Coating hospital room walls with UV-reflective paint enhanced UVGI disinfection of nosocomial bacteria on various surfaces compared to standard paint, particularly at a surface placement site indirectly exposed to UVC light.
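
    For orientation, a log10 reduction as reported above is conventionally computed from the organism concentrations recovered before (C_0) and after (C) exposure; this is the standard definition, not a restatement of the study's own methods:

      \text{log}_{10}\ \text{reduction} = \log_{10}(C_0) - \log_{10}(C) = \log_{10}\!\left(\frac{C_0}{C}\right)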

  5. 40 CFR 60.288a - Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test to generate a submission package file, which documents performance test data. You must then submit the file generated by the ERT through the EPA's Compliance and Emissions Data Reporting Interface (CEDRI), which can be accessed by logging in to the EPA's Central Data Exchange (CDX) (https://cdx.epa...

  6. 27 CFR 46.107 - Penalty for failure to file return or to pay tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... before the date for filing prescribed in § 46.103 must pay, in addition to the tax, a delinquency penalty....109). The delinquency penalty for failure to file the return on or before the last date prescribed... during which the delinquency continues, but not more than 25 percent in the aggregate. (b) Failure to pay...

  7. Developing a Complete and Effective ACT-R Architecture

    DTIC Science & Technology

    2008-01-01

    of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an

  8. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  9. Use patterns of health information exchange through a multidimensional lens: conceptual framework and empirical validation.

    PubMed

    Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior

    2014-12-01

    Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.
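
    A minimal sketch of how such usage archetypes might be extracted from session-level features; the five feature columns follow the dimensions named above, but the toy values, scaling, cluster count, and use of k-means are illustrative assumptions rather than the paper's actual analytical method.

      # Sketch: cluster per-session feature vectors into usage archetypes.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      sessions = np.array([
          # volume, diversity, granularity, duration_min, content_score (toy values)
          [12, 3, 0.8,  5.0, 0.2],
          [40, 9, 0.3, 18.0, 0.7],
          [ 5, 1, 0.9,  2.0, 0.1],
          [38, 8, 0.4, 20.0, 0.6],
          [11, 2, 0.7,  4.5, 0.3],
      ])

      X = StandardScaler().fit_transform(sessions)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print(labels)   # archetype assignment per session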

  10. 12 CFR 1402.27 - Aggregating requests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Information § 1402.27 Aggregating requests. A requester may not file multiple requests at the same time, each... in concert, is attempting to break a request down into a series of requests for the purpose of... reasonable is the time period over which the requests have occurred. ...

  11. 12 CFR 1402.27 - Aggregating requests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Information § 1402.27 Aggregating requests. A requester may not file multiple requests at the same time, each... in concert, is attempting to break a request down into a series of requests for the purpose of... reasonable is the time period over which the requests have occurred. ...

  12. 12 CFR 1402.27 - Aggregating requests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Information § 1402.27 Aggregating requests. A requester may not file multiple requests at the same time, each... in concert, is attempting to break a request down into a series of requests for the purpose of... reasonable is the time period over which the requests have occurred. ...

  13. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  14. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  15. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  16. 29 CFR 1960.28 - Employee reports of unsafe or unhealthful working conditions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... report of an existing or potential unsafe or unhealthful working condition should be recorded on a log maintained at the establishment. If an agency finds it inappropriate to maintain a log of written reports at... sequentially numbered case file, coded for identification, should be assigned for purposes of maintaining an...

  17. 20 CFR 658.422 - Handling of non-JS-related complaints by the Regional Administrator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... non-JS-related complaints alleging violations of employment related laws shall be logged. The... which the complainant (or complaint) was referred on a complaint log, similar to the one described in § 658.410(c)(1). The appropriate regional official shall also prepare and keep the file specified in...

  18. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  19. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
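
    By way of illustration only (this is not LogScope's specification language), a toy rule in the same spirit checks that every dispatched command is eventually reported complete:

      # Sketch: a toy log rule -- every ("DISPATCH", cmd) must be followed by
      # a ("COMPLETE", cmd) before the log ends. Event names are hypothetical.
      def check_dispatch_complete(events):
          pending = set()
          for name, cmd in events:        # events: iterable of (event_name, command_id)
              if name == "DISPATCH":
                  pending.add(cmd)
              elif name == "COMPLETE":
                  pending.discard(cmd)
          return pending                  # commands dispatched but never completed

      log = [("DISPATCH", "PICTURE"), ("COMPLETE", "PICTURE"), ("DISPATCH", "DRIVE")]
      print(check_dispatch_complete(log))   # -> {'DRIVE'}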

  20. National Geocoding Converter File 1 : Volume 3. Montana to Wyoming.

    DOT National Transportation Integrated Search

    1974-01-01

    This file contains a record for each county, county equivalent (as defined by the Census Bureau), SMSA county segment and SPLC county segment in the U.S. A record identifies for an area all major county codes and the associated county aggregate codes

  1. National Geocoding Converter File 1 : Volume 1. Structure & Content.

    DOT National Transportation Integrated Search

    1974-01-01

    This file contains a record for each county, county equivalent (as defined by the Census Bureau), SMSA county segment and SPLC county segment in the U.S. A record identifies for an area all major county codes and the associated county aggregate codes

  2. 78 FR 3483 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing of Advance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-16

    ... Harris Bank N.A. (``Lender'') having a maximum aggregate principal loan amount not to exceed $25 million... February 6, 2013. By the Commission. Kevin O'Neill, Deputy Secretary. [FR Doc. 2013-00795 Filed 1-15-13; 8...

  3. Network issues for large mass storage requirements

    NASA Technical Reports Server (NTRS)

    Perdue, James

    1992-01-01

    File servers and supercomputing environments need high-performance networks to balance the I/O requirements seen in today's demanding computing scenarios. UltraNet is one solution that permits both high aggregate transfer rates and high task-to-task transfer rates, as demonstrated in actual tests. UltraNet provides this capability as both a server-to-server and server-to-client access network, giving the supercomputing center the following advantages: it provides the highest-performance transport-level connections (to 40 MBytes/sec effective rates); matches the throughput of emerging high-performance disk technologies such as RAID, parallel head transfer devices, and software striping; supports standard network and file system applications, such as FTP, rcp, and rdump, through a sockets-based application program interface; supports access to the Network File System (NFS) and large aggregate bandwidth for heavy NFS usage; provides access to a distributed, hierarchical data server capability using the DISCOS UniTree product; and supports file server solutions available from multiple vendors, including Cray, Convex, Alliant, FPS, IBM, and others.

  4. The ALFA (Activity Log Files Aggregation) toolkit: a method for precise observation of the consultation.

    PubMed

    de Lusignan, Simon; Kumarapeli, Pushpa; Chan, Tom; Pflug, Bernhard; van Vlymen, Jeremy; Jones, Beryl; Freeman, George K

    2008-09-08

    There is a lack of tools to evaluate and compare Electronic patient record (EPR) systems to inform a rational choice or development agenda. To develop a tool kit to measure the impact of different EPR system features on the consultation. We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) code and measure activities, including computer use and verbal interactions; (3) automate the capture of nonverbal interactions; (4) aggregate multiple observations into a single navigable output; and (5) produce an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and to time stamp speech. The transcript is then typed within this time stamp. Although we managed to capture body language using pattern recognition software, we were unable to use this data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). Nonparametric comparison of EMIS LV with the other systems showed a significant difference, with EMIS PCS (Egton Medical Information Systems Limited, Leeds, UK) (P = .007), iSoft Synergy (iSOFT, Banbury, UK) (P = .014), and INPS Vision (INPS, London, UK) (P = .006) facilitating faster coding. In contrast, prescribing was fastest with EMIS LV (mean 23.7 s, 95% CI 20.5-26.8), but nonparametric comparison showed no statistically significant difference. UML sequence diagrams showed that the simplest BP recording interface was not the easiest to use, as users spent longer navigating or looking up previous blood pressures separately. Complex interfaces with free-text boxes left clinicians unsure of what to add. The ALFA method allows the precise observation of the clinical consultation. It enables rigorous comparison of core elements of EPR systems. Pilot data suggests its capacity to demonstrate differences between systems. Its outputs could provide the evidence base for making more objective choices between systems.
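
    The aggregation step described above can be pictured with a small Python sketch (the channel names and fields are hypothetical, not ALFA's actual schema): several time-stamped observation channels are merged chronologically and exported as a single XML document.

        # Illustrative sketch of merging observation channels into one XML export.
        import xml.etree.ElementTree as ET

        def aggregate_channels(channels):
            """channels: dict mapping channel name to a list of (seconds, value) events."""
            merged = sorted(
                (ts, name, value)
                for name, events in channels.items()
                for ts, value in events
            )
            root = ET.Element("consultation")
            for ts, name, value in merged:
                ev = ET.SubElement(root, "event", channel=name, t=f"{ts:.2f}")
                ev.text = str(value)
            return ET.tostring(root, encoding="unicode")

        print(aggregate_channels({
            "keyboard": [(1.2, "B"), (1.4, "P")],
            "speech":   [(0.8, "GP asks about blood pressure")],
        }))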

  5. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    DTIC Science & Technology

    2001-11-01

    that there were no target misses. The Hellfire missile does not have a depleted uranium head. 2.2.2.3 Tank movement During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log

  6. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos normally offers only character user interfaces (CUI) under operating systems such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be expert at image processing of electron micrographs and also need some knowledge of computer science, yet not everyone who requires Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with graphical user interfaces (GUI) by integrating a web browser. The advantage of using a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and each command is used differently. Since the beginning of development, Eos has managed its user interface through an interface definition file called "OptionControlFile", written in CSV (Comma-Separated Value) format; each command has an "OptionControlFile" that records the information needed to generate its interface and usage. The GUI system we developed, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of web forms with functions for execution, image preview, and file upload to a web server. The system can thus execute Eos commands with the options specific to each command and carry out image analysis. Two problems remained: image file formats for visualization and a workspace for analysis. The image file format information is useful for checking whether an input/output file is correct, and a common workspace for analysis is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we developed two types of system. The first uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform under development that works in a heterogeneous distributed environment. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions that define image-processing workflows. PIONE runs each image-processing step on a suitable computer, following the defined rules. PIONE supports interactive manipulation, so a user can try a command with various setting values. In this setting, we contribute auto-generation of the GUI for a PIONE workflow. As an advanced function, we developed a module to log user actions. The logs include information such as the setting values used in image processing and the sequence of commands. Used effectively, these logs bring many advantages: for example, when an expert discovers some image-processing know-how, other users can share the logs containing that know-how, and by analyzing the logs we may obtain recommended workflows for image analysis. We have also developed the system infrastructure needed to implement a social platform of image processing for electron microscopists. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. 25 CFR 215.23 - Cooperation between superintendent and district mining supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... notices, reports, drill logs, maps, and records, and all other information relating to mining operations required by said regulations to be submitted by lessees, and shall maintain a file thereof for the superintendent. (b) The files of the Geological Survey supervisor relating to lead and zinc leases of Quapaw...

  8. Agentless Cloud-Wide Monitoring of Virtual Disk State

    DTIC Science & Technology

    2015-10-01

    packages include Apache, MySQL, PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL, PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many

  9. Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).

    DTIC Science & Technology

    1985-01-01

    (Garbled OCR excerpt of Ada declarations from the proposed MIL-STD CAIS, 31 January 1985, covering queue and file node operations and a node iterator procedure.)

  10. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  11. 49 CFR Appendix A to Part 225 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... $1,000 $2,000 225.11 Reports of accidents/incidents 2,500 5,000 225.12(a): Failure to file Railroad... noncompliance: (1) a missing or incomplete log entry for a particular employee's injury or illness; or (2) a missing or incomplete log record for a particular rail equipment accident or incident. Each day a...

  12. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  13. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  14. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  15. Consistency of Students' Pace in Online Learning

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2009-01-01

    The purpose of this study is to investigate the consistency of students' behavior regarding their pace of actions over sessions within an online course. Pace in a session is defined as the number of logged actions divided by session length (in minutes). Log files of 6,112 students were collected, and datasets were constructed for examining pace…

  16. Streamlining CASTOR to manage the LHC data torrent

    NASA Astrophysics Data System (ADS)

    Lo Presti, G.; Espinal Curull, X.; Cano, E.; Fiorini, B.; Ieri, A.; Murray, S.; Ponce, S.; Sindrilaru, E.

    2014-06-01

    This contribution describes the evolution of the main CERN storage system, CASTOR, as it manages the bulk data stream of the LHC and other CERN experiments, achieving over 90 PB of stored data by the end of LHC Run 1. This evolution was marked by the introduction of policies to optimize the tape sub-system throughput, going towards a cold storage system where data placement is managed by the experiments' production managers. More efficient tape migrations and recalls have been implemented and deployed where bulk meta-data operations greatly reduce the overhead due to small files. A repack facility is now integrated in the system and it has been enhanced in order to automate the repacking of several tens of petabytes, required in 2014 in order to prepare for the next LHC run. Finally the scheduling system has been evolved to integrate the internal monitoring. To efficiently manage the service a solid monitoring infrastructure is required, able to analyze the logs produced by the different components (about 1 kHz of log messages). A new system has been developed and deployed, which uses a transport messaging layer provided by the CERN-IT Agile Infrastructure and exploits technologies including Hadoop and HBase. This enables efficient data mining by making use of MapReduce techniques, and real-time data aggregation and visualization. The outlook for the future is also presented. Directions and possible evolution will be discussed in view of the restart of data taking activities.
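
    As a minimal illustration of the kind of aggregation such a log-mining pipeline performs (an in-process Python stand-in for a MapReduce job, not CASTOR's actual code), the sketch below counts messages per daemon and severity; the field names are assumptions.

        # Count log messages per (daemon, severity), the kind of summary a
        # MapReduce job over the message stream would produce.
        from collections import Counter

        def aggregate(messages):
            """messages: iterable of dicts such as {"daemon": "stagerd", "lvl": "Error"}."""
            return Counter((m["daemon"], m["lvl"]) for m in messages)

        sample = [
            {"daemon": "stagerd", "lvl": "Info"},
            {"daemon": "stagerd", "lvl": "Error"},
            {"daemon": "rtcpclientd", "lvl": "Info"},
        ]
        for (daemon, lvl), n in aggregate(sample).items():
            print(daemon, lvl, n)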

  17. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the

  18. Sawmill: A Logging File System for a High-Performance RAID Disk Array

    DTIC Science & Technology

    1995-01-01

    from limiting disk performance, new controller architectures connect the disks directly to the network so that data movement bypasses the file server...These developments raise two questions for file systems: how to get the best performance from a RAID, and how to use such a controller architecture ...the RAID-II storage system; this architecture provides a fast data path that moves data rapidly among the disks, high-speed controller memory, and the

  19. Dataset of aggregate producers in New Mexico

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  20. New Mexico aggregate production sites, 1997-1999

    USGS Publications Warehouse

    Orris, Greta J.

    2000-01-01

    This report presents data, including latitude and longitude, for aggregate sites in New Mexico that were believed to be active in the period 1997-1999. The data are presented in paper form in Part A of this report and as Microsoft Excel 97 and Data Interchange Format (DIF) files in Part B. The work was undertaken as part of the effort to update information for the National Atlas. This compilation includes data from: the files of U.S. Geological Survey (USGS); company contacts; the New Mexico Bureau of Mines and Mineral Resources, New Mexico Bureau of Mine Inspection, and the Mining and Minerals Division of the New Mexico Energy, Minerals and Natural Resources Department (Hatton and others, 1998); the Bureau of Land Management Information; and direct communications with some of the aggregate operators. Additional information on most of the sites is available in Hatton and others (1998).

  1. 32 CFR 776.80 - Initial screening and Rules Counsel.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Director, JA Division, HQMC, to JAR. (b) JAG(13) and JAR shall log all complaints received and will ensure... within 30 days of the date of its return, the Rules Counsel may close the file without further action... action to close the file. (2) Complaints that comply with the requirements shall be further reviewed by...

  2. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  3. Online Courses Assessment through Measuring and Archetyping of Usage Data

    ERIC Educational Resources Information Center

    Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros

    2016-01-01

    Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data that usually remains unexploited. A new methodology is proposed in order to analyse these data on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…

  4. SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Shimizu, E; Matsunaga, K

    2014-06-01

    Purpose: A successful VMAT plan delivery includes precise modulation of dose rate, gantry rotation, and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze because they vary with the MU delivered and the leaf number. In this study, we calculated an integrated fluence error image (IFEI) from log files and evaluated plan quality in the area scanned by all MLC leaves and by individual leaves. Methods: The log file reported the expected and actual positions of the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on an Elekta Synergy. These data were imported into in-house software developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and the dose fraction at each time point. The IFEI was obtained by summing the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) of the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5, and 3 MU were 98.5, 86.7, and 68.1%, and 95% of the area had errors of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms values were 2.1-3.0 and 3.1-4.0 MU, the areas with errors below 10, 5, and 3 MU were 97.6-99.5, 81.7-89.5, and 51.2-72.8%, and 95% of the area had errors of less than 6.6-8.2 MU. Conclusion: Analysis of the IFEI reconstituted from the log file provided detailed information about the delivery in the area scanned by all MLC leaves and by individual leaves.
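
    The integration step described in the Methods can be sketched with numpy (an illustrative sketch, not the authors' in-house software): the per-pixel absolute differences between expected and actual fluence images are accumulated over all time points. Array shapes and the fluence reconstruction itself are assumed.

        # Accumulate |expected - actual| fluence over all sampled time points.
        import numpy as np

        def integrated_fluence_error(expected_frames, actual_frames):
            """Both arguments: arrays of shape (n_timepoints, ny, nx) in MU."""
            expected = np.asarray(expected_frames, dtype=float)
            actual = np.asarray(actual_frames, dtype=float)
            return np.abs(expected - actual).sum(axis=0)   # per-pixel IFEI in MU

        ifei = integrated_fluence_error(np.ones((3, 4, 4)), 1.1 * np.ones((3, 4, 4)))
        print(ifei.mean(), ifei.max())   # summary statistics like those reported above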

  5. Example MODIS Global Cloud Optical and Microphysical Properties: Comparisons between Terra and Aqua

    NASA Technical Reports Server (NTRS)

    Hubanks, P. A.; Platnick, S.; King, M. D.; Ackerman, S. A.; Frey, R. A.

    2003-01-01

    MODIS observations from the NASA EOS Terra spacecraft (launched in December 1999, 1030 local time equatorial crossing) have provided a unique data set of Earth observations. With the launch of the NASA Aqua spacecraft in May 2002 (1330 local time), two MODIS daytime (sunlit) and nighttime observations are now available in a 24-hour period, allowing for some measure of diurnal variability. We report on an initial analysis of several operational global (Level-3) cloud products from the two platforms. The MODIS atmosphere Level-3 products, which include clear-sky and aerosol products in addition to cloud products, are available as three separate files providing daily, eight-day, and monthly aggregations; each temporal aggregation is spatially aggregated to a 1 degree grid. The files contain approximately 600 statistical datasets (from simple means and standard deviations to 1- and 2-dimensional histograms). Operational cloud products include detection (cloud fraction), cloud-top properties, and daytime-only cloud optical thickness and particle effective radius for both water and ice clouds. We will compare example global Terra and Aqua cloud fraction, optical thickness, and effective radius aggregations.
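
    The spatial aggregation idea can be sketched in Python (a toy stand-in for the operational Level-3 processing, not the MODIS production code): pixel-level values are averaged onto a 1-degree global grid.

        # Average (lat, lon, value) samples onto a 1-degree grid; cells with no
        # observations come out as NaN.
        import numpy as np

        def to_one_degree_grid(lats, lons, values):
            grid_sum = np.zeros((180, 360))
            grid_n = np.zeros((180, 360))
            rows = np.clip((np.asarray(lats) + 90.0).astype(int), 0, 179)
            cols = np.clip((np.asarray(lons) + 180.0).astype(int), 0, 359)
            np.add.at(grid_sum, (rows, cols), values)
            np.add.at(grid_n, (rows, cols), 1)
            with np.errstate(invalid="ignore", divide="ignore"):
                return grid_sum / grid_n

        grid = to_one_degree_grid([10.2, 10.7, -35.1], [120.4, 120.9, 18.0], [0.3, 0.5, 0.9])
        print(np.nanmean(grid))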

  6. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  7. 9 CFR 381.204 - Marking of poultry products offered for entry; official import inspection marks and devices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Import Inspection Division, is on file at the import inspection facility where the inspection is to be... stamping log containing the following information for each lot of product: the date of inspection, the... container marks, and the MP-410 number covering the product to be inspected. The daily stamping log must be...

  8. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  9. PIYAS-proceeding to intelligent service oriented memory allocation for flash based data centric sensor devices in wireless sensor networks.

    PubMed

    Rizvi, Sanam Shahla; Chung, Tae-Sun

    2010-01-01

    Flash memory has become an increasingly widespread storage medium for modern wireless devices because of its effective characteristics such as non-volatility, small size, light weight, fast access speed, shock resistance, high reliability, and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth, and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge, and send schemes, an efficient and reliable file system that respects sensor node constraints is highly desirable. In this paper, we propose a novel log-structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM space by keeping the memory mapping information very small, and to provide high query response throughput by allocating memory to the sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any previous scheme. We also propose effective garbage collection and wear-leveling schemes. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
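
    A minimal, hypothetical sketch of the log-structured idea (not the PIYAS implementation): readings are appended sequentially to flash pages while only a small in-RAM mapping from keys to page slots is kept, so that mounting and lookups stay cheap. The page size and data layout are assumptions.

        # Append-only log over simulated NAND pages with a small in-RAM index.
        class TinyLogStore:
            PAGE_SIZE = 4                  # records per page (toy value)

            def __init__(self):
                self.pages = []            # flushed pages
                self.open_page = []        # current write buffer
                self.index = {}            # key -> (page_no, slot); the only RAM-resident map

            def append(self, key, reading):
                self.open_page.append((key, reading))
                self.index[key] = (len(self.pages), len(self.open_page) - 1)
                if len(self.open_page) == self.PAGE_SIZE:   # page full: write it sequentially
                    self.pages.append(self.open_page)
                    self.open_page = []

            def read(self, key):
                page_no, slot = self.index[key]
                page = self.pages[page_no] if page_no < len(self.pages) else self.open_page
                return page[slot][1]

        store = TinyLogStore()
        for i in range(6):
            store.append(f"t{i}", 20.0 + i)
        print(store.read("t5"))            # 25.0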

  10. Visual behavior characterization for intrusion and misuse detection

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah

    2001-05-01

    As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for analyzing network and computer log information visually based on the analysis of the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing the users' behavior, however, is much more adaptable and difficult to counteract.

  11. Parallel file system with metadata distributed across partitioned key-value store c

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).
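
    A hedged Python sketch of the partitioning idea (not the patented implementation and not the MDHIM API): metadata records for sub-file writes, here an offset, length, and storage target, are placed into per-node partitions chosen by hashing the key, mimicking a key-value store spread across compute nodes.

        # Hash-partitioned metadata store for sub-file write records.
        from hashlib import md5

        NUM_PARTITIONS = 4                                   # stand-in for compute nodes
        partitions = [dict() for _ in range(NUM_PARTITIONS)]

        def partition_for(key: str) -> int:
            return int(md5(key.encode()).hexdigest(), 16) % NUM_PARTITIONS

        def put_metadata(shared_file, logical_offset, length, target):
            key = f"{shared_file}:{logical_offset}"
            partitions[partition_for(key)][key] = {"length": length, "target": target}

        def get_metadata(shared_file, logical_offset):
            key = f"{shared_file}:{logical_offset}"
            return partitions[partition_for(key)].get(key)

        put_metadata("checkpoint.out", 0, 1048576, "oss3/obj0017")
        print(get_metadata("checkpoint.out", 0))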

  12. Ground-water data for the Hanna and Carbon basins, south-central Wyoming, through 1980

    USGS Publications Warehouse

    Daddow, P.B.

    1986-01-01

    Groundwater resources in the Hanna and Carbon Basins of Wyoming were assessed in a study from 1974 through 1980 because of the development of coal mining in the area. Data collected from 105 wells during that study, including well-completion records, lithologic logs, and water levels, are presented. The data are from stock wells and from coal-test holes completed as observation wells by the U.S. Geological Survey. The data are mostly from mined coal-bearing formations: the Tertiary Hanna Formation and the Tertiary and Cretaceous Ferris Formation. Well-completion data and lithologic logs were collected on-site during drilling of the wells or from U.S. Geological Survey files, company records, Wyoming State Engineer well-permit files, and published reports. (USGS)

  13. VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)

    NASA Astrophysics Data System (ADS)

    Association of Universities For Research in Astronomy

    2018-01-01

    This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).

  14. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  15. A new simplified method for measuring the permeability characteristics of highly porous media

    NASA Astrophysics Data System (ADS)

    Qin, Yinghong; Zhang, Mingyi; Mei, Guoxiong

    2018-07-01

    Fluid flow through highly porous media is important in a variety of science and technology fields, including hydrology, chemical engineering, convection in porous media, and others. While many methods are available to measure the permeability of tight solid materials such as concrete and rock, techniques for measuring the permeability of highly porous media (such as gravel, aggregated soils, and crushed rock) are limited. This study proposes a new simplified method for measuring the permeability of highly porous media with a permeability of 10^-8 to 10^-4 m^2, using a Venturi tube to gauge the gas flow rate through the sample. Using crushed rocks and glass beads as the test media, we measure the permeability and inertial resistance factor of six types of single-size aggregate columns. We compare the test results with the published permeability and inertial resistance factor of crushed rock and of glass beads. We found that, in a log-log graph, the permeability and inertial resistance factor of a single-size aggregate heap increase linearly with the mean diameter of the aggregate. We speculate that the proposed simplified method is suitable for efficiently testing the permeability and inertial resistance factor of a variety of porous media with an intrinsic permeability of 10^-8 to 10^-4 m^2.
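
    For context, the permeability k and the inertial resistance factor referred to above are, in the standard Forchheimer description of non-Darcy flow, the two coefficients of the pressure-gradient relation (a sketch assuming that standard form, not necessarily the exact model fitted in the article):

        -\frac{dP}{dx} = \frac{\mu}{k}\, v + \beta\, \rho\, v^{2}

    where P is the pressure, \mu the dynamic viscosity, \rho the gas density, v the superficial velocity, and \beta the inertial resistance factor; Darcy's law is recovered when the quadratic term is negligible.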

  16. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the needed data to achieve this goal. This logged data can be recorded in several formats; we consider here IHE-ATNA (Integrating the healthcare enterprise-Audit Trail and Node Authentication) as log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
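
    The extraction function can be sketched with rdflib in Python (the atna: vocabulary below is invented for illustration and is not the IHE-ATNA ontology used in the paper): audit events loaded into an RDF graph are filtered with a SPARQL query.

        # Load a toy audit event into an RDF graph and filter it with SPARQL.
        from rdflib import Graph, Literal, Namespace, RDF

        ATNA = Namespace("http://example.org/atna#")   # hypothetical vocabulary
        g = Graph()
        event = ATNA.event1
        g.add((event, RDF.type, ATNA.AuditEvent))
        g.add((event, ATNA.userID, Literal("dr.smith")))
        g.add((event, ATNA.action, Literal("read")))

        results = g.query("""
            PREFIX atna: <http://example.org/atna#>
            SELECT ?e ?user WHERE {
                ?e a atna:AuditEvent ;
                   atna:userID ?user ;
                   atna:action "read" .
            }
        """)
        for row in results:
            print(row.e, row.user)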

  17. A Prototype Implementation of a Time Interval File Protection System in Linux

    DTIC Science & Technology

    2006-09-01

    when a user logs in, the /etc/passwd file is read by the system to get the user's home directory. The user's login shell then changes the directory...and don. • Users can be added with the command: # useradd -m <username> • Set the password by: # passwd <username> • Make a copy of the

  18. 77 FR 59234 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing of Proposed Rule Change To Amend...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-26

    ... as the product of (i) the Founding Firm aggregate target market share for such measurement period... exceeded its ``Individual Target'' during the measurement period. A Founding Firm's Individual Target is its pro rata portion of an aggregate Founding Firm target contribution to the annual volume of the...

  19. 31 CFR 103.22 - Reports of transactions in currency.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... exemption would not apply); and (D) Jackpots from slot machines or video lottery terminals. (c) Aggregation...'s obligation, to file a report required by § 103.18 with respect to any transaction, including any... transaction that is described in § 103.18(a)(2)(i), (ii), or (iii), or relieves a bank of any reporting or...

  20. Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  1. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)

  2. The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching

    ERIC Educational Resources Information Center

    Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix

    2007-01-01

    The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…

  3. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  4. Family Child Care Inventory-Keeper: The Complete Log for Depreciating and Insuring Your Property. Redleaf Business Series.

    ERIC Educational Resources Information Center

    Copeland, Tom

    Figuring depreciation can be the most difficult aspect of filing tax returns for a family child care program. This inventory log for family child care programs is designed to assist in keeping track of the furniture, appliances, and other property used in the child care business; once these items have been identified, they can be deducted as…

  5. Measuring firm size distribution with semi-nonparametric densities

    NASA Astrophysics Data System (ADS)

    Cortés, Lina M.; Mora-Valencia, Andrés; Perote, Javier

    2017-11-01

    In this article, we propose a new methodology based on a (log) semi-nonparametric (log-SNP) distribution that nests the lognormal and enables better fits in the upper tail of the distribution through the introduction of new parameters. We test the performance of the lognormal and log-SNP distributions capturing firm size, measured through a sample of US firms in 2004-2015. Taking different levels of aggregation by type of economic activity, our study shows that the log-SNP provides a better fit of the firm size distribution. We also formally introduce the multivariate log-SNP distribution, which encompasses the multivariate lognormal, to analyze the estimation of the joint distribution of the value of the firm's assets and sales. The results suggest that sales are a better firm size measure, as indicated by other studies in the literature.
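
    As a sketch of the construction (assuming a Gram-Charlier-type SNP family; the article's exact specification may differ), write firm size as S = e^X and give X the density

        f_X(x) = \frac{1}{\sigma}\,\phi\!\left(\frac{x-\mu}{\sigma}\right)\left[1 + \sum_{s=3}^{n} d_s\, H_s\!\left(\frac{x-\mu}{\sigma}\right)\right],

    where \phi is the standard normal density and H_s are Hermite polynomials, with the bracket constrained to remain positive. Setting all d_s = 0 makes X normal and hence S lognormal, which is the nesting referred to above; the additional d_s terms add flexibility in the upper tail.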

  6. 75 FR 40010 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ... Proposed Rule Change 1. Purpose Currently, the Exchange aggregates all of an ATP Holder's volume at the trading permit level for purposes of the Firm Proprietary Manual tiers. Recently, certain ATP Holders have... this filing, the Exchange proposes to allow its ATP Holders to elect to have their Firm Proprietary...

  7. 76 FR 54267 - Self-Regulatory Organizations; EDGA Exchange, Inc.; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-31

    ... accommodates diverse business models and trading preferences. The Exchange utilizes technology to aggregate and... at least five business days prior to the date of filing of the proposed rule change, or such shorter... Room, 100 F Street, NE., Washington, DC 20549, on official business days between the hours of 10 a.m...

  8. Modifications to the accuracy assessment analysis routine SPATL to produce an output file

    NASA Technical Reports Server (NTRS)

    Carnes, J. G.

    1978-01-01

    SPATL is an analysis program in the Accuracy Assessment Software System that makes comparisons between ground truth information and dot labeling for an individual segment. In order to facilitate the aggregation of this information, SPATL was modified to produce a disk output file containing the necessary information about each segment.

  9. 26 CFR 1.332-6 - Records to be kept and information to be filed with return.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... corporation during the current tax year; (3) The aggregate fair market value and basis, determined immediately... should specifically include information regarding the amount, basis, and fair market value of all... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Records to be kept and information to be filed...

  10. Wister, CA Downhole and Seismic Data

    DOE Data Explorer

    Akerley, John

    2010-12-18

    This submission contains Downhole geophysical logs associated with Wister, CA Wells 12-27 and 85-20. The logs include Spontaneous Potential (SP), HILT Caliper (HCAL), Gamma Ray (GR), Array Induction (AIT), and Neutron Porosity (NPOR) data. Also included are a well log, Injection Test, Pressure Temperature Spinner log, shut in temperature survey, a final well schematic, and files about the well's location and drilling history. This submission also contains data from a three-dimensional (3D) multi-component (3C) seismic reflection survey on the Wister Geothermal prospect area in the northern portion of the Imperial Valley, California. The Wister seismic survey area was 13.2 square miles. (Resistivity image logs (Schlumberger FMI) in 85-20 indicate that maximum horizontal stress (Shmax) is oriented NNE but that open fractures are oriented suboptimally).

  11. The ALFA (Activity Log Files Aggregation) Toolkit: A Method for Precise Observation of the Consultation

    PubMed Central

    2008-01-01

    Background There is a lack of tools to evaluate and compare Electronic patient record (EPR) systems to inform a rational choice or development agenda. Objective To develop a tool kit to measure the impact of different EPR system features on the consultation. Methods We first developed a specification to overcome the limitations of existing methods. We divided this into work packages: (1) developing a method to display multichannel video of the consultation; (2) code and measure activities, including computer use and verbal interactions; (3) automate the capture of nonverbal interactions; (4) aggregate multiple observations into a single navigable output; and (5) produce an output interpretable by software developers. We piloted this method by filming live consultations (n = 22) by 4 general practitioners (GPs) using different EPR systems. We compared the time taken and variations during coded data entry, prescribing, and blood pressure (BP) recording. We used nonparametric tests to make statistical comparisons. We contrasted methods of BP recording using Unified Modeling Language (UML) sequence diagrams. Results We found that 4 channels of video were optimal. We identified an existing application for manual coding of video output. We developed in-house tools for capturing use of keyboard and mouse and to time stamp speech. The transcript is then typed within this time stamp. Although we managed to capture body language using pattern recognition software, we were unable to use this data quantitatively. We loaded these observational outputs into our aggregation tool, which allows simultaneous navigation and viewing of multiple files. This also creates a single exportable file in XML format, which we used to develop UML sequence diagrams. In our pilot, the GP using the EMIS LV (Egton Medical Information Systems Limited, Leeds, UK) system took the longest time to code data (mean 11.5 s, 95% CI 8.7-14.2). Nonparametric comparison of EMIS LV with the other systems showed a significant difference, with EMIS PCS (Egton Medical Information Systems Limited, Leeds, UK) (P = .007), iSoft Synergy (iSOFT, Banbury, UK) (P = .014), and INPS Vision (INPS, London, UK) (P = .006) facilitating faster coding. In contrast, prescribing was fastest with EMIS LV (mean 23.7 s, 95% CI 20.5-26.8), but nonparametric comparison showed no statistically significant difference. UML sequence diagrams showed that the simplest BP recording interface was not the easiest to use, as users spent longer navigating or looking up previous blood pressures separately. Complex interfaces with free-text boxes left clinicians unsure of what to add. Conclusions The ALFA method allows the precise observation of the clinical consultation. It enables rigorous comparison of core elements of EPR systems. Pilot data suggests its capacity to demonstrate differences between systems. Its outputs could provide the evidence base for making more objective choices between systems. PMID:18812313

  12. 78 FR 77545 - Self-Regulatory Organizations; Topaz Exchange LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-23

    .... The Top Quote Feed provides real-time aggregated volume of all quotes and orders at the top price... aggregated volume of all quotes and orders available at each of the top five price levels on the Exchange... information that can help subscribers make informed investment decisions, and operate in the same manner as...

  13. 17 CFR 230.504 - Exemption for limited offerings and sales of securities not exceeding $1,000,000.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... require the public filing and delivery to investors of a substantive disclosure document before sale, and... only to “accredited investors” as defined in § 230.501(a). (2) The aggregate offering price for an... § 230.504 fails to meet the limitation on the aggregate offering price, it does not affect the...

  14. Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  15. Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  16. Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  17. Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe Moore

    This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  18. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.
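
    A hedged Python sketch of this kind of log-file analysis (the log line format and feature names are assumptions, not the WebCIS log format): count how often each feature is used and which feature-to-feature transitions users make within their sessions.

        # Tally feature usage and feature-to-feature transitions from log lines.
        from collections import Counter

        def usage_stats(lines):
            """lines: 'timestamp<TAB>user<TAB>feature' records, time-ordered per user."""
            features, transitions, by_user = Counter(), Counter(), {}
            for line in lines:
                _, user, feature = line.rstrip("\n").split("\t")
                features[feature] += 1
                by_user.setdefault(user, []).append(feature)
            for path in by_user.values():
                transitions.update(zip(path, path[1:]))   # usage patterns
            return features, transitions

        feats, trans = usage_stats([
            "2004-01-02T09:00\tu1\tlab_results",
            "2004-01-02T09:01\tu1\tradiology_report",
            "2004-01-02T09:05\tu2\tlab_results",
        ])
        print(feats.most_common(1), trans.most_common(1))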

  19. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This ensures that a software component will not spin up until all the appropriate dependencies have been configured properly. I also assisted hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  20. Geohydrologic and water-quality characterization of a fractured-bedrock test hole in an area of Marcellus shale gas development, Bradford County, Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.

    2013-01-01

    Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.

  1. Autoplot: a Browser for Science Data on the Web

    NASA Astrophysics Data System (ADS)

    Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.

    2008-12-01

    Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, FITS, NetCDF, and OpenDAP can be plotted, along with many other sources such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via HTTP with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files. So the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also a way to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files or for general use in the Virtual Observatory environment.
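
    The client-side aggregation idea can be sketched in Python (the template syntax below is plain strftime, not Autoplot's actual URI template language): a file-name template containing date fields is expanded into the per-day resources covering the requested interval, which the client then reads and combines.

        # Expand a dated file-name template into one URL per day.
        from datetime import date, timedelta

        def expand_template(template, start, end):
            """Yield one URL per day from start to end, inclusive."""
            day = start
            while day <= end:
                yield day.strftime(template)
                day += timedelta(days=1)

        urls = list(expand_template(
            "http://example.org/data/omni_%Y%m%d.cdf", date(2008, 1, 1), date(2008, 1, 3)))
        print(urls)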

  2. The intersection of health and wealth: association between personal bankruptcy and myocardial infarction rates in Canada.

    PubMed

    Savu, Anamaria; Schopflocher, Donald; Scholnick, Barry; Kaul, Padma

    2016-01-13

    We examined the association between personal bankruptcy filing and acute myocardial infarction (AMI) rates in Canada. Between 2002 and 2009, aggregate and yearly bankruptcy and AMI rates were estimated for 1,155 forward sortation areas of Canada. Scatter plot and correlations were used to assess the association of the aggregate rates. Cross-lagged structural equation models were used to explore the longitudinal relationship between bankruptcy and AMI after adjustment for socio-economic factors. A cross-lagged structural equation model estimated that on average, an increase of 100 in bankruptcy filing count is associated with an increase of 1.5 (p = 0.02) in AMI count in the following year, and an increase of 100 in AMI count is associated with an increase of 7 (p < 0.01) in bankruptcy filing count. We found that regions with higher rates of AMI corresponded to those with higher levels of economic and financial stress, as indicated by personal bankruptcy rate, and vice-versa.

  3. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  4. Zebra: A striped network file system

    NASA Technical Reports Server (NTRS)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails, its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong to. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity updates.
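
    A toy Python sketch of the striping-with-parity idea (fragment size and server count are illustrative, and this is not Zebra's code): a client's byte stream is cut into stripe fragments, an XOR parity fragment is computed across them, and any single lost fragment can be rebuilt from the remaining fragments plus parity.

        # XOR parity across stripe fragments, with single-fragment reconstruction.
        from functools import reduce

        FRAG = 4          # bytes per stripe fragment (toy value)
        NSERVERS = 3      # data servers per stripe; one more server would hold parity

        def split_fragments(data: bytes):
            data = data.ljust(FRAG * NSERVERS, b"\0")        # pad the stream to a full stripe
            return [data[i * FRAG:(i + 1) * FRAG] for i in range(NSERVERS)]

        def xor_parity(fragments):
            return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*fragments))

        def reconstruct(surviving, parity):
            """Rebuild the single missing fragment from the survivors plus parity."""
            return xor_parity(surviving + [parity])

        frags = split_fragments(b"zebra write")
        parity = xor_parity(frags)
        print(reconstruct([frags[0], frags[2]], parity) == frags[1])   # True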

  5. Version 4.0 of code Java for 3D simulation of the CCA model

    NASA Astrophysics Data System (ADS)

    Fan, Linyu; Liao, Jianwei; Zuo, Junsen; Zhang, Kebo; Li, Chao; Xiong, Hailing

    2018-07-01

    This paper presents a new version of the Java code for the three-dimensional simulation of the Cluster-Cluster Aggregation (CCA) model, replacing the previous version. Many redundant traversals of the cluster list in the program are now avoided, so the simulation time is significantly reduced. To show the aggregation process in a more intuitive way, we label different clusters with different colors. In addition, a new function is added for writing the particle coordinates of aggregates to a file, to facilitate coupling our model with other models.

  6. Navigating Streams of Paper.

    ERIC Educational Resources Information Center

    Bennett-Abney, Cheryl

    2001-01-01

    Three organizational tools for counselors are described: three-ring binder for notes, forms, and schedules; daily log of time and activities; and a tickler file with tasks arranged by days of the week. (SK)

  7. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Park, S

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients having fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and the tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using the log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated with the patient's mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions, and it decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
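
    The reproducibility metric above is a Kullback-Leibler divergence between per-fraction motion histograms; the sketch below shows one plausible way to compute it from logged marker positions. The bin settings and the stand-in position arrays are assumptions for illustration, not the authors' code or data.

      # Hedged sketch: KL divergence between tumor-motion PDFs built from logged positions.
      import numpy as np

      def motion_pdf(positions_mm, bins):
          """Histogram logged positions into a normalized PDF over fixed bins."""
          counts, _ = np.histogram(positions_mm, bins=bins)
          pdf = counts.astype(float) + 1e-12      # avoid log(0) and division by zero
          return pdf / pdf.sum()

      def kl_divergence(p, q):
          """D_KL(p || q) for two discrete PDFs defined on the same bins."""
          return float(np.sum(p * np.log(p / q)))

      bins = np.linspace(-15.0, 15.0, 61)          # 0.5 mm bins in one direction (assumed)
      fraction1 = np.random.normal(0.0, 3.0, 5000) # stand-in for fraction-1 SI positions
      fraction_n = np.random.normal(0.5, 3.5, 5000)

      Rn = kl_divergence(motion_pdf(fraction1, bins), motion_pdf(fraction_n, bins))
      print(f"R_n = {Rn:.3f}")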

  8. Seasonal Terpene Variation in Needles of Pinus radiata (Pinales: Pinaceae) Trees Attacked by Tomicus piniperda (Coleoptera: Scolytinae) and the Effect of Limonene on Beetle Aggregation

    PubMed Central

    Romón, Pedro; Aparicio, Domitila; Palacios, Francisco; Iturrondobeitia, Juan Carlos; Hance, Thierry

    2017-01-01

    Abstract Concentrations of four monoterpenes were determined in needles of Pinus radiata (D.Don) (Pinales: Pinaceae) trees that were attacked or nonattacked by Tomicus piniperda (L.) (Coleoptera: Scolytinae). Compounds were identified and quantified by gas chromatography–mass spectrometry. The mean ambient temperature was obtained using climate-recording data loggers. The effect of limonene on field aggregation was also evaluated at three limonene release rates using Lindgren attractant-baited traps and trap logs. Attacked trees produced less α-pinene in March, July, and November than nonattacked trees, less β-pinene in July and November, and less limonene from May to November. Limonene reduced the attraction of T. piniperda to attractant-baited traps and trap logs. Results were linked to better responses to high temperatures, with respect to terpene contents, by the nonattacked trees after the spring attack. PMID:29117373

  9. Cyber Fundamental Exercises

    DTIC Science & Technology

    2013-03-01

    the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ... 2.6.2 /etc/passwd and /etc/shadow: The /etc/shadow file did not exist on early Linux distributions. Originally only root could access the ... /etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under ...

  10. S-wave refraction survey of alluvial aggregate

    USGS Publications Warehouse

    Ellefsen, Karl J.; Tuttle, Gary J.; Williams, Jackie M.; Lucius, Jeffrey E.

    2005-01-01

    An S-wave refraction survey was conducted in the Yampa River valley near Steamboat Springs, Colo., to determine how well this method could map alluvium, a major source of construction aggregate. At the field site, about 1 m of soil overlaid 8 m of alluvium that, in turn, overlaid sedimentary bedrock. The traveltimes of the direct and refracted S-waves were used to construct velocity cross sections whose various regions were directly related to the soil, alluvium, and bedrock. The cross sections were constrained to match geologic logs that were developed from drill-hole data. This constraint minimized the ambiguity in estimates of the thickness and the velocity of the alluvium, an ambiguity that is inherent to the S-wave refraction method. In the cross sections, the estimated S-wave velocity of the alluvium changed in the horizontal direction, and these changes were attributed to changes in composition of the alluvium. The estimated S-wave velocity of the alluvium was practically constant in the vertical direction, indicating that the fine layering observed in the geologic logs could not be detected. The S-wave refraction survey, in conjunction with independent information such as geologic logs, was found to be suitable for mapping the thickness of the alluvium.

  11. Big Bicycle Data Processing: from Personal Data to Urban Applications

    NASA Astrophysics Data System (ADS)

    Pettit, C. J.; Lieske, S. N.; Leao, S. Z.

    2016-06-01

    Understanding the flows of people moving through the built environment is a vital source of information for the planners and policy makers who shape our cities. Smart phone applications enable people to trace themselves through the city and these data can potentially be then aggregated and visualised to show hot spots and trajectories of macro urban movement. In this paper our aim is to develop procedures for cleaning, aggregating and visualising human movement data and translating this into policy relevant information. In conducting this research we explore using bicycle data collected from a smart phone application known as RiderLog. We focus on the RiderLog application initially in the context of Sydney, Australia and discuss the procedures and challenges in processing and cleaning this data before any analysis can be made. We then present some preliminary map results using the CartoDB online mapping platform where data are aggregated and visualised to show hot spots and trajectories of macro urban movement. We conclude the paper by highlighting some of the key challenges in working with such data and outline some next steps in processing the data and conducting higher volume and more extensive analysis.

  12. 47 CFR 22.359 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... + 10 log (P) dB. (b) Measurement procedure. Compliance with these rules is based on the use of... contract in their station files and disclose it to prospective assignees or transferees and, upon request...

  13. 7 CFR 274.5 - Record retention and forms security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... control logs, or similar controls from the point of initial receipt through the issuance and.... (2) For notices of change which initiate, update or terminate the master issuance file, the State...

  14. Network Basics.

    ERIC Educational Resources Information Center

    Tennant, Roy

    1992-01-01

    Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)

  15. [27-Hydroxycholesterol reverses estradiol-induced inhibition of platelet aggregation in postmenopausal women].

    PubMed

    Rocha, Gladys; Sierralta, Walter; Valladares, Luis

    2016-11-01

    The decline of estrogen levels increases cardiovascular risk in women. Platelets express estrogen receptors, and 17β-estradiol (E2) can produce a protective effect on thrombus formation. The hydroxylation of cholesterol generates several sterols, and 27-hydroxycholesterol (27HC) predominates in circulation. To evaluate the effect of 27HC as an endogenous antagonist of the anti-aggregating properties of E2, platelet function of postmenopausal women was evaluated ex vivo. Platelets pre-incubated with 27HC, in the presence or absence of E2, were stimulated with collagen. Aggregation was evaluated by turbidimetry using a Chrono-log aggregometer. Collagen-stimulated platelet aggregation was significantly inhibited by E2. The inhibitory effect of E2 on collagen-stimulated platelet aggregation was significantly reversed in the presence of 27HC. The suppressive effect of E2 on platelet aggregation is inhibited by 27HC, which could contribute to the increased cardiovascular risk in postmenopausal women.

  16. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    NASA Astrophysics Data System (ADS)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF, SODA, WOCE-Satellite, TMI, GPI and GSSTF products through the CAS. The APDRC is also running an EPIC server developed by PMEL/NOAA. EPIC is a web-based, data search and display system suited for in situ (station versus gridded) data. The process of locating and selecting individual station data from large collections (millions of profiles or time series, etc.) of in situ data is a major challenge. Serving in situ data on the Internet faces two problems: the irregularity of data formats; and the large quantity of data files. To solve the first problem, we have converted the in situ data into netCDF data format. The second problem was solved by using the EPIC server, which allows users to easily subset the files using a friendly graphical interface. Furthermore, we enhanced the capability of EPIC and configured OPeNDAP into EPIC to serve the numerous in situ data files and to export them to users through two different options: 1) an OPeNDAP pointer file of user-selected data files; and 2) a data package that includes meta-information (e.g., location, time, cruise no, etc.), a local pointer file, and the data files that the user selected. Option 1) is for those who do not want to download the selected data but want to use their own application software (such as GrADS, Matlab and Ferret) for access and analysis; option 2) is for users who want to store the data on their own system (e.g. laptops before going for a cruise) for subsequent analysis. 
Currently, WOCE CTD and bottle data, the WOCE current meter data, and some Argo float data are being served on the EPIC server.
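
    As an illustration of how a client can read an aggregated dataset through a single OPeNDAP URL, the sketch below uses the third-party xarray package against a placeholder URL; the URL, variable name, and coordinate names are hypothetical, and the actual APDRC endpoints and dataset layouts may differ.

      # Hedged sketch: subsetting an aggregated dataset over OPeNDAP with xarray.
      import xarray as xr

      # Placeholder URL -- substitute a real OPeNDAP endpoint for the aggregated product.
      url = "http://example.org/opendap/aggregated_sst"

      ds = xr.open_dataset(url)                      # opens lazily; only metadata is fetched
      subset = ds["sst"].sel(                        # variable name is an assumption
          time=slice("2000-01-01", "2000-12-31"),
          lat=slice(-10, 10),
          lon=slice(120, 180),
      )
      print(subset.mean(dim="time"))                 # data are downloaded only for this subset

    Because the aggregation is configured on the server, the client never needs to know that the product is stored as many separate files.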

  17. The Galley Parallel File System

    NASA Technical Reports Server (NTRS)

    Nieuwejaar, Nils; Kotz, David

    1996-01-01

    Most current multiprocessor file systems are designed to use multiple disks in parallel, using the high aggregate bandwidth to meet the growing I/O requirements of parallel scientific applications. Many multiprocessor file systems provide applications with a conventional Unix-like interface, allowing the application to access multiple disks transparently. This interface conceals the parallelism within the file system, increasing the ease of programmability, but making it difficult or impossible for sophisticated programmers and libraries to use knowledge about their I/O needs to exploit that parallelism. In addition to providing an insufficient interface, most current multiprocessor file systems are optimized for a different workload than they are being asked to support. We introduce Galley, a new parallel file system that is intended to efficiently support realistic scientific multiprocessor workloads. We discuss Galley's file structure and application interface, as well as the performance advantages offered by that interface.

  18. VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.

    2014-02-01

    The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y] = [log ε(X) - log ε(Y)]_star - [log ε(X) - log ε(Y)]_⊙ with log ε(X) = 12 + log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H] = log N(Li)_star - log N(Li)_⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li for which we adopt our derived values: log ε(Li)_⊙ = 1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).

  19. Aggregation of Adenovirus 2 in Source Water and Impacts on Disinfection by Chlorine

    PubMed Central

    Cromeans, Theresa L.; Metcalfe, Maureen G.; Humphrey, Charles D.; Hill, Vincent R.

    2016-01-01

    It is generally accepted that viral particles in source water are likely to be found as aggregates attached to other particles. For this reason, it is important to investigate the disinfection efficacy of chlorine on aggregated viruses. A method to produce adenovirus particle aggregation was developed for this study. Negative stain electron microscopy was used to measure aggregation before and after addition of virus particles to surface water at different pH and specific conductance levels. The impact of aggregation on the efficacy of chlorine disinfection was also examined. Disinfection experiments with human adenovirus 2 (HAdV2) in source water were conducted using 0.2 mg/L free chlorine at 5 °C. Aggregation of HAdV2 in source water (≥3 aggregated particles) remained higher at higher specific conductance and pH levels. However, aggregation was highly variable, with the percentage of particles present in aggregates ranging from 43 to 71 %. Upon addition into source water, the aggregation percentage dropped dramatically. On average, chlorination CT values (chlorine concentration in mg/L × time in min) for 3-log10 inactivation of aggregated HAdV2 were up to three times higher than those for dispersed HAdV2, indicating that aggregation reduced the disinfection rate. This information can be used by water utilities and regulators to guide decision making regarding disinfection of viruses in water. PMID:26910058

  20. Aggregation of Adenovirus 2 in Source Water and Impacts on Disinfection by Chlorine.

    PubMed

    Kahler, Amy M; Cromeans, Theresa L; Metcalfe, Maureen G; Humphrey, Charles D; Hill, Vincent R

    2016-06-01

    It is generally accepted that viral particles in source water are likely to be found as aggregates attached to other particles. For this reason, it is important to investigate the disinfection efficacy of chlorine on aggregated viruses. A method to produce adenovirus particle aggregation was developed for this study. Negative stain electron microscopy was used to measure aggregation before and after addition of virus particles to surface water at different pH and specific conductance levels. The impact of aggregation on the efficacy of chlorine disinfection was also examined. Disinfection experiments with human adenovirus 2 (HAdV2) in source water were conducted using 0.2 mg/L free chlorine at 5 °C. Aggregation of HAdV2 in source water (≥3 aggregated particles) remained higher at higher specific conductance and pH levels. However, aggregation was highly variable, with the percentage of particles present in aggregates ranging from 43 to 71 %. Upon addition into source water, the aggregation percentage dropped dramatically. On average, chlorination CT values (chlorine concentration in mg/L × time in min) for 3-log10 inactivation of aggregated HAdV2 were up to three times higher than those for dispersed HAdV2, indicating that aggregation reduced the disinfection rate. This information can be used by water utilities and regulators to guide decision making regarding disinfection of viruses in water.

  1. SU-F-T-230: A Simple Method to Assess Accuracy of Dynamic Wave Arc Irradiation Using An Electronic Portal Imaging Device and Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirashima, H; Miyabe, Y; Yokota, K

    2016-06-15

    Purpose: The Dynamic Wave Arc (DWA) technique, in which the multi-leaf collimator (MLC) and gantry/ring move simultaneously along a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position from EPID images, the MLC position with intentional errors was assessed. Tests were designed considering three factors: (1) accuracy of the MLC position; (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s) and ring speed (0.5–2.5°/s); and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of the machine accuracy, the MLC position and gantry/ring angle were assessed using the log files. Results: A 0.1 mm intentional error could be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs under different conditions of dose rate, gantry/ring speed and MLC speed showed good agreement, with a root mean square (RMS) error of 0.76%. The RMS errors between the detected and recorded data were 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs under variable conditions during DWA irradiation can be easily verified using EPID measurements and log file analysis. The proposed method is useful for routine verification. This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.
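
    A simple way to reproduce the kind of log-file check described above is to compute RMS differences between planned and recorded axis values at each control point. The sketch below is generic, with invented arrays rather than the Vero4DRT log format.

      # Hedged sketch: RMS agreement between planned and logged axis values per control point.
      import numpy as np

      def rms_error(planned, recorded):
          planned, recorded = np.asarray(planned, float), np.asarray(recorded, float)
          return float(np.sqrt(np.mean((planned - recorded) ** 2)))

      # Stand-in data: one MLC leaf (mm) and the gantry angle (deg) over control points.
      planned_leaf    = np.array([10.0, 12.5, 15.0, 17.5, 20.0])
      recorded_leaf   = np.array([10.1, 12.4, 15.1, 17.4, 20.1])
      planned_gantry  = np.array([180.0, 190.0, 200.0, 210.0, 220.0])
      recorded_gantry = np.array([180.1, 190.1, 200.2, 209.9, 220.1])

      print("MLC RMS (mm):    ", rms_error(planned_leaf, recorded_leaf))
      print("Gantry RMS (deg):", rms_error(planned_gantry, recorded_gantry))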

  2. VizieR Online Data Catalog: NGC 2264, NGC 2547 and NGC 2516 stellar radii (Jackson+, 2016)

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Randich, S.; Bragaglia, A.; Carraro, G.; Costado, M. T.; Flaccomio, E.; Lanzafame; Lardo, C.; Monaco, L.; Morbidelli, L.; Smiljanic, R.; Zaggia, S.

    2015-11-01

    File Table1.dat contains photometric and spectroscopic data of GES Survey targets in the clusters NGC 2547, NGC 2516 and NGC 2264, downloaded from the Edinburgh GES archive (http://ges.roe.ac.uk/). Photometric data comprise the (Cousins) I magnitude and 2MASS J, H and K magnitudes. Spectroscopic data comprise the signal-to-noise ratio, S/N, of the target spectrum, the radial velocity, RV (in km/s), the projected equatorial velocity, vsini (in km/s), the number of separate observations co-added to produce the target spectrum, and the log of the effective temperature (logTeff) of the template spectrum fitted to measure RV and vsini. The absolute precision in RV, pRV (in km/s), and the relative precision in vsini (pvsini) were estimated, as a function of logTeff, vsini and S/N, using the prescription described in Jackson et al. (2015A&A...580A..75J, Cat. J/A+A/580/A75). File Table3.dat contains measured and calculated properties of cluster targets with resolved vsini and a reported rotation period. The cluster name, right ascension, RA (deg), and declination, Dec (deg), are given for targets with measured periods reported in the literature. Dynamic properties comprise: the radial velocity, RV (in km/s), the absolute precision in RV, pRV (km/s), the projected equatorial velocity, vsini (in km/s), the relative precision in vsini (pvsini) and the rotational period (in days). Also shown are values of the absolute K magnitude, MK, the log of luminosity, log L (in solar units), and the probability of cluster membership estimated using cluster data given in the text. Estimated values of the projected radius, Rsini (in Rsolar), and the uncertainty in the projected radius, e_Rsini (in Rsolar), are given for targets where vsini>5km/s and pvsini>0.2. The final column shows a flag which is set to 1 for targets in cluster NGC 2264 where a (H-K) versus (J-H) colour-colour plot indicates possible infra-red excess. Reported rotation periods are taken from the literature (2 data files).

  3. Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III

    2017-12-01

    Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on a single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may result in a less than optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. As users interact with search engines, sufficient information is already hidden in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process and create training data from web logs in a real-time manner; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests in geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
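
    For reference, normalized discounted cumulative gain at rank K (one of the evaluation metrics named above) can be computed as in the short sketch below; the graded relevance labels are invented for illustration.

      # Standard NDCG@k on a list of graded relevance labels in ranked order.
      import math

      def dcg_at_k(relevances, k):
          return sum((2 ** rel - 1) / math.log2(i + 2)
                     for i, rel in enumerate(relevances[:k]))

      def ndcg_at_k(relevances, k):
          ideal = dcg_at_k(sorted(relevances, reverse=True), k)
          return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

      ranked_relevances = [3, 2, 3, 0, 1, 2]   # hypothetical labels for one query's results
      print(round(ndcg_at_k(ranked_relevances, k=5), 4))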

  4. 43 CFR 2743.3 - Leased disposal sites.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... review of all records and inspection reports on file with the Bureau of Land Management, State, and local... landfill concerning site management and a review of all reports and logs pertaining to the type and amount...

  5. 25 CFR 214.13 - Diligence; annual expenditures; mining records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... within 90 days after an ore body of sufficient quantity is discovered, and shown by the logs or records.... Lessee shall, before commencing operations, file with the superintendent a plat and preliminary statement...

  6. 47 CFR 22.861 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... below the transmitting power (P) by a factor of at least 43 + 10 log (P) dB. (b) Measurement procedure... maintain a copy of the contract in their station files and disclose it to prospective assignees or...

  7. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software that accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, displaying well log curves, etc.), but can also be run in different operating systems and on different devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work, the resulting software product interface is described.
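
    As a minimal sketch of the kind of input step such a viewer needs (not the authors' code), the example below reads a .las file with the third-party lasio package and plots one curve against depth; the file name and the curve mnemonic are assumptions.

      # Hedged sketch: load a LAS well-log file and plot a single curve versus depth.
      import lasio
      import matplotlib.pyplot as plt

      las = lasio.read("example_well.las")     # hypothetical file name
      print([curve.mnemonic for curve in las.curves])

      depth = las.index                        # depth (or time) reference track
      gr = las["GR"]                           # gamma-ray curve, assuming the mnemonic exists

      plt.plot(gr, depth)
      plt.gca().invert_yaxis()                 # depth increases downwards on a well log
      plt.xlabel("GR (API)")
      plt.ylabel("Depth")
      plt.show()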

  8. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load in multiple SRS .ana files and extract pressure vs. time data; (3) C++ Contamination Simulation code: a 3D particle tracing code for modeling transport of dust particulates and molecules. It uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  9. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn (ALICE Environment, http://alien.cern.ch) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets and a number of collaborating Web services which implement authentication, job execution, file transport, performance monitoring and event logging. In the paper we will present the architecture and components of the system.

  10. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be de-compressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
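
    A rough sketch of the idea (compress chunks in parallel on the client side, then append them to a log-style shared object together with an offset/length index) is given below; it is a simplification built from standard-library tools, not the patented implementation.

      # Hedged sketch: parallel chunk compression appended to a log-style shared object.
      import zlib
      from multiprocessing import Pool

      def write_shared_object(chunks, path="shared_object.log"):
          """Compress chunks in parallel, append them, and return (offset, length) metadata."""
          with Pool() as pool:
              compressed = pool.map(zlib.compress, chunks)
          index, offset = [], 0
          with open(path, "wb") as out:
              for blob in compressed:
                  out.write(blob)
                  index.append((offset, len(blob)))
                  offset += len(blob)
          return index

      def read_chunk(index, rank, path="shared_object.log"):
          offset, length = index[rank]
          with open(path, "rb") as f:
              f.seek(offset)
              return zlib.decompress(f.read(length))

      if __name__ == "__main__":
          chunks = [bytes([i]) * 1_000_000 for i in range(4)]   # stand-ins for per-process data
          index = write_shared_object(chunks)
          assert read_chunk(index, 2) == chunks[2]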

  11. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    For logging of board angle values during balance training, it is necessary to develop a measurement system. This study provides data for a balance study using a smartcard; data acquisition is automatic. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard read programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved as a log file for exact documentation. This system makes study development easy and time-saving.

  12. VizieR Online Data Catalog: GOALS sample PACS and SPIRE fluxes (Chu+, 2017)

    NASA Astrophysics Data System (ADS)

    Chu, J. K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Diaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.

    2017-06-01

    The IRAS RBGS contains 179 LIRGs (11.0 <= log(LIR/L⊙) < 12.0) and 22 ultra-luminous infrared galaxies (ULIRGs: log(LIR/L⊙) >= 12.0); these 201 total objects comprise the GOALS sample (Armus et al. 2009), a statistically complete flux-limited sample of infrared-luminous galaxies in the local universe. This paper presents imaging and photometry for all 201 LIRGs and LIRG systems in the IRAS RBGS that were observed during our GOALS Herschel OT1 program. (4 data files).

  13. The key image and case log application: new radiology software for teaching file creation and case logging that incorporates elements of a social network.

    PubMed

    Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David

    2014-07-01

    The aim was to create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communications systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface have been inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). One hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm, with efforts made to maximize its ease of use and inclusion of characteristics inspired by social networking Web sites that allow the system additional functionality such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  14. High-Performance, Multi-Node File Copies and Checksums for Clustered File Systems

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.; Ciotti, Robert B.

    2012-01-01

    Modern parallel file systems achieve high performance using a variety of techniques, such as striping files across multiple disks to increase aggregate I/O bandwidth and spreading disks across multiple servers to increase aggregate interconnect bandwidth. To achieve peak performance from such systems, it is typically necessary to utilize multiple concurrent readers/writers from multiple systems to overcome various single-system limitations, such as number of processors and network bandwidth. The standard cp and md5sum tools of GNU coreutils found on every modern Unix/Linux system, however, utilize a single execution thread on a single CPU core of a single system, and hence cannot take full advantage of the increased performance of clustered file systems. Mcp and msum are drop-in replacements for the standard cp and md5sum programs that utilize multiple types of parallelism and other optimizations to achieve maximum copy and checksum performance on clustered file systems. Multi-threading is used to ensure that nodes are kept as busy as possible. Read/write parallelism allows individual operations of a single copy to be overlapped using asynchronous I/O. Multi-node cooperation allows different nodes to take part in the same copy/checksum. Split-file processing allows multiple threads to operate concurrently on the same file. Finally, hash trees allow inherently serial checksums to be performed in parallel. Mcp and msum provide significant performance improvements over standard cp and md5sum using multiple types of parallelism and other optimizations. The total speed-ups from all improvements are significant. Mcp improves cp performance over 27x, msum improves md5sum performance almost 19x, and the combination of mcp and msum improves verified copies via cp and md5sum by almost 22x. These improvements come in the form of drop-in replacements for cp and md5sum, so are easily used and are available for download as open source software at http://mutil.sourceforge.net.

  15. 78 FR 38743 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    ... effective, and timely benchmarking and other relevant information mechanism, than other similar aggregating... about their own businesses and business relationships, and benchmarking information about the overall...

  16. BenMAP Downloads

    EPA Pesticide Factsheets

    Download the current and legacy versions of the BenMAP program. Download configuration and aggregation/pooling/valuation files to estimate benefits. BenMAP-CE is free and open source software, and the source code is available upon request.

  17. Log-linear human chorionic gonadotropin elimination in cases of retained placenta percreta.

    PubMed

    Stitely, Michael L; Gerard Jackson, M; Holls, William H

    2014-02-01

    To describe the human chorionic gonadotropin (hCG) elimination rate in patients with intentionally retained placenta percreta, medical records for cases of placenta percreta with intentional retention of the placenta were reviewed. The natural log of the hCG levels was plotted versus time, and the elimination rate equations were derived. The hCG elimination rate equations were log-linear in the three cases individually (R² = 0.96-0.99) and in aggregate (R² = 0.92). The mean half-life of hCG elimination was 146.3 h (6.1 days). The elimination of hCG in patients with intentionally retained placenta percreta is consistent with a two-compartment elimination model. The hCG elimination in retained placenta percreta is predictable in a log-linear manner that is similar to other reports of retained abnormally adherent placentae treated with or without methotrexate.

  18. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
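
    The first step described above, building an access correlation matrix from historical logs, can be sketched roughly as follows; the log format (one list of tile IDs per user session) and the co-access counting rule are assumptions for illustration, not the authors' algorithm.

      # Hedged sketch: co-access correlation matrix for image tiles from session logs.
      import numpy as np

      sessions = [                      # hypothetical access log, one tile-ID list per session
          ["t1", "t2", "t3"],
          ["t2", "t3"],
          ["t1", "t3", "t4"],
      ]

      tiles = sorted({t for s in sessions for t in s})
      idx = {t: i for i, t in enumerate(tiles)}
      corr = np.zeros((len(tiles), len(tiles)), dtype=int)

      for session in sessions:
          for a in session:
              for b in session:
                  if a != b:
                      corr[idx[a], idx[b]] += 1   # tiles requested together in one session

      print(tiles)
      print(corr)   # a heuristic placement step would spread strongly correlated tiles
                    # across different storage nodes to maximize parallel access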

  19. SU-E-T-144: Effective Analysis of VMAT QA Generated Trajectory Log Files for Medical Accelerator Predictive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator-generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axis positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross-correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on TG-142 and published analyses of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2 mm for the collimator jaw and MLC carriage exceeded the chart limits. Gantry speed and each MLC speed are analyzed at two different points in the delivery. A simulated gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. A gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.
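
    For readers unfamiliar with I/MR charts, the sketch below computes the usual individuals and moving-range control limits (2.66 and 3.267 are the standard X-mR chart factors); the sample data are invented, and the hybrid specification-based limits mentioned in the abstract are not reproduced here.

      # Hedged sketch: classical Individuals / Moving Range (I/MR) chart limits.
      import numpy as np

      def imr_limits(values):
          x = np.asarray(values, dtype=float)
          mr = np.abs(np.diff(x))                  # moving ranges between consecutive samples
          x_bar, mr_bar = x.mean(), mr.mean()
          return {
              "I_UCL": x_bar + 2.66 * mr_bar,      # individuals chart limits
              "I_LCL": x_bar - 2.66 * mr_bar,
              "MR_UCL": 3.267 * mr_bar,            # moving-range chart upper limit (LCL = 0)
          }

      leaf_position_mm = [10.02, 10.01, 10.03, 9.99, 10.00, 10.02, 10.01]  # stand-in samples
      print(imr_limits(leaf_position_mm))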

  20. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  1. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  2. 47 CFR 22.917 - Emission limitations for cellular equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... frequency ranges must be attenuated below the transmitting power (P) by a factor of at least 43 + 10 log(P... such contract shall maintain a copy of the contract in their station files and disclose it to...

  3. Seasonal Terpene Variation in Needles of Pinus radiata (Pinales: Pinaceae) Trees Attacked by Tomicus piniperda (Coleoptera: Scolytinae) and the Effect of Limonene on Beetle Aggregation.

    PubMed

    Romón, Pedro; Aparicio, Domitila; Palacios, Francisco; Iturrondobeitia, Juan Carlos; Hance, Thierry; Goldarazena, Arturo

    2017-09-01

    Concentrations of four monoterpenes were determined in needles of Pinus radiata (D.Don) (Pinales: Pinaceae) trees that were attacked or nonattacked by Tomicus piniperda (L.) (Coleoptera: Scolytinae). Compounds were identified and quantified by gas chromatography-mass spectrometry. The mean ambient temperature was obtained using climate-recording data loggers. The effect of limonene on field aggregation was also evaluated at three limonene release rates using Lindgren attractant-baited traps and trap logs. Attacked trees produced less α-pinene in March, July, and November than nonattacked trees, less β-pinene in July and November, and less limonene from May to November. Limonene reduced the attraction of T. piniperda to attractant-baited traps and trap logs. Results were linked to better responses to high temperatures, with respect to terpene contents, by the nonattacked trees after the spring attack. © The Author 2017. Published by Oxford University Press on behalf of Entomological Society of America.

  4. Modelling the structure of sludge aggregates

    PubMed Central

    Smoczyński, Lech; Ratnaweera, Harsha; Kosobucka, Marta; Smoczyński, Michał; Kalinowski, Sławomir; Kvaal, Knut

    2016-01-01

    ABSTRACT The structure of sludge is closely associated with the process of wastewater treatment. Synthetic dyestuff wastewater and sewage were coagulated using the PAX and PIX methods, and electro-coagulated on aluminium electrodes. The processes of wastewater treatment were supported with an organic polymer. Images of the surface structures of the investigated sludge were obtained using scanning electron microscopy (SEM). Software image analysis permitted obtaining plots of log A vs. log P, where A is the surface area and P is the perimeter of the object, for individual objects comprised in the structure of the sludge. The resulting database confirmed the ‘self-similarity’ of the structural objects in the studied groups of sludge, which enabled calculating their fractal dimension and proposing models for these objects. A quantitative description of the sludge aggregates permitted proposing a mechanism of the processes responsible for their formation. The paper also discusses the impact of the structure of the investigated sludge on the process of sedimentation, and on dehydration of the thickened sludge after sedimentation. PMID:26549812
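
    One common way to turn the log A versus log P data described above into a fractal dimension is the perimeter-area relation P ∝ A^(D/2), so that D is twice the slope of log P regressed on log A; the sketch below fits that slope on invented object measurements and is not the authors' analysis.

      # Hedged sketch: perimeter-area fractal dimension from object measurements.
      import numpy as np

      # Invented areas (A) and perimeters (P) of objects segmented from an SEM image.
      A = np.array([120.0, 340.0, 560.0, 910.0, 1500.0, 2600.0])
      P = np.array([55.0, 110.0, 150.0, 210.0, 300.0, 430.0])

      slope, intercept = np.polyfit(np.log(A), np.log(P), 1)  # log P = slope*log A + c
      D = 2.0 * slope                                          # P ~ A**(D/2)  =>  D = 2*slope
      print(f"estimated perimeter-area fractal dimension D = {D:.2f}")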

  5. Trucks involved in fatal accidents factbook 2007.

    DOT National Transportation Integrated Search

    2010-01-01

    This document presents aggregate statistics on trucks involved in traffic accidents in 2007. The : statistics are derived from the Trucks Involved in Fatal Accidents (TIFA) file, compiled by the : University of Michigan Transportation Research Instit...

  6. Buses involved in fatal accidents factbook 2007

    DOT National Transportation Integrated Search

    2010-03-01

    This document presents aggregate statistics on buses involved in traffic accidents in 2007. The : statistics are derived from the Buses Involved in Fatal Accidents (BIFA) file, compiled by the : University of Michigan Transportation Research Institut...

  7. Trucks involved in fatal accidents factbook 2008.

    DOT National Transportation Integrated Search

    2011-03-01

    This document presents aggregate statistics on trucks involved in traffic accidents in 2008. The : statistics are derived from the Trucks Involved in Fatal Accidents (TIFA) file, compiled by the : University of Michigan Transportation Research Instit...

  8. 7 CFR 984.437 - Methods for proposing names of additional candidates to be included on walnut growers' nomination...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....35(a)(5) and (b)(6), any ten or more such growers who marketed an aggregate of 500 or more tons of... specified in § 984.35(a)(3) and (4) or § 984.35(b)(4) and (5) and who marketed an aggregate of 500 or more... section shall be on forms supplied by the Board and filed no later than April 1 of the nomination year...

  9. Paleomagnetic dating: Methods, MATLAB software, example

    NASA Astrophysics Data System (ADS)

    Hnatyshin, Danny; Kravchinsky, Vadim A.

    2014-09-01

    A MATLAB software tool has been developed to provide an easy-to-use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user-defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an Excel file. The data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding and metamorphism, and is in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed the timing of large-scale thermal or chemical events to be determined. Paleomagnetic analysis from gold-mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The obtained paleomagnetic direction from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major and semi-minor axes of the 95% confidence ellipse being 16.6° and 15.9°, respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 (+32.0/-31.0) Ma, the youngest well-defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and that it likely had a role in determining the present-day state of mineralization seen at the deposit.

  10. 75 FR 76426 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ..., access control lists, file system permissions, intrusion detection and prevention systems and log..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN...

  11. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    DOE PAGES

    Chan, Anthony; Gropp, William; Lusk, Ewing

    2008-01-01

    A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
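
    The key property claimed above, locating an arbitrary time window without scanning the whole trace, can be imitated at a small scale with a sorted timestamp index and binary search, as in the toy sketch below; this stand-in does not reproduce the hierarchical format described in the paper (which also handles states that span window boundaries).

      # Toy sketch: time-window lookup via binary search on a sorted timestamp index.
      import bisect

      # Hypothetical trace: (timestamp, event) pairs already sorted by timestamp.
      trace = [(0.10, "send"), (0.25, "recv"), (0.40, "compute"),
               (0.55, "send"), (0.70, "recv"), (0.95, "barrier")]
      timestamps = [t for t, _ in trace]

      def window(trace, timestamps, t0, t1):
          """Return all events with t0 <= timestamp <= t1 using binary search on the index."""
          lo = bisect.bisect_left(timestamps, t0)
          hi = bisect.bisect_right(timestamps, t1)
          return trace[lo:hi]

      print(window(trace, timestamps, 0.2, 0.6))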

  12. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format inspired in part by the W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as other scientific applications that log provenance-related information.

  13. Characterization of an Aggregation Pheromone in Hylesinus pruinosus (Coleoptera: Curculionidae: Scolytinae)

    Treesearch

    William Shepherd; Brian Sullivan; Bradley Hoosier; JoAnne Barrett; Tessa Bauman

    2010-01-01

    We conducted laboratory and field bioassays to characterize the pheromone system of an ash bark beetle, Hylesinus pruinosus Eichhoff (Coleoptera: Curculionidae: Scolytinae). Solitary females in newly initiated galleries in ash logs produced (+)-exo-brevicomin, whereas male beetles paired with females produced (+)-endo-brevicomin, lesser quantities of...

  14. 75 FR 4728 - Occupational Injury and Illness Recording and Reporting Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ... and aggregate results available for both research and for public information. BLS only publishes... stores, and shipyards, the information from that column would have provided baseline and post... absence of the column, a person interested in MSD incidence must study every entry on the log to determine...

  15. 77 FR 74006 - Polychlorinated Biphenyls (PCBs); Recycling Plastics From Shredder Residue

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-12

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an... from metals recycling facilities (referred to by ISRI as automobile shredder residue (ASR) aggregate...

  16. Stabilization techniques for reactive aggregate in soil-cement base course.

    DOT National Transportation Integrated Search

    2003-01-01

    Anhydrite (CaSO4) beds occur as a cap rock on a salt dome in Winn Parish in north Louisiana. Locally known as Winn Rock, it has been quarried for gravel for road building. It has been used as a surface course for local parish and logging roads. Stabi...

  17. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  18. Gamma-index method sensitivity for gauging plan delivery accuracy of volumetric modulated arc therapy.

    PubMed

    Park, Jong In; Park, Jong Min; Kim, Jung-In; Park, So-Yeon; Ye, Sung-Joon

    2015-12-01

    The aim of this study was to investigate the sensitivity of the gamma-index method according to various gamma criteria for volumetric modulated arc therapy (VMAT). Twenty head and neck (HN) and twenty prostate VMAT plans were retrospectively selected for this study. Both global and local 2D gamma evaluations were performed with criteria of 3%/3 mm, 2%/2 mm, 1%/2 mm and 2%/1 mm. In this study, the global and local gamma-index calculated the dose differences relative to the maximum dose and to the dose at the current measurement point, respectively. Using log files acquired during delivery, the differences in parameters at every control point between the VMAT plans and the log files were calculated. The differences in dose-volumetric parameters between VMAT plans reconstructed using the log files and the original VMAT plans were also calculated. Spearman's rank correlation coefficients (rs) were calculated between the passing rates and those differences. Considerable correlations with statistical significance were observed between the global 1%/2 mm, local 1%/2 mm and local 2%/1 mm passing rates and the MLC position differences (rs = -0.712, -0.628 and -0.581). The numbers of rs values with statistical significance between the passing rates and the changes in dose-volumetric parameters were largest for the global 2%/2 mm (n = 16), global 2%/1 mm (n = 15) and local 2%/1 mm (n = 13) criteria. The local gamma-index method with 2%/1 mm generally showed higher sensitivity for detecting deviations between a VMAT plan and its delivery. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
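
    For context, the gamma index combines a dose-difference tolerance and a distance-to-agreement tolerance into a single pass/fail value; the sketch below implements a simplified 1D global gamma evaluation (uniformly spaced points, global normalization to the reference maximum) and is not the software used in the study.

      # Hedged sketch: simplified 1D global gamma index (e.g., 3%/3 mm) on uniform grids.
      import numpy as np

      def gamma_1d(ref_dose, eval_dose, spacing_mm, dd_percent=3.0, dta_mm=3.0):
          ref, ev = np.asarray(ref_dose, float), np.asarray(eval_dose, float)
          x = np.arange(len(ref)) * spacing_mm
          dd = dd_percent / 100.0 * ref.max()          # global dose-difference criterion
          gammas = np.empty(len(ev))
          for i in range(len(ev)):
              dist = (x - x[i]) / dta_mm               # normalized spatial term
              dose = (ev[i] - ref) / dd                # normalized dose term
              gammas[i] = np.sqrt(dist ** 2 + dose ** 2).min()
          return gammas

      ref_dose  = np.array([1.00, 1.02, 1.05, 1.03, 0.98, 0.95])
      eval_dose = np.array([1.01, 1.03, 1.04, 1.05, 0.97, 0.96])
      g = gamma_1d(ref_dose, eval_dose, spacing_mm=1.0)
      print("pass rate:", np.mean(g <= 1.0))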

  19. The Scalable Checkpoint/Restart Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moody, A.

    The Scalable Checkpoint/Restart (SCR) library provides an interface that codes may use to write out and read in application-level checkpoints in a scalable fashion. In the current implementation, checkpoint files are cached in local storage (hard disk or RAM disk) on the compute nodes. This technique provides scalable aggregate bandwidth and uses storage resources that are fully dedicated to the job. This approach addresses the two common drawbacks of checkpointing a large-scale application to a shared parallel file system, namely, limited bandwidth and file system contention. In fact, on current platforms, SCR scales linearly with the number of compute nodes. It has been benchmarked as high as 720 GB/s on 1,094 nodes of Atlas, which is nearly two orders of magnitude faster than the parallel file system.
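
    The SCR API itself is a C/MPI interface; the following Python sketch only illustrates the general idea the record describes — cache checkpoints in node-local storage and flush to the shared parallel file system only occasionally — and is not the SCR API. The paths, flush interval, and pickle format are assumptions for illustration.

```python
# Conceptual sketch of node-local checkpoint caching with periodic flushes to a
# shared parallel file system. This is NOT the SCR API; paths and the flush
# interval are made-up example values.
import os
import pickle
import shutil

LOCAL_CACHE = "/tmp/ckpt_cache"          # assumed node-local storage (RAM disk / local SSD)
SHARED_DIR = "/lustre/project/ckpt"      # assumed shared parallel file system
FLUSH_EVERY = 10                         # flush every 10th checkpoint to shared storage

def write_checkpoint(step, state):
    """Write the checkpoint to fast local storage; flush occasionally."""
    os.makedirs(LOCAL_CACHE, exist_ok=True)
    local_path = os.path.join(LOCAL_CACHE, f"ckpt_{step:06d}.pkl")
    with open(local_path, "wb") as fh:
        pickle.dump(state, fh)
    if step % FLUSH_EVERY == 0:          # only a fraction of checkpoints hit the PFS
        os.makedirs(SHARED_DIR, exist_ok=True)
        shutil.copy2(local_path, SHARED_DIR)
    return local_path

def read_latest_checkpoint():
    """Prefer the local cache; fall back to the shared file system."""
    for directory in (LOCAL_CACHE, SHARED_DIR):
        if os.path.isdir(directory):
            ckpts = sorted(f for f in os.listdir(directory) if f.startswith("ckpt_"))
            if ckpts:
                with open(os.path.join(directory, ckpts[-1]), "rb") as fh:
                    return pickle.load(fh)
    return None
```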

  20. Neuropsychological constraints to human data production on a global scale

    NASA Astrophysics Data System (ADS)

    Gros, C.; Kaczor, G.; Marković, D.

    2012-01-01

    What are the factors underlying human information production on a global level? In order to gain insight into this question we study a corpus of 252-633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284-675 Terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining qualia.
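
    As a small worked example of the distribution comparison described above, the following sketch fits a log-normal model to a set of file sizes and a maximum-likelihood power-law exponent to the upper tail. The synthetic sizes and the chosen tail cutoff are placeholders, not the study's corpus.

```python
# Minimal sketch: compare a log-normal and a power-law (Pareto) fit to a set of
# file sizes. The synthetic sizes below stand in for real file-size data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sizes = rng.lognormal(mean=12.0, sigma=2.0, size=50_000)   # toy "file sizes" in bytes

# Log-normal fit (fix loc=0 so the fit is on the sizes themselves).
shape, loc, scale = stats.lognorm.fit(sizes, floc=0)

# Power-law tail: maximum-likelihood Pareto exponent above a chosen x_min,
# alpha = 1 + n / sum(ln(x / x_min)).
x_min = np.quantile(sizes, 0.90)
tail = sizes[sizes >= x_min]
alpha = 1.0 + len(tail) / np.sum(np.log(tail / x_min))

# Rough comparison of the two models on the tail via log-likelihoods.
ll_lognorm = stats.lognorm.logpdf(tail, shape, loc, scale).sum()
ll_pareto = stats.pareto.logpdf(tail, b=alpha - 1, scale=x_min).sum()
print(f"log-normal sigma = {shape:.2f}, Pareto alpha = {alpha:.2f}")
print(f"tail log-likelihood: log-normal {ll_lognorm:.1f} vs power law {ll_pareto:.1f}")
```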

  1. 77 FR 47453 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-08

    ... acronym or MPID may aggregate their trading activity for purposes of these rates. Qualification for these rates will require that a market participant appropriately indicate his trading acronym and/or MPID in...

  2. 76 FR 2742 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ... relationship. Direct liquidity provision is beneficial to NASDAQ and to the marketplace generally. Direct... liquidity provision rather than compensating the effort required to aggregate order flow. To encourage the... [[Page 2743

  3. SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousuf, A; Hussain, A

    2014-06-01

    Purpose: A robust quality assurance (QA) program for computer-controlled enhanced dynamic wedge (EDW) has been designed and tested. Calculations for this QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record-and-verify system generates dynamic log (dynalog) files during dynamic dose delivery. These system-generated dynalog files contain information such as date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected calculated segmented treatment tables (STT) and the actual delivered STT for the treatment delivery as a verification record. These files can be used to assess the integrity and precision of the treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment; the STT can also be used to calculate the EDW factors. Independent verification of fractional MUs per segment was performed against those generated from dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against practical measurements. The measurements agreed with the calculated EDW data to within about 1%. The variation between the MUs per segment obtained from dynalog files and those calculated manually was less than 2%. Conclusion: An efficient and easy tool for performing a routine EDW QA procedure is suggested. The method can be easily implemented in any institution without the need for expensive QA equipment. Errors of the order of 2% or greater can be easily detected.
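
    A minimal sketch of the per-segment check described in this record appears below: it converts a cumulative STT column into monitor units per segment for both the planned and the delivered tables and flags deviations above a 2% tolerance. The STT values, total MU, and simplified dynalog representation are illustrative assumptions, not Varian's published tables or file format.

```python
# Per-segment MU comparison between a planned STT and values recovered from a
# (simplified) dynalog record. All numbers are illustrative placeholders.
import numpy as np

TOLERANCE_PCT = 2.0

def segment_mu(cumulative_fraction, total_mu):
    """Convert a cumulative dose-fraction STT column into MU per segment."""
    cum = np.asarray(cumulative_fraction, dtype=float)
    return np.diff(cum, prepend=0.0) * total_mu

def compare(planned_stt, delivered_stt, total_mu):
    planned = segment_mu(planned_stt, total_mu)
    delivered = segment_mu(delivered_stt, total_mu)
    pct_dev = 100.0 * (delivered - planned) / np.where(planned == 0, np.nan, planned)
    for i, dev in enumerate(pct_dev):
        flag = "FAIL" if abs(dev) > TOLERANCE_PCT else "ok"
        print(f"segment {i:2d}: planned {planned[i]:6.2f} MU, "
              f"delivered {delivered[i]:6.2f} MU, dev {dev:+5.2f}% {flag}")

if __name__ == "__main__":
    planned_stt = [0.10, 0.25, 0.45, 0.70, 1.00]    # cumulative fractions (example)
    delivered_stt = [0.10, 0.26, 0.45, 0.69, 1.00]  # recovered from a dynalog (example)
    compare(planned_stt, delivered_stt, total_mu=200.0)
```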

  4. Tsunami Size Distributions at Far-Field Locations from Aggregated Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2015-12-01

    The distribution of tsunami amplitudes at far-field tide gauge stations is explained by aggregating the probability of tsunamis derived from individual subduction zones and scaled by their seismic moment. The observed tsunami amplitude distributions of both continental (e.g., San Francisco) and island (e.g., Hilo) stations distant from subduction zones are examined. Although the observed probability distributions nominally follow a Pareto (power-law) distribution, there are significant deviations. Some stations exhibit varying degrees of tapering of the distribution at high amplitudes and, in the case of the Hilo station, there is a prominent break in slope on log-log probability plots. There are also differences in the slopes of the observed distributions among stations that can be significant. To explain these differences we first estimate seismic moment distributions of observed earthquakes for major subduction zones. Second, regression models are developed that relate the tsunami amplitude at a station to seismic moment at a subduction zone, correcting for epicentral distance. The seismic moment distribution is then transformed to a site-specific tsunami amplitude distribution using the regression model. Finally, a mixture distribution is developed, aggregating the transformed tsunami distributions from all relevant subduction zones. This mixture distribution is compared to the observed distribution to assess the performance of the method described above. This method allows us to estimate the largest tsunami that can be expected in a given time period at a station.
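
    The following sketch illustrates the aggregation step described above: per-zone seismic moments are drawn from placeholder tapered power-law samplers, transformed to station amplitudes through a placeholder log-linear regression, and pooled into a rate-weighted mixture to produce an exceedance curve. All rates, coefficients, and distribution parameters are assumptions for illustration, not the study's fitted values.

```python
# Rate-weighted mixture of per-source tsunami amplitude distributions.
# Every numeric parameter here is a made-up example value.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical source zones: annual event rate plus crude moment-sampler parameters.
ZONES = {
    "zone_A": {"rate": 0.5, "beta": 0.66, "m_min": 1e20, "m_corner": 1e22},
    "zone_B": {"rate": 0.2, "beta": 0.60, "m_min": 1e20, "m_corner": 5e22},
}

def sample_moments(beta, m_min, m_corner, n):
    """Pareto sample with a crude exponential taper near the corner moment."""
    u = rng.random(n)
    pareto = m_min * u ** (-1.0 / beta)
    taper = rng.exponential(m_corner, n)
    return np.minimum(pareto, m_min + taper)

def moment_to_amplitude(moment, a=-6.0, b=0.33, sigma=0.4):
    """Placeholder log-linear regression: log10(amp) = a + b*log10(M0) + noise."""
    return 10 ** (a + b * np.log10(moment) + rng.normal(0, sigma, moment.shape))

n_per_zone = 100_000
amps, weights = [], []
for zone in ZONES.values():
    m = sample_moments(zone["beta"], zone["m_min"], zone["m_corner"], n_per_zone)
    amps.append(moment_to_amplitude(m))
    weights.append(np.full(n_per_zone, zone["rate"] / n_per_zone))  # events/year per sample

amps, weights = np.concatenate(amps), np.concatenate(weights)
for level in (0.1, 0.5, 1.0, 2.0):                   # amplitude thresholds in metres
    annual_rate = weights[amps >= level].sum()
    print(f"amplitude >= {level:4.1f} m: {annual_rate:.3f} events/year")
```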

  5. 75 FR 27051 - Privacy Act of 1974: System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-13

    ... address and appears below: DOT/FMCSA 004 SYSTEM NAME: National Consumer Complaint Database (NCCDB.... A system, database, and procedures for filing and logging consumer complaints relating to household... are stored in an automated system operated and maintained at the Volpe National Transportation Systems...

  6. 20 CFR 655.201 - Temporary labor certification applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...

  7. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the cognitive operation of probing huge amounts of information in an attempt to uncover unseen patterns. Through big data analytics applications, both public- and private-sector organizations have made a strategic decision to turn big data into competitive advantage. Extracting value from big data requires a process for pulling information from multiple different sources, known as extract, transform, and load (ETL). The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and document summarization across several sources. The work helps to build a better understanding of basic Hadoop concepts and to improve the user experience for research. In this paper, we propose a Hadoop-based approach for analyzing log files to find concise information that is useful and time saving. The proposed approach is applied to different research papers in a specific domain to obtain summarized content for further improvement and new content creation.
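
    As a concrete example of the kind of Hadoop-based log analysis the record describes, below is a minimal Hadoop Streaming mapper/reducer in Python that counts HTTP status codes in Apache-style access logs. The log layout, file paths, and job invocation are assumptions for illustration, not the authors' implementation.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming sketch: the mapper emits (HTTP status, 1) pairs from
# Apache-style access-log lines and the reducer sums them per status code.
#
# Example invocation (all paths are placeholders):
#   hadoop jar hadoop-streaming.jar \
#     -input /logs/access/ -output /logs/status_counts \
#     -mapper "log_count.py map" -reducer "log_count.py reduce" \
#     -file log_count.py
import sys

def mapper():
    for line in sys.stdin:
        parts = line.split()
        if len(parts) > 8:                      # Apache combined log: status is field 9
            status = parts[8]
            if status.isdigit():
                print(f"{status}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:                      # streaming delivers keys in sorted order
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if len(sys.argv) > 1 and sys.argv[1] == "map" else reducer()
```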

  8. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine, M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13-C10P07R05-075, 2013. [21] C. Bonine, “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS

  9. Development of a Methodology for Customizing Insider Threat Auditing on a Linux Operating System

    DTIC Science & Technology

    2010-03-01

    information /etc/group, passwd, gshadow, shadow, /security/opasswd 16 User A attempts to access User B directory 17 User A attempts to access User B file w/o...configuration Handled by audit rules for root actions Audit user write attempts to system files -w /etc/group -p wxa -w /etc/passwd -p wxa -w /etc/gshadow -p...information (/etc/group, /etc/passwd, /etc/gshadow, /etc/shadow, /etc/sudoers, /etc/security/opasswd) Procedure: 1. User2 logs into the system

  10. 27 CFR 24.274 - Failure to timely pay tax or file a return.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... percent for each month or fraction thereof of the delinquency, not exceeding 25 percent in the aggregate, unless it is shown that the delinquency is due to reasonable cause and not to willful neglect. (Sec. 201...

  11. Marine Science/Business & Office. B7. CHOICE: Challenging Options in Career Education.

    ERIC Educational Resources Information Center

    Putnam and Northern Westchester Counties Board of Cooperative Educational Services, Yorktown Heights, NY.

    The documents aggregated here comprise the grade six unit of a career education curriculum designed for migrant students. Focusing on marine science, business, and office occupations, the combined teacher and student logs contain learning activities related to nine jobs: hydrographer, marine biologist, fish hatchery technician, boat builder,…

  12. 75 FR 69644 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ..., organization, phone, fax, mobile, pager, Defense Switched Network (DSN) phone, other fax, other mobile, other.../Transport Layer Security (SSL/ TLS) connections, access control lists, file system permissions, intrusion detection and prevention systems and log monitoring. Complete access to all records is restricted to and...

  13. The RIACS Intelligent Auditing and Categorizing System

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1988-01-01

    The organization of the RIACS auditing package is described, along with installation instructions and guidance on interpreting the output. Instructions for setting up both local and remote file system auditing are given. Logging is done on a time-driven basis, and auditing is performed in a passive mode.

  14. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  15. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  16. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  17. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  18. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  19. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    Natural fracture data from wells 33-7, 33A-7, 52A-7, 52B-7, and 83-11 at West Flank. Fracture orientations were determined from image logs of these wells (see accompanying submissions). Data files contain depth, apparent (in wellbore reference frame) and true (in geographic reference frame) azimuth and dip, respectively.

  1. Modulation of electrostatic interactions to reveal a reaction network unifying the aggregation behaviour of the Aβ42 peptide and its variants (Electronic supplementary information (ESI) available; see DOI: 10.1039/c7sc00215g)

    PubMed Central

    Meisl, Georg; Yang, Xiaoting

    2017-01-01

    The aggregation of the amyloid β peptide (Aβ42), which is linked to Alzheimer's disease, can be altered significantly by modulations of the peptide's intermolecular electrostatic interactions. Variations in sequence and solution conditions have been found to lead to highly variable aggregation behaviour. Here we modulate systematically the electrostatic interactions governing the aggregation kinetics by varying the ionic strength of the solution. We find that changes in the solution ionic strength induce a switch in the reaction pathway, altering the dominant mechanisms of aggregate multiplication. This strategy thereby allows us to continuously sample a large space of different reaction mechanisms and develop a minimal reaction network that unifies the experimental kinetics under a wide range of different conditions. More generally, this universal reaction network connects previously separate systems, such as charge mutants of the Aβ42 peptide, on a continuous mechanistic landscape, providing a unified picture of the aggregation mechanism of Aβ42. PMID:28979758

  2. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS) which we previously developed to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure, and the data in the log file is input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
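
    The exposure-grouping idea can be sketched as below: pulses from a procedure log are bucketed by rounded geometry and technique parameters so the Monte Carlo calculation runs once per group. The field names, rounding steps, and toy log entries are assumptions for illustration, not the DTS log format or the authors' MATLAB program.

```python
# Group x-ray pulses with similar technique/geometry parameters so that one
# Monte Carlo run per group (scaled by total mAs) suffices. Field names and
# rounding steps are illustrative assumptions.
from collections import defaultdict

def group_exposures(pulses, kvp_step=5, angle_step=5, sid_step=5):
    """pulses: iterable of dicts with kVp, primary/secondary angles, SID [cm], mAs."""
    groups = defaultdict(lambda: {"n_pulses": 0, "total_mas": 0.0})
    for p in pulses:
        key = (
            round(p["kvp"] / kvp_step) * kvp_step,
            round(p["primary_angle"] / angle_step) * angle_step,
            round(p["secondary_angle"] / angle_step) * angle_step,
            round(p["sid_cm"] / sid_step) * sid_step,
        )
        groups[key]["n_pulses"] += 1
        groups[key]["total_mas"] += p["mas"]
    return groups

if __name__ == "__main__":
    log = [
        {"kvp": 81, "primary_angle": 2.0, "secondary_angle": 0.5, "sid_cm": 100, "mas": 1.2},
        {"kvp": 80, "primary_angle": 1.0, "secondary_angle": 0.0, "sid_cm": 100, "mas": 1.1},
        {"kvp": 95, "primary_angle": 30.0, "secondary_angle": 0.0, "sid_cm": 105, "mas": 2.0},
    ]
    for key, agg in group_exposures(log).items():
        print(key, agg)   # one Monte Carlo run per key, weighted by total_mas
```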

  3. VizieR Online Data Catalog: New atmospheric parameters of MILES cool stars (Sharma+, 2016)

    NASA Astrophysics Data System (ADS)

    Sharma, K.; Prugniel, P.; Singh, H. P.

    2015-11-01

    MILES V2 spectral interpolator. The FITS file is an improved version of the MILES interpolator previously presented in PVK. It contains the coefficients of the interpolator, which allow one to compute an interpolated spectrum, given an effective temperature, log of surface gravity, and metallicity (Teff, logg, and [Fe/H]). The file consists of three extensions containing the three temperature regimes described in the paper: extension 0 (warm, Teff 4000-9000 K), extension 1 (hot, Teff > 7000 K), and extension 2 (cold, Teff < 4550 K). The three functions are linearly interpolated in the overlapping Teff regions. Each extension contains a 2D image-type array whose first axis is the wavelength, described by a WCS (air wavelength, starting at 3536 Å, step = 0.9 Å). This FITS file can be used by ULySS v1.3 or higher. (5 data files).
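
    A minimal sketch of reading this interpolator with astropy is shown below: it selects the extension for a Teff regime and rebuilds the wavelength axis from the linear WCS keywords (CRVAL1/CDELT1/CRPIX1, assumed here). The file name is a placeholder, and the sketch ignores the blending of extensions in the overlapping Teff ranges that ULySS performs.

```python
# Read one extension of the interpolator FITS file and reconstruct its
# wavelength axis from linear WCS keywords. File name and keyword names are
# assumptions; overlap blending between regimes is deliberately omitted.
import numpy as np
from astropy.io import fits

def load_interpolator_extension(path, teff):
    """Return (wavelength_angstrom, coefficient_array) for the Teff regime."""
    ext = 0 if 4550 <= teff <= 7000 else (1 if teff > 7000 else 2)   # warm / hot / cold
    with fits.open(path) as hdul:
        hdu = hdul[ext]
        data = np.asarray(hdu.data)
        hdr = hdu.header
        n = data.shape[-1]
        pix = np.arange(n)
        crpix = hdr.get("CRPIX1", 1)
        wave = hdr["CRVAL1"] + hdr["CDELT1"] * (pix + 1 - crpix)
    return wave, data

# Usage (file name is a placeholder for the catalogued FITS file):
# wave, coeffs = load_interpolator_extension("miles_v2_interpolator.fits", teff=5800)
```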

  4. Catalog of electronic data products

    NASA Astrophysics Data System (ADS)

    1990-07-01

    The catalog lists and describes the public-use data files produced by the National Center for Health Statistics (NCHS). More than 500 public-use data files, representing most of the NCHS data collection programs, are available for purchase and use. Public-use data files are prepared and disseminated to speed and enhance access to the full scope of data. NCHS data systems include a national vital registration program; household interview and health examination surveys; surveys of hospitals, nursing homes, physicians, and other health care providers; and other periodic or occasional data collection activities to produce a wide spectrum of health and health-related data. NCHS data users encompass all levels of government, the academic and research communities, and business. The majority of the data files released by NCHS contain microdata to allow researchers to aggregate findings in whatever format appropriate for their analyses.

  5. Collaborative Online Communities for Increased MILSATCOM Performance

    DTIC Science & Technology

    2009-09-01

    ... Folksonomy ... Aggregation in the NMT Data Collection System ... Folksonomies in the NMT Data Collection System ... Folksonomy: The challenge with any filing system, whether physical or electronic, is making a decision about where to place something. Once

  6. 77 FR 42654 - Trifloxystrobin; Pesticide Tolerance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This... filing. III. Aggregate Risk Assessment and Determination of Safety Section 408(b)(2)(A)(i) of FFDCA... dose at which adverse effects of concern are identified (the LOAEL). Uncertainty/safety factors are...

  7. 88. Photographic copy of historic photo, June 25, 1930 (original ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    88. Photographic copy of historic photo, June 25, 1930 (original print filed in Record Group 115, National Archives, Washington, D.C.). CEMENT AGGREGATE AND WATER CONTROLS AT OWYHEE DAM MIXING PLANT. - Owyhee Dam, Across Owyhee River, Nyssa, Malheur County, OR

  8. Mesoscale properties of clay aggregates from potential of mean force representation of interactions between nanoplatelets

    NASA Astrophysics Data System (ADS)

    Ebrahimi, Davoud; Whittle, Andrew J.; Pellenq, Roland J.-M.

    2014-04-01

    Face-to-face and edge-to-edge free energy interactions of Wyoming Na-montmorillonite platelets were studied by calculating potential of mean force along their center to center reaction coordinate using explicit solvent (i.e., water) molecular dynamics and free energy perturbation methods. Using a series of configurations, the Gay-Berne potential was parametrized and used to examine the meso-scale aggregation and properties of platelets that are initially random oriented under isothermal-isobaric conditions. Aggregates of clay were defined by geometrical analysis of face-to-face proximity of platelets with size distribution described by a log-normal function. The isotropy of the microstructure was assessed by computing a scalar order parameter. The number of platelets per aggregate and anisotropy of the microstructure both increases with platelet plan area. The system becomes more ordered and aggregate size increases with increasing pressure until maximum ordered state at confining pressure of 50 atm. Further increase of pressure slides platelets relative to each other leading to smaller aggregate size. The results show aggregate size of (3-8) platelets for sodium-smectite in agreement with experiments (3-10). The geometrical arrangement of aggregates affects mechanical properties of the system. The elastic properties of the meso-scale aggregate assembly are reported and compared with nanoindentation experiments. It is found that the elastic properties at this scale are close to the cubic systems. The elastic stiffness and anisotropy of the assembly increases with the size of the platelets and the level of external pressure.

  9. VizieR Online Data Catalog: Distances to RRab stars from WISE and Gaia (Sesar+, 2017)

    NASA Astrophysics Data System (ADS)

    Sesar, B.; Fouesneau, M.; Price-Whelan, A. M.; Bailer-Jones, C. A. L.; Gould, A.; Rix, H.-W.

    2017-10-01

    To constrain the period-luminosity-metallicity (PLZ) relations for RR Lyrae stars in WISE W1 and W2 bands, we use TGAS trigonometric parallaxes (ϖ), spectroscopic metallicities ([Fe/H]; Fernley+ 1998, J/A+A/330/515), log-periods (logP, base 10), and apparent magnitudes (m; Klein+ 2014, J/MNRAS/440/L96) for 102 RRab stars within ~2.5kpc from the Sun. The E(B-V) reddening at a star's position is obtained from the Schlegel+ (1998ApJ...500..525S) dust map. (1 data file).

  10. LACIE performance predictor final operational capability program description, volume 3

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The requirements and processing logic for the LACIE Error Model program (LEM) are described. This program is an integral part of the Large Area Crop Inventory Experiment (LACIE) system. LEM is that portion of the LPP (LACIE Performance Predictor) which simulates the sample segment classification, strata yield estimation, and production aggregation. LEM controls repetitive Monte Carlo trials based on input error distributions to obtain statistical estimates of the wheat area, yield, and production at different levels of aggregation. LEM interfaces with the rest of the LPP through a set of data files.

  11. Wave-Ice Interaction and the Marginal Ice Zone

    DTIC Science & Technology

    2013-09-30

    concept, using a high-quality attitude and heading reference system (AHRS) together with an accurate twin-antennae GPS compass. The instruments logged...the AHRS parameters at 50 Hz, together with GPS-derived fixes, heading (accurate to better than 1°) and velocities at 10 Hz. The 30 MB hourly files

  12. Log on to the Future: One School's Success Story.

    ERIC Educational Resources Information Center

    Hovenic, Ginger

    This paper describes Clear View Elementary School's (California) successful experience with integrating technology into the curriculum. Since its inception seven years ago, the school has acquired 250 computers, networked them all on two central file servers, and computerized the library and trained all staff members to be proficient facilitators…

  13. 40 CFR 146.14 - Information to be considered by the Director.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., logging procedures, deviation checks, and a drilling, testing, and coring program; and (16) A certificate... information listed below which are current and accurate in the file. For a newly drilled Class I well, the..., construction, date drilled, location, depth, record of plugging and/or completion, and any additional...

  14. All Aboard the Internet.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)

  15. A Query Analysis of Consumer Health Information Retrieval

    PubMed Central

    Hong, Yi; de la Cruz, Norberto; Barnas, Gary; Early, Eileen; Gillis, Rick

    2002-01-01

    The log files of the MCW HealthLink Web site were analyzed to study users' needs for consumer health information and to gain a better understanding of the health topics users search for, the paths they usually take to find consumer health information, and ways to improve search effectiveness.

  16. The Internet and Technical Services: A Point Break Approach.

    ERIC Educational Resources Information Center

    McCombs, Gillian M.

    1994-01-01

    Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…

  17. 77 FR 35956 - Appalachian Power Company; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ...) screened intake structures; (3) a concrete powerhouse containing three turbine-generator units with a total... structures; (3) a concrete powerhouse containing three turbine-generator units with a total installed... by a log boom; (2) screened intake structures; (3) a concrete powerhouse containing three turbine...

  18. Library Web Proxy Use Survey Results.

    ERIC Educational Resources Information Center

    Murray, Peter E.

    2001-01-01

    Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)

  19. Comparing image search behaviour in the ARRS GoldMiner search engine and a clinical PACS/RIS.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Do, Bao; Rubin, Daniel; Kahn, Charles E; Müller, Henning

    2015-08-01

    Information search has changed the way we manage knowledge and the ubiquity of information access has made search a frequent activity, whether via Internet search engines or increasingly via mobile devices. Medical information search is in this respect no different and much research has been devoted to analyzing the way in which physicians aim to access information. Medical image search is a much smaller domain but has gained much attention as it has different characteristics than search for text documents. While web search log files have been analysed many times to better understand user behaviour, the log files of hospital internal systems for search in a PACS/RIS (Picture Archival and Communication System, Radiology Information System) have rarely been analysed. Such a comparison between a hospital PACS/RIS search and a web system for searching images of the biomedical literature is the goal of this paper. Objectives are to identify similarities and differences in search behaviour of the two systems, which could then be used to optimize existing systems and build new search engines. Log files of the ARRS GoldMiner medical image search engine (freely accessible on the Internet) containing 222,005 queries, and log files of Stanford's internal PACS/RIS search called radTF containing 18,068 queries were analysed. Each query was preprocessed and all query terms were mapped to the RadLex (Radiology Lexicon) terminology, a comprehensive lexicon of radiology terms created and maintained by the Radiological Society of North America, so the semantic content in the queries and the links between terms could be analysed, and synonyms for the same concept could be detected. RadLex was mainly created for the use in radiology reports, to aid structured reporting and the preparation of educational material (Lanlotz, 2006) [1]. In standard medical vocabularies such as MeSH (Medical Subject Headings) and UMLS (Unified Medical Language System) specific terms of radiology are often underrepresented, therefore RadLex was considered to be the best option for this task. The results show a surprising similarity between the usage behaviour in the two systems, but several subtle differences can also be noted. The average number of terms per query is 2.21 for GoldMiner and 2.07 for radTF, the used axes of RadLex (anatomy, pathology, findings, …) have almost the same distribution with clinical findings being the most frequent and the anatomical entity the second; also, combinations of RadLex axes are extremely similar between the two systems. Differences include a longer length of the sessions in radTF than in GoldMiner (3.4 and 1.9 queries per session on average). Several frequent search terms overlap but some strong differences exist in the details. In radTF the term "normal" is frequent, whereas in GoldMiner it is not. This makes intuitive sense, as in the literature normal cases are rarely described whereas in clinical work the comparison with normal cases is often a first step. The general similarity in many points is likely due to the fact that users of the two systems are influenced by their daily behaviour in using standard web search engines and follow this behaviour in their professional search. This means that many results and insights gained from standard web search can likely be transferred to more specialized search systems. Still, specialized log files can be used to find out more on reformulations and detailed strategies of users to find the right content. Copyright © 2015 Elsevier Inc. All rights reserved.
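
    A minimal sketch of the query-log statistics compared in this study (terms per query, queries per session, frequent terms) appears below. The tab-separated log layout is an assumption for illustration; neither the GoldMiner nor the radTF log format is documented here, and no RadLex mapping is attempted.

```python
# Compute simple query-log statistics from a tab-separated log with one query
# per line: session_id <TAB> timestamp <TAB> query. The layout is assumed.
import csv
from collections import Counter, defaultdict

def analyse_query_log(path):
    sessions = defaultdict(list)
    term_counts = Counter()
    n_queries = 0
    total_terms = 0
    with open(path, newline="", encoding="utf-8") as fh:
        for session_id, _timestamp, query in csv.reader(fh, delimiter="\t"):
            terms = query.lower().split()
            sessions[session_id].append(query)
            term_counts.update(terms)
            n_queries += 1
            total_terms += len(terms)
    return {
        "queries": n_queries,
        "terms_per_query": total_terms / n_queries if n_queries else 0.0,
        "queries_per_session": n_queries / len(sessions) if sessions else 0.0,
        "top_terms": term_counts.most_common(20),
    }

# Usage (path is a placeholder):
# print(analyse_query_log("goldminer_queries.tsv"))
```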

  20. The Added Value of Log File Analyses of the Use of a Personal Health Record for Patients With Type 2 Diabetes Mellitus

    PubMed Central

    Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.

    2014-01-01

    The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided only little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (what functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), to increase the efficiency of the system and improve the long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users and the long-term adherence has the potential to increase. PMID:24876574
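
    The navigation-pattern extraction can be sketched as below: page-view events are grouped per user, the first session is delimited by an inactivity gap, and the ordered sequence of visited services is counted. The event format and the 30-minute session gap are assumptions for illustration, not the e-Vita log schema.

```python
# Extract and count first-session navigation patterns (ordered service
# sequences) from page-view events. Event format and session gap are assumed.
from collections import Counter, defaultdict
from datetime import datetime

def first_session_patterns(events, session_gap_minutes=30):
    """events: iterable of (user_id, iso_timestamp, service). Returns pattern counts."""
    by_user = defaultdict(list)
    for user, ts, service in events:
        by_user[user].append((datetime.fromisoformat(ts), service))

    patterns = Counter()
    for visits in by_user.values():
        visits.sort()
        first_session = [visits[0][1]]
        for (prev_t, _), (t, service) in zip(visits, visits[1:]):
            if (t - prev_t).total_seconds() > session_gap_minutes * 60:
                break                                  # first session ended
            first_session.append(service)
        patterns[" > ".join(first_session)] += 1
    return patterns

if __name__ == "__main__":
    toy_log = [
        ("u1", "2014-01-06T10:00:00", "education"),
        ("u1", "2014-01-06T10:02:00", "self-monitoring"),
        ("u2", "2014-01-07T09:00:00", "education"),
    ]
    for pattern, count in first_session_patterns(toy_log).most_common():
        print(f"{count:3d}  {pattern}")
```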

  1. PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.

    PubMed

    Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza

    2014-12-01

    The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions, such as nucleic acids, water, ions, and drug molecules, which can likewise be described in the PDB format and deposited in the PDB database. A PDB file is machine generated and not easily human readable, so computational tools are needed to interpret it. The objective of the present study is to develop free online software for the retrieval, visualization, and annotation reading of protein 3D structures available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., to convert the information in the PDB file into readable sentences. The tool displays all available information from a PDB file, including its 3D structure. Programming and scripting languages such as Perl, CSS, JavaScript, Ajax, and HTML were used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no log-in required.
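
    As an independent illustration of the fixed-column parsing step, the sketch below reads ATOM/HETATM records from a PDB file and reports residues per chain; column positions follow the published PDB format. The file name is a placeholder, and this is not the PDB Explorer code base.

```python
# Parse ATOM/HETATM records from a PDB file using the fixed column layout and
# summarise residues per chain. Illustrative only; not the PDB Explorer code.
from collections import defaultdict

def parse_atoms(path):
    atoms = []
    with open(path) as fh:
        for line in fh:
            if line.startswith(("ATOM  ", "HETATM")):
                atoms.append({
                    "name": line[12:16].strip(),        # atom name
                    "res_name": line[17:20].strip(),    # residue name
                    "chain": line[21],                  # chain identifier
                    "res_seq": int(line[22:26]),        # residue sequence number
                    "x": float(line[30:38]),
                    "y": float(line[38:46]),
                    "z": float(line[46:54]),
                })
    return atoms

def summarise(atoms):
    per_chain = defaultdict(set)
    for a in atoms:
        per_chain[a["chain"]].add(a["res_seq"])
    return {chain: len(residues) for chain, residues in per_chain.items()}

# Usage (file name is a placeholder):
# atoms = parse_atoms("1abc.pdb")
# print(len(atoms), "atoms; residues per chain:", summarise(atoms))
```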

  2. Fort Bliss Geothermal Area Data: Temperature profile, logs, schematic model and cross section

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This dataset contains a variety of data about the Fort Bliss geothermal area, part of the southern portion of the Tularosa Basin, New Mexico. The dataset contains schematic models for the McGregor Geothermal System and a shallow temperature survey of the Fort Bliss geothermal area. The dataset also contains Century OH logs, a full temperature profile, and complete logs from well RMI 56-5, including resistivity and porosity data; drill logs with drill rate, depth, lithology, mineralogy, fractures, temperature, pit total, gases, and descriptions, among other measurements; as well as CDL, CNL, DIL, GR Caliper, and Temperature files. A shallow (2 meter depth) temperature survey of the Fort Bliss geothermal area with 63 data points is also included. Two cross sections through the Fort Bliss area, also included, show well position and depth. The included surface map shows faults and the spatial distribution of wells, along with inferred and observed fault distributions from gravity surveys around the Fort Bliss geothermal area.

  3. A Multi-temporal Analysis of Logging Impacts on Tropical Forest Structure Using Airborne Lidar Data

    NASA Astrophysics Data System (ADS)

    Keller, M. M.; Pinagé, E. R.; Duffy, P.; Longo, M.; dos-Santos, M. N.; Leitold, V.; Morton, D. C.

    2017-12-01

    The long-term impacts of selective logging on carbon cycling and ecosystem function in tropical forests are still uncertain. Despite improvements in selective logging detection using satellite data, quantifying changes in forest structure from logging and recovery following logging is difficult using orbital data. We analyzed the dynamics of forest structure comparing logged and unlogged forests in the Eastern Brazilian Amazon (Paragominas Municipality, Pará State) using small footprint discrete return airborne lidar data acquired in 2012 and 2014. Logging operations were conducted at the 1200 ha study site from 2006 through 2013 using reduced impact logging techniques—management practices that minimize canopy and ground damage compared to more common conventional logging. Nevertheless, logging still reduced aboveground biomass by 10% to 20% in logged areas compared to intact forests. We aggregated lidar point-cloud data at spatial scales ranging from 50 m to 250 m and developed a binomial classification model based on the height distribution of lidar returns in 2012 and validated the model against the 2014 lidar acquisition. We accurately classified intact and logged forest classes compared with field data. Classification performance improved as spatial resolution increased (AUC = 0.974 at 250 m). We analyzed the differences in canopy gaps, understory damage (based on a relative density model), and biomass (estimated from total canopy height) of intact and logged classes. As expected, logging greatly increased both canopy gap formation and understory damage. However, while the area identified as canopy gap persisted for at least 8 years (from the oldest logging treatments in 2006 to the most recent lidar acquisition in 2014), the effects of ground damage were mostly erased by vigorous understory regrowth after about 5 years. The rate of new gap formation was 6 to 7 times greater in recently logged forests compared to undisturbed forests. New gaps opened at a rate 1.8 times greater than background even 8 years following logging, demonstrating the occurrence of delayed tree mortality. Our study showed that even low-intensity anthropogenic disturbances can cause persistent changes in tropical forest structure and dynamics.

  4. Identifying Measures of Student Behavior from Interaction with a Course Management System

    ERIC Educational Resources Information Center

    Nickles, George M., III

    2006-01-01

    The purpose of this work is to identify process measures of student interaction with a course management system (CMS). Logs maintained by Web servers capture aggregate user interactions with a Website. When combined with a login system and context from the course recorded in the CMS, more detailed measures of individual student interaction can be…

  5. The complex dynamics of products and its asymptotic properties

    PubMed Central

    Cristelli, Matthieu; Zaccaria, Andrea; Pietronero, Luciano

    2017-01-01

    We analyse global export data within the Economic Complexity framework. We couple the new economic dimension Complexity, which captures how sophisticated products are, with an index called logPRODY, a measure of the income of the respective exporters. Products’ aggregate motion is treated as a 2-dimensional dynamical system in the Complexity-logPRODY plane. We find that this motion can be explained by a quantitative model involving the competition on the markets, that can be mapped as a scalar field on the Complexity-logPRODY plane and acts in a way akin to a potential. This explains the movement of products towards areas of the plane in which the competition is higher. We analyse market composition in more detail, finding that for most products it tends, over time, to a characteristic configuration, which depends on the Complexity of the products. This market configuration, which we called asymptotic, is characterized by higher levels of competition. PMID:28520794

  6. Colloidal and antibacterial properties of novel triple-headed, double-tailed amphiphiles: exploring structure-activity relationships and synergistic mixtures.

    PubMed

    Marafino, John N; Gallagher, Tara M; Barragan, Jhosdyn; Volkers, Brandi L; LaDow, Jade E; Bonifer, Kyle; Fitzgerald, Gabriel; Floyd, Jason L; McKenna, Kristin; Minahan, Nicholas T; Walsh, Brenna; Seifert, Kyle; Caran, Kevin L

    2015-07-01

    Two novel series of tris-cationic, tripled-headed, double-tailed amphiphiles were synthesized and the effects of tail length and head group composition on the critical aggregation concentration (CAC), thermodynamic parameters, and minimum inhibitory concentration (MIC) against six bacterial strains were investigated. Synergistic antibacterial combinations of these amphiphiles were also identified. Amphiphiles in this study are composed of a benzene core with three benzylic ammonium bromide groups, two of which have alkyl chains, each 8-16 carbons in length. The third head group is a trimethylammonium or pyridinium. Log of critical aggregation concentration (log[CAC]) and heat of aggregation (ΔHagg) were both inversely proportional to the length of the linear hydrocarbon chains. Antibacterial activity increases with tail length until an optimal tail length of 12 carbons per chain, above which, activity decreased. The derivatives with two 12 carbon chains had the best antibacterial activity, killing all tested strains at concentrations of 1-2μM for Gram-positive and 4-16μM for Gram-negative bacteria. The identity of the third head group (trimethylammonium or pyridinium) had minimal effect on colloidal and antibacterial activity. The antibacterial activity of several binary combinations of amphiphiles from this study was higher than activity of individual amphiphiles, indicating that these combinations are synergistic. These amphiphiles show promise as novel antibacterial agents that could be used in a variety of applications. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Clinical bioequivalence of a dose of clopidogrel Leti Cravid tablets 75 mg versus clopidogrel Sanofi Plavix tablets 75 mg administered on a daily dose for 7 days on healthy volunteers: a clinical trial.

    PubMed

    Müller, Aixa; Octavio, José; González, María Y; Contreras, Jesús; Méndez, Gisela; Portillo, Milagros; Valero, Zuleima

    2010-01-01

    Patients undergoing percutaneous coronary intervention procedures, like patients with coronary disease, should receive indefinite treatment with acetylsalicylic acid and clopidogrel. New brands of clopidogrel have been developed at lower cost to help avoid premature suspension of antiplatelet therapy, such as Leti Laboratories' clopidogrel (Cravid). Its effectiveness and safety must be compared with the international standard, Plavix (Sanofi). A prospective, comparative, cross-over, randomized study was conducted in healthy volunteers. Each group received 1 tablet of clopidogrel Leti or clopidogrel Sanofi, 75 mg in a single daily dose for 7 days, followed by a 7-day washout period before administration of the second treatment. Platelet aggregation was measured at the start of each period and at 7 days of treatment by optical aggregometry, using a Chrono-Log 490-2D optical aggregometer with a self-calibration system, working with platelet-rich plasma and readings from 0% to 100% of light transmission. A decrease in platelet aggregation of more than 50% was observed in both groups at 7 days of treatment, independent of the adenosine diphosphate reagent (Helena or Chrono-Log) used for aggregation (P < 0.05). The ratios of the means and their 90% confidence intervals obtained with the 2 adenosine diphosphate brands were between 80% and 125%; therefore, the two brands can be considered bioequivalent and fully interchangeable.

  8. Multinetwork of international trade: A commodity-specific analysis

    NASA Astrophysics Data System (ADS)

    Barigozzi, Matteo; Fagiolo, Giorgio; Garlaschelli, Diego

    2010-04-01

    We study the topological properties of the multinetwork of commodity-specific trade relations among world countries over the 1992-2003 period, comparing them with those of the aggregate-trade network, known in the literature as the international-trade network (ITN). We show that link-weight distributions of commodity-specific networks are extremely heterogeneous and (quasi) log normality of aggregate link-weight distribution is generated as a sheer outcome of aggregation. Commodity-specific networks also display average connectivity, clustering, and centrality levels very different from their aggregate counterpart. We also find that ITN complete connectivity is mainly achieved through the presence of many weak links that keep commodity-specific networks together and that the correlation structure existing between topological statistics within each single network is fairly robust and mimics that of the aggregate network. Finally, we employ cross-commodity correlations between link weights to build hierarchies of commodities. Our results suggest that on the top of a relatively time-invariant “intrinsic” taxonomy (based on inherent between-commodity similarities), the roles played by different commodities in the ITN have become more and more dissimilar, possibly as the result of an increased trade specialization. Our approach is general and can be used to characterize any multinetwork emerging as a nontrivial aggregation of several interdependent layers.

  9. Stratigraphic framework of Cambrian and Ordovician rocks in the central Appalachian basin from Medina County, Ohio, through southwestern and south-central Pennsylvania to Hampshire County, West Virginia: Chapter E.2.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.

  10. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow from 1.0 to 7.0 and above 7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method is a significantly better fit to existing data than other methods.
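
    The decision logic described in this record can be sketched as below: classify a compound as ionic or nonionic, then estimate log BCF from log Kow with range-dependent equations or bins plus structural corrections. The numeric coefficients and bin values in the sketch are placeholders for illustration and are not the regression values derived in the paper.

```python
# Skeleton of the ionic/nonionic BCF-estimation logic. All coefficients, bin
# edges and correction terms are PLACEHOLDERS, not the paper's fitted values.
def estimate_log_bcf(log_kow, ionic=False, correction=0.0):
    if ionic:
        # Ionics (carboxylic/sulfonic acids and salts, quaternary N): binned log BCF
        # somewhere in the reported 0.5-1.75 range (placeholder bin values).
        if log_kow < 5.0:
            return 0.5
        elif log_kow < 7.0:
            return 1.0
        else:
            return 1.75
    # Non-ionics: piecewise linear in log Kow plus structural correction factors.
    if log_kow < 1.0:
        log_bcf = 0.50                                 # placeholder floor
    elif log_kow <= 7.0:
        log_bcf = 0.77 * log_kow - 0.70 + correction   # placeholder slope/intercept
    else:
        log_bcf = -1.37 * log_kow + 14.4 + correction  # placeholder decline above 7
    return log_bcf

# Example: a hypothetical non-ionic compound with log Kow = 4.2
print(f"estimated log BCF = {estimate_log_bcf(4.2):.2f}")
```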

  11. Production, prices, employment, and trade in Northwest forest industries, third quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  12. Production, prices, employment, and trade in Northwest forest industries, all quarters 2000.

    Treesearch

    Debra D. Warren

    2002-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  13. Production, prices, employment, and trade in Northwest forest industries, all quarters 2002.

    Treesearch

    Debra D. Warren

    2004-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  14. Production, prices, employment, and trade in Northwest forest industries, all quarters 2005.

    Treesearch

    Debra D. Warren

    2007-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  15. Production, prices, employment, and trade in Northwest forest industries, all quarters 2006.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  16. Production, prices, employment, and trade in Northwest forest industries, all quarters 2004.

    Treesearch

    Debra D. Warren

    2006-01-01

    Provides current information on lumber and plywood production and prices; employment in forest industries; international trade in logs, lumber, and plywood; volumes and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  17. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  18. Using Learning Styles and Viewing Styles in Streaming Video

    ERIC Educational Resources Information Center

    de Boer, Jelle; Kommers, Piet A. M.; de Brock, Bert

    2011-01-01

    Improving the effectiveness of learning when students observe video lectures becomes urgent with the rising advent of (web-based) video materials. Vital questions are how students differ in their learning preferences and what patterns in viewing video can be detected in log files. Our experiments inventory students' viewing patterns while watching…

  19. Recommendations for Benchmarking Web Site Usage among Academic Libraries.

    ERIC Educational Resources Information Center

    Hightower, Christy; Sih, Julie; Tilghman, Adam

    1998-01-01

    To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…

  20. Motivational Aspects of Learning Genetics with Interactive Multimedia

    ERIC Educational Resources Information Center

    Tsui, Chi-Yan; Treagust, David F.

    2004-01-01

    A BioLogica trial in six U.S. schools, using an interpretive approach, was conducted by the Concord Consortium to examine student motivation in learning genetics. Multiple data sources, such as online tests, computer data log files, and classroom observations, were used; results are reported in terms of interviewees' perceptions, class-wide online…

  1. 16. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-5/8 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, NORTHEAST CORNER, INTERPRETIVE LOG TO LEFT. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  2. 20 CFR 658.410 - Establishment of State agency JS complaint system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... system. At the local office level, the local office manager shall be responsible for the management of... related), the local office manager shall transmit a copy of that portion of the log containing the... established for the handling of complaints and files relating to the handling of complaints. The Manager or...

  3. 76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...

  4. Production, prices, employment, and trade in Northwest forest industries, all quarters 1998.

    Treesearch

    Debra D. Warren

    2000-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  5. Production, prices, employment, and trade in Northwest forest industries, fourth quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  6. Production, prices, employment, and trade in Northwest forest industries, all quarters of 2007.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  7. Production, prices, employment, and trade in Northwest forest industries, all quarters 2003.

    Treesearch

    Debra D. Warren

    2005-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  8. Production, prices, employment, and trade in Northwest forest industries, all quarters 2008

    Treesearch

    Debra Warren

    2009-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  9. Data Retention Policy | High-Performance Computing | NREL

    Science.gov Websites

    HPC Data Retention Policy. File storage areas on Peregrine and Gyrfalcon are either user-centric to reclaim storage. We can make special arrangements for permanent storage, if needed. User-Centric > is 3 months after the last project ends. During this retention period, the user may log in to

  10. Elementary School Students' Strategic Learning: Does Task-Type Matter?

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.

    2014-01-01

    This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it was investigated which and when learning patterns actually emerge with respect to students' task solutions. The present study uses computer log file traces to investigate how…

  11. Patterns in Elementary School Students' Strategic Actions in Varying Learning Situations

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvenoja, Hanna; Järvelä, Sanna

    2013-01-01

    This study uses log file traces to examine differences between high-and low-achieving students' strategic actions in varying learning situations. In addition, this study illustrates, in detail, what strategic and self-regulated learning constitutes in practice. The study investigates the learning patterns that emerge in learning situations…

  12. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  13. 78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...

  14. TraceContract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
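
    The record describes TraceContract only at a high level; since it is an internal Scala DSL, the sketch below is not its API but only illustrates the underlying idea of checking a log of events against a data-parameterized temporal property. The event fields ("kind", "resource") and the open/close property are hypothetical illustrations.

      # Minimal sketch, assuming a trace is a list of event dictionaries.
      # Property checked: every opened resource is eventually closed, and is
      # not opened twice (a simple data-parameterized temporal property).
      from typing import Dict, List

      Event = Dict[str, str]

      def check_open_close(trace: List[Event]) -> List[str]:
          open_resources = set()
          violations = []
          for i, event in enumerate(trace):
              kind, res = event.get("kind"), event.get("resource")
              if kind == "open":
                  if res in open_resources:
                      violations.append(f"event {i}: {res} opened twice")
                  open_resources.add(res)
              elif kind == "close":
                  if res not in open_resources:
                      violations.append(f"event {i}: {res} closed but never opened")
                  open_resources.discard(res)
          violations.extend(f"end of trace: {res} never closed" for res in sorted(open_resources))
          return violations

      if __name__ == "__main__":
          log = [{"kind": "open", "resource": "A"},
                 {"kind": "open", "resource": "B"},
                 {"kind": "close", "resource": "A"}]
          for v in check_open_close(log):
              print("violation:", v)

    In TraceContract itself such properties are written as Scala state machines or temporal-logic formulas over parameterized events; the sketch above only mirrors the reporting of violations.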

  15. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
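
    The record explains that each generated SCMF is logged into a database indexed by a hash that also appears in the command EVRs, so the original RML and binary SCMF can be retrieved later. A minimal stand-in for that lookup pattern is sketched below; the hash scheme, in-memory store, and file contents are assumptions, not the MSL sequencing server's implementation (which uses MySQL behind a Restful interface).

      # Sketch: archive a compiled sequence under a content hash and retrieve it
      # later using the hash carried by a command EVR. Illustrative only.
      import hashlib

      _store = {}  # hash -> (rml_text, scmf_bytes); stands in for the database

      def archive_sequence(rml_text: str, scmf_bytes: bytes) -> str:
          """Store a compiled sequence and return the hash an EVR would carry."""
          digest = hashlib.sha256(scmf_bytes).hexdigest()[:16]  # assumed hash scheme
          _store[digest] = (rml_text, scmf_bytes)
          return digest

      def lookup(digest: str):
          """Retrieve the original RML and SCMF for a hash seen in an EVR."""
          return _store.get(digest)

      if __name__ == "__main__":
          h = archive_sequence("<rml>drive 1.0 m</rml>", b"\x01\x02\x03")
          rml, scmf = lookup(h)
          print(h, rml, len(scmf))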

  16. VizieR Online Data Catalog: Bessel (1825) calculation for geodesic measurements (Karney+, 2010)

    NASA Astrophysics Data System (ADS)

    Karney, C. F. F.; Deakin, R. E.

    2010-06-01

    The solution of the geodesic problem for an oblate ellipsoid is developed in terms of series. Tables are provided to simplify the computation. Included here are the tables that accompanied Bessel's paper (with corrections). The tables were crafted by Bessel to minimize the labor of hand calculations. To this end, he adjusted the intervals in the tables, the number of terms included in the series, and the number of significant digits given so that the final results are accurate to about 8 places. For that reason, the most useful form of the tables is as the PDF file which provides the tables in a layout close to the original. Also provided is the LaTeX source file for the PDF file. Finally, the data has been put into a format so that it can be read easily by computer programs. All the logarithms are in base 10 (common logarithms). The characteristic and the mantissa should be read separately (indicated as x.c and x.m in the file description). Thus the first entry in the table, -4.4, should be parsed as "-4" (the characteristic) and ".4" (the mantissa); the anti-log for this entry is 10^(-4+0.4) = 2.5e-4. The "Delta" columns give the first difference of the preceding column, i.e., the difference of the preceding column in the next row and the preceding column in the current row. In the printed tables these are expressed as "units in the last place" and the differences are of the rounded representations in the preceding columns (to minimize interpolation errors). In table1.dat these are given scaled to match the format used for the preceding column, as indicated by the units given for these columns. The unit log(") (in the description within square brackets [arcsec]) means the logarithm of a quantity expressed in arcseconds. (3 data files).
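
    The characteristic/mantissa convention and the "Delta" (first difference) columns described above are simple to work with programmatically; a brief sketch follows. Apart from the documented first entry (-4, .4), the sample numbers are made up.

      # Recover the anti-log from a base-10 logarithm stored as a separate
      # characteristic (x.c) and mantissa (x.m), as described in the record.
      def antilog(characteristic: int, mantissa: float) -> float:
          return 10.0 ** (characteristic + mantissa)

      # First table entry: characteristic -4, mantissa .4 -> 10^(-4 + 0.4) ~ 2.5e-4
      print(antilog(-4, 0.4))

      # "Delta" columns hold first differences of the preceding column:
      values = [1.23, 1.31, 1.42, 1.56]      # made-up column values
      deltas = [b - a for a, b in zip(values, values[1:])]
      print(deltas)                          # approximately [0.08, 0.11, 0.14]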

  17. Viscosity and transient electric birefringence study of clay colloidal aggregation.

    PubMed

    Bakk, Audun; Fossum, Jon O; da Silva, Geraldo J; Adland, Hans M; Mikkelsen, Arne; Elgsaeter, Arnljot

    2002-02-01

    We study a synthetic clay suspension of laponite at different particle and NaCl concentrations by measuring stationary shear viscosity and transient electrically induced birefringence (TEB). On one hand, the viscosity data are consistent with the particles being spheres associated with a large amount of bound water. On the other hand, the viscosity data are also consistent with the particles being asymmetric, i.e., single laponite platelets associated with only a few monolayers of water. We analyze the TEB data by employing two different models of the aggregate size (effective hydrodynamic radius) distribution: (1) a bidisperse model and (2) a log-normal distribution model. Both models fit the experimental TEB data fairly well and indicate that the suspension consists of polydisperse particles. The models also appear to confirm that the aggregates increase in size with increasing ionic strength. The smallest particles at low salt concentrations seem to be monomers and oligomers.
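
    The two size-distribution models mentioned in the abstract can be written down compactly; the sketch below only illustrates their functional forms, and the parameter values are placeholders rather than the fitted values of the study.

      # (1) bidisperse model: two discrete hydrodynamic radii with weights;
      # (2) log-normal number distribution of hydrodynamic radius r.
      import math

      def lognormal_pdf(r: float, median: float, sigma_g: float) -> float:
          s = math.log(sigma_g)   # sigma_g is the geometric standard deviation
          return math.exp(-(math.log(r / median) ** 2) / (2 * s * s)) / (r * s * math.sqrt(2 * math.pi))

      def bidisperse(r_small: float, r_large: float, f_small: float) -> dict:
          return {r_small: f_small, r_large: 1.0 - f_small}

      # Placeholder parameters (nm), not the study's fitted values:
      print(lognormal_pdf(30.0, median=25.0, sigma_g=1.6))
      print(bidisperse(15.0, 80.0, f_small=0.7))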

  18. Aggregate R-R-V Analysis

    EPA Pesticide Factsheets

    The Excel file contains time series data of flow rates and concentrations of alachlor, atrazine, ammonia, total phosphorus, and total suspended solids observed in two watersheds in Indiana from 2002 to 2007. An aggregate time series representative of all these parameters was obtained using a specialized, data-driven technique. The aggregate data is hypothesized in the published paper to represent the overall health of both watersheds with respect to various potential water quality impairments. The time series data for each of the individual water quality parameters were used to compute the corresponding risk measures (reliability, resilience, and vulnerability: Rel, Res, and Vul) reported in Tables 4 and 5. The aggregation of the risk measures, computed from the aggregate time series and the water quality standards in Table 1, is also reported in Tables 4 and 5 of the published paper. Values under the column heading "uncertainty" report uncertainties associated with the reconstruction of missing records of the water quality parameters. Long-term records of the water quality parameters were reconstructed in order to estimate the R-R-V and corresponding aggregate risk measures. This dataset is associated with the following publication: Hoque, Y., S. Tripathi, M. Hantush, and R. Govindaraju. Aggregate Measures of Watershed Health from Reconstructed Water Quality Data with Uncertainty. Ed. Gregorich. JOURNAL OF ENVIRONMENTAL QUALITY. American Society of Agronomy, MADISON, WI,
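
    The risk measures named above (Rel, Res, Vul) are conventionally computed by comparing a time series against a water quality standard; a minimal sketch of the common Hashimoto-style definitions is shown below. The data, the threshold, and the assumption that an exceedance counts as a failure are illustrative and may differ from the published paper's exact formulation.

      # Reliability, resilience, and vulnerability of a water quality series,
      # where a value above `standard` is treated as a failure (assumed convention).
      def rrv(series, standard):
          fail = [x > standard for x in series]
          n = len(series)
          reliability = 1.0 - sum(fail) / n
          # Resilience: chance that a failure step is followed by a satisfactory step.
          recoveries = sum(1 for i in range(n - 1) if fail[i] and not fail[i + 1])
          resilience = recoveries / sum(fail) if any(fail) else 1.0
          # Vulnerability: mean exceedance magnitude over the failure steps.
          exceed = [x - standard for x, f in zip(series, fail) if f]
          vulnerability = sum(exceed) / len(exceed) if exceed else 0.0
          return reliability, resilience, vulnerability

      print(rrv([0.8, 1.2, 1.5, 0.9, 0.7, 1.1], standard=1.0))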

  19. Grid-wide neuroimaging data federation in the context of the NeuroLOG project

    PubMed Central

    Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem

    2010-01-01

    Grid technologies are appealing for dealing with the challenges raised by computational neurosciences and for supporting multi-centric brain studies. However, core grid middleware hardly copes with the complex neuroimaging data representations and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot simply be superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata, and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431

  20. Information Collection and Correlation Decision Support System for the Marine Corps TCO Environment.

    DTIC Science & Technology

    1980-08-15

    Reliability Index Computation 4-18; 4.2.4 Classification 4-20; 4.2.5 Sensor Selection 4-25; 4.2.6 Information Aggregation 4-30; 4.2.7 Track Record... -Σ_i p_i log p_i; Belis and Guiasu's entropy: -k Σ_i u_i p_i log p_i; Gini's diversity index: 1 - Σ_i p_i^2. In a Bayesian context with a 0-1 loss... source j will yield observation X_j = k. π_i(k) is obtained via Bayes' formula: π_i(k) = π_i P(X_j = k | C_i) / Σ_l π_l P(X_j = k | C_l), and P(X_j = k) via P(X_j = k) = Σ_i π_i P(X_j = k | C_i).

  1. Archive of Side Scan Sonar and Swath Bathymetry Data collected during USGS Cruise 10CCT02 Offshore of Petit Bois Island Including Petit Bois Pass, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.

    2011-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The hyperterminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the hyperterminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square (RMS) errors and cumulative MLC travel distance per motor. An in-house MATLAB code was used to analyze all dynalog files, record RMS errors and calculate the distance each MLC traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 included MLC leaf motor changes, 39 T-nut replacements, and 84 MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 recorded 73% of all interventions, while the same leaves in the B bank accounted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500 m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: An MLC dynalog file analysis software was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC travels, the RMS and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and, with proper and timely intervention, reduce the downtime of the linac during clinical hours.
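
    The two per-leaf quantities described above (RMS error and cumulative travel distance) reduce to simple arithmetic once planned and actual leaf positions are available; the sketch below uses a plain list of positions because the Varian dynalog record layout is not reproduced in the abstract.

      # Per-leaf RMS error and cumulative travel from planned/actual positions.
      # Input format is simplified; real dynalog parsing is not shown here.
      import math

      def leaf_rms_error(planned, actual):
          diffs = [p - a for p, a in zip(planned, actual)]
          return math.sqrt(sum(d * d for d in diffs) / len(diffs))

      def cumulative_travel(actual):
          return sum(abs(b - a) for a, b in zip(actual, actual[1:]))

      planned = [10.0, 12.0, 14.0, 14.0, 12.0]   # mm, hypothetical snapshots
      actual  = [10.1, 11.8, 14.2, 13.9, 12.1]
      print(round(leaf_rms_error(planned, actual), 3), "mm RMS")
      print(round(cumulative_travel(actual), 1), "mm traveled")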

  3. Comparison of fracture and deformation in the rotary endodontic instruments: Protaper versus K-3 system.

    PubMed

    Nagi, Sana Ehsen; Khan, Farhan Raza; Rahman, Munawar

    2016-03-01

    This experimental study was done on extracted human teeth to compare the fracture and deformation of two rotary endodontic file systems, namely K-3 and Protaper. It was conducted at the dental clinics of the Aga Khan University Hospital, Karachi. A log of file deformation or fracture during root canal preparation was kept. The location of each fracture was noted along with the identity of the canal in which the fracture took place. Fracture in the two rotary systems was compared. SPSS 20 was used for data analysis. Of the 172 (80.4%) teeth possessing more than 15 degrees of curvature, fracture occurred in 7 (4.1%) cases and deformation in 10 (5.8%). Of the 42 (19.6%) teeth possessing less than 15 degrees of curvature, fracture occurred in none, while deformation was seen in 1 (2.4%). There was no difference between K-3 and Protaper files with respect to file deformation and fracture. Most of the fractures occurred in mesiobuccal canals of maxillary molars, n=3 (21.4%). The likelihood of file fracture increased 5.65-fold when the same file was used more than 3 times. Irrespective of the rotary system, the apical third of the root canal space was the most common site of file fracture.

  4. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-04-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited-access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  5. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-01-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited-access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  6. 75 FR 44830 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-29

    ...-dealers and aggregators such as electronic communications networks. A member firm is able to select any... (`ATSs') and electronic communications networks (`ECNs') that the Commission has also nurtured and... the following methods: Electronic Comments Use the Commission's Internet comment form ( http://www.sec...

  7. 76 FR 55882 - Procurement List; Addition

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... Americans to work, earn income and be productive members of society. The AbilityOne Program, which the... Administration, Fort Worth, TX. Coverage: A-List for the Total Government Requirement as aggregated by the... Information Management). [FR Doc. 2011-23107 Filed 9-8-11; 8:45 am] BILLING CODE 6353-01-P ...

  8. 12 CFR 1750.4 - Minimum capital requirement computation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... amounts: (1) 2.50 percent times the aggregate on-balance sheet assets of the Enterprise; (2) 0.45 percent times the unpaid principal balance of mortgage-backed securities and substantially equivalent... last day of the quarter just ended (or the date for which the minimum capital report is filed, if...

  9. 26 CFR 1.6031(a)-1 - Return of partnership income.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... gross income (including gains) derived from sources within the United States (U.S.-source income... more of any item of partnership income, gain, loss, deduction, or credit is allocable in the aggregate... allocable items of partnership income, gain, loss, deduction, and credit. (3) Filing obligations for certain...

  10. 26 CFR 1.6031(a)-1 - Return of partnership income.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... gross income (including gains) derived from sources within the United States (U.S.-source income... more of any item of partnership income, gain, loss, deduction, or credit is allocable in the aggregate... allocable items of partnership income, gain, loss, deduction, and credit. (3) Filing obligations for certain...

  11. 78 FR 21487 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ... family of affiliated Members (``Affiliated Family'') that would generate the largest aggregate payment... attributable to the exposure presented by those unaffiliated Members and Affiliated Families that regularly... Families provide additional liquidity to NSCC. Under proposed Rule 4(A), this will take the form of...

  12. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy-specific and overall system parameters. A total of 36 system parameters were monitored, which include RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil, and positive 24V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is supported by Varian Medical Systems, Inc.
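
    As a point of reference for the I/MR charts described above, the conventional 3-sigma limits for an individuals/moving-range chart can be computed as sketched below, using the standard constants 2.66 and 3.267 that follow from d2 = 1.128 for a moving range of span 2. The hybrid specification-based limits used in the abstract are not reproduced, and the readings are hypothetical.

      # Conventional Individuals / Moving Range (I/MR) chart limits for a
      # parameter sampled once per delivery; constants per standard SPC practice.
      def imr_limits(samples):
          mr = [abs(b - a) for a, b in zip(samples, samples[1:])]
          x_bar = sum(samples) / len(samples)
          mr_bar = sum(mr) / len(mr)
          individuals = (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar)
          moving_range = (0.0, 3.267 * mr_bar)
          return individuals, moving_range

      readings = [24.01, 24.03, 23.98, 24.00, 24.05, 23.99]   # hypothetical values
      print(imr_limits(readings))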

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijesooriya, K; Seitter, K; Desai, V

    Purpose: To present our single-institution experience of catching errors with trajectory log file analysis. The reported causes of failures, probability of occurrence (O), severity of effects (S), and probability of the failures going undetected (D) could be added to guidelines for FMEA analysis. Methods: From March 2013 to March 2014, 19569 patient treatment fields/arcs were analyzed. This work includes checking all 131 treatment delivery parameters for all patients, all treatment sites and all treatment delivery fractions. TrueBeam trajectory log files for all treatment field types as well as all imaging types were accessed, read at 20 ms intervals, and every control point (a total of 37 million parameters) was compared to the physician-approved plan in the planning system. Results: Couch angle outlier occurrence: N = 327, range = −1.7 to 1.2 deg; gantry angle outlier occurrence: N = 59, range = 0.09 to 5.61 deg; collimator angle outlier occurrence: N = 13, range = −0.2 to 0.2 deg. VMAT cases have slightly larger variations in mechanical parameters. MLC: 3D single-control-point fields have a maximum deviation of 0.04 mm, 39 step-and-shoot IMRT cases have MLC deviations of −0.3 to 0.5 mm, and all 1286 VMAT cases have deviations of −0.9 to 0.7 mm. Two possible serious errors were found: 1) a 4 cm isocenter shift for the PA beam of an AP-PA pair, under-dosing a portion of the PTV by 25%; 2) delivery with MLC leaves abutted behind the jaws as opposed to the midline as planned, leading to an under-dosing of a small volume of the PTV by 25% in just the boost plan. Due to their error origin, neither of these errors could have been detected by pre-treatment verification. Conclusion: Performing trajectory log file analysis could catch typically undetected errors and avoid potentially adverse incidents.
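
    The comparison loop described above amounts to checking each delivered control-point value against the planned value and flagging deviations beyond a tolerance; a compact sketch follows. The parameter names and tolerances are hypothetical, and the binary TrueBeam trajectory log format is not parsed here.

      # Flag delivered-vs-planned deviations per control point.
      TOLERANCES = {"gantry_deg": 1.0, "collimator_deg": 1.0, "couch_deg": 1.0, "mlc_mm": 0.5}

      def find_outliers(planned, delivered):
          """planned/delivered: one dict of parameter values per control point."""
          outliers = []
          for i, (p, d) in enumerate(zip(planned, delivered)):
              for name, tol in TOLERANCES.items():
                  dev = d[name] - p[name]
                  if abs(dev) > tol:
                      outliers.append((i, name, round(dev, 3)))
          return outliers

      plan = [{"gantry_deg": 180.0, "collimator_deg": 30.0, "couch_deg": 0.0, "mlc_mm": 12.0}]
      log  = [{"gantry_deg": 181.4, "collimator_deg": 30.1, "couch_deg": 0.0, "mlc_mm": 12.2}]
      print(find_outliers(plan, log))   # [(0, 'gantry_deg', 1.4)]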

  14. Effects of traffic and ditch maintenance on forest road sediment production

    Treesearch

    Charles H. Luce; Thomas A. Black

    2001-01-01

    Observations of sediment yield from road segments in the Oregon Coast Range show that either heavy traffic during rainfall or blading the road ditch will increase erosion from forest roads. For the fine soils and high quality aggregate surfacing on the study plots, ditch blading increased sediment yield more than traffic equivalent to 12 log trucks per day. The...

  15. On the radiative properties of soot aggregates - Part 2: Effects of coating

    NASA Astrophysics Data System (ADS)

    Liu, Fengshan; Yon, Jérôme; Bescond, Alexandre

    2016-03-01

    The effects of weakly absorbing material (WAM) coating on soot have attracted considerable research attention in recent years due to the significant influence of such coating on soot radiative properties and the large differences predicted by different numerical models. Soot aggregates were first numerically generated using the diffusion limited cluster aggregation algorithm to produce fractal aggregates formed by log-normally distributed polydisperse spherical primary particles in point-touch. These aggregates were then processed by adding a certain amount of primary particle overlapping and necking to simulate the soot morphology observed from transmission electron microscopy images. After this process, a layer of WAM coating of different thicknesses was added to these more realistic soot aggregates. The radiative properties of these coated soot aggregates over the spectral range of 266-1064 nm were calculated by the discrete dipole approximation (DDA) using the spectrally dependent refractive index of soot for four aggregates containing Np = 1, 20, 51 and 96 primary particles. The considered coating thicknesses range from 0% (no coating) up to 100% coating in terms of the primary particle diameter. Coating enhances both the particle absorption and scattering cross sections, with much stronger enhancement of the scattering one, as well as the asymmetry factor and the single scattering albedo. The absorption enhancement is stronger in the UV than in the visible and the near infrared. The simple corrections to the Rayleigh-Debye-Gans fractal aggregate theory for uncoated soot aggregates are found not to work for coated soot aggregates. The core-shell model significantly overestimates the absorption enhancement by coating in the visible and the near infrared compared to the DDA results for the coated soot particle. Treating an externally coated soot aggregate as an aggregate formed by individually coated primary particles significantly underestimates the absorption enhancement by coating in the visible and the near infrared.

  16. The medium is NOT the message or Indefinitely long-term file storage at Leeds University

    NASA Technical Reports Server (NTRS)

    Holdsworth, David

    1996-01-01

    Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality and load balance.
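
    The placement idea in this record is to keep intermediate workflow data in a key-value store rather than on the shared file system; the toy stand-in below only illustrates that interface. Hercules itself is a distributed service, and Swift's placement options are not modeled here.

      # Toy in-process stand-in for caching intermediate workflow files in a
      # key-value store instead of the shared file system (illustrative only).
      class IntermediateStore:
          def __init__(self):
              self._kv = {}

          def put(self, path: str, data: bytes) -> None:
              self._kv[path] = data        # would be a network PUT against Hercules

          def get(self, path: str) -> bytes:
              return self._kv[path]        # would be a network GET against Hercules

      store = IntermediateStore()
      store.put("workflow/stage1/part-0000", b"intermediate bytes")
      print(store.get("workflow/stage1/part-0000"))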

  18. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    The Product Archive System (PAS), a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of updating methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, in order to give the system several desirable characteristics such as ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (Network Communicator module, File Collector module, File Copy module, Task Collector module, Metadata Extractor module, Product Data Archive module, Metadata Catalogue Import module) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  19. Borehole petrophysical chemostratigraphy of Pennsylvanian black shales in the Kansas subsurface

    USGS Publications Warehouse

    Doveton, J.H.; Merriam, D.F.

    2004-01-01

    Pennsylvanian black shales in Kansas have been studied on outcrop for decades as the core unit of the classic Midcontinent cyclothem. These shales appear to be highstand condensed sections in the sequence stratigraphic paradigm. Nuclear log suites provide several petrophysical measurements of rock chemistry that are a useful data source for chemostratigraphic studies of Pennsylvanian black shales in the subsurface. Spectral gamma-ray logs partition natural radioactivity between contributions by U, Th, and K sources. Elevated U contents in black shales can be related to reducing depositional environments, whereas the K and Th contents are indicators of clay-mineral abundance and composition. The photoelectric factor log measurement is a direct function of aggregate atomic number and so is affected by clay-mineral volume, clay-mineral iron content, and other black shale compositional elements. Neutron porosity curves are primarily a response to hydrogen content. Although good quality logs are available for many black shales, borehole washout features invalidate readings from the nuclear contact devices, whereas black shales thinner than tool resolution will be averaged with adjacent beds. Statistical analysis of nuclear log data between black shales in successive cyclothems allows systematic patterns of their chemical and petrophysical properties to be discriminated in both space and time. ?? 2004 Elsevier B.V. All rights reserved.

  20. Intra-familial aggregation and heritability of aortic versus brachial pulse pressure after imputing pretreatment values in a community of African ancestry.

    PubMed

    Redelinghuys, Michelle; Norton, Gavin R; Maseko, Muzi J; Majane, Olebogeng H I; Woodiwiss, Angela J

    2012-06-01

    To compare the intra-familial aggregation and heritability of central (aortic) (PPc) versus peripheral (brachial) (PPp) pulse pressure after imputing pretreatment blood pressures (BPs) in treated participants in a community of black African ancestry. Central PPc [generalized transfer function (GTF) and radial P2-derived] was determined with applanation tonometry at the radial artery (SphygmoCor software) in 946 participants from 258 families with 23 families including three generations from an urban developing community of black Africans. In the 24.1% of participants receiving antihypertensive treatment, pretreatment brachial BP was imputed from published overall averaged effects of therapy grouped by class and dose, specific for groups of black African descent. From these data PPc was estimated from proportionate differences in central aortic and brachial PP. Heritability estimates were determined from SAGE software. Echocardiography was evaluated in 507 participants in order to determine stroke volume. With adjustments for confounders, parent-child (P < 0.05) and sibling-sibling (P < 0.0005) correlations were noted for log PPc, whilst for log PPp only sibling-sibling correlations were noted. No mother-father correlations were noted for either PPc or PPp. Independent of confounders the heritability for log GTF-derived (h = 0.33 ± 0.07, P < 0.0001) and P2-derived (h = 0.30 ± 0.07, P < 0.0001) PPc was greater than the heritability for log PPp (h = 0.11 ± 0.06, P < 0.05) (P < 0.05 for comparison of heritability estimates). After imputing pretreatment BP values, central aortic PP is significantly more inherited than brachial PP. These data suggest that in groups of African descent the genetic determinants of PP may be underestimated when employing brachial rather than central aortic PP measurements.

  1. Analysis of Student Activity in Web-Supported Courses as a Tool for Predicting Dropout

    ERIC Educational Resources Information Center

    Cohen, Anat

    2017-01-01

    Persistence in learning processes is perceived as a central value; therefore, dropouts from studies are a prime concern for educators. This study focuses on the quantitative analysis of data accumulated on 362 students in three academic course website log files in the disciplines of mathematics and statistics, in order to examine whether student…

  2. Self-Regulation during E-Learning: Using Behavioural Evidence from Navigation Log Files

    ERIC Educational Resources Information Center

    Jeske, D.; Backhaus, J.; Stamov Roßnagel, C.

    2014-01-01

    The current paper examined the relationship between perceived characteristics of the learning environment in an e-module in relation to test performance among a group of e-learners. Using structural equation modelling, the relationship between these variables is further explored in terms of the proposed double mediation as outlined by Ning and…

  3. Microanalytic Case studies of Individual Participation Patterns in an Asynchronous Online Discussion in an Undergraduate Blended Course

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Perera, Nishan; Hsiao, Ying-Ting; Speer, Jennifer; Marbouti, Farshid

    2012-01-01

    This study presents three case studies of students' participation patterns in an online discussion to address the gap in our current understanding of how "individuals" experience asynchronous learning environments. Cases were constructed via microanalysis of log-file data, post contents, and the evolving discussion structure. The first student was…

  4. Query Classification and Study of University Students' Search Trends

    ERIC Educational Resources Information Center

    Maabreh, Majdi A.; Al-Kabi, Mohammed N.; Alsmadi, Izzat M.

    2012-01-01

    Purpose: This study is an attempt to develop an automatic identification method for Arabic web queries and divide them into several query types using data mining. In addition, it seeks to evaluate the impact of the academic environment on using the internet. Design/methodology/approach: The web log files were collected from one of the higher…

  5. VizieR Online Data Catalog: GAMA. Stellar mass budget (Moffett+, 2016)

    NASA Astrophysics Data System (ADS)

    Moffett, A. J.; Lange, R.; Driver, S. P.; Robotham, A. S. G.; Kelvin, L. S.; Alpaslan, M.; Andrews, S. K.; Bland-Hawthorn, J.; Brough, S.; Cluver, M. E.; Colless, M.; Davies, L. J. M.; Holwerda, B. W.; Hopkins, A. M.; Kafle, P. R.; Liske, J.; Meyer, M.

    2018-04-01

    Using the recently expanded Galaxy and Mass Assembly (GAMA) survey phase II visual morphology sample and the large-scale bulge and disc decomposition analysis of Lange et al. (2016MNRAS.462.1470L), we derive new stellar mass function fits to galaxy spheroid and disc populations down to log(M*/M☉) = 8. (1 data file).

  6. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  7. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  8. 76 FR 54835 - Child Labor Regulations, Orders and Statements of Interpretation; Child Labor Violations-Civil...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ....m. in your local time zone, or log onto the Wage and Hour Division's Web site for a nationwide... INFORMATION: I. Electronic Access and Filing Comments Public Participation: This notice of proposed rulemaking is available through the Federal Register and the http://www.regulations.gov Web site. You may also...

  9. Capabilities Report 2012, West Desert Test Center

    DTIC Science & Technology

    2012-03-12

    FT-IR Spectrometer... electronic system files, paper logs, production batch records, QA/QC data, and PCR data generated during a test. Data analysts also track and QC raw data... Advantage +SL bench-top freeze dryers achieve shelf temperatures as low as -57°C and condenser temperatures to -67°C. The bulk milling facility produces

  10. 15. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-1/2 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, LOOKING SOUTHWEST, SHOWING INTERPRETIVE LOG AND PROTECTION ASSISTANT'S HOUSE IN BACKGROUND. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  11. Negotiating the Context of Online In-Service Training: "Expert" and "Non-Expert" Footings

    ERIC Educational Resources Information Center

    Nilsen, Mona

    2010-01-01

    This paper focuses on how people working in the Swedish food production industry engage in in-service training by means of computer-mediated communication. The empirical material consists of archived chat log files from a course concerning quality assurance and food safety hazards control in the preparation and handling of foodstuff. Drawing on…

  12. Learner Characteristics Predict Performance and Confidence in E-Learning: An Analysis of User Behavior and Self-Evaluation

    ERIC Educational Resources Information Center

    Jeske, Debora; Roßnagell, Christian Stamov; Backhaus, Joy

    2014-01-01

    We examined the role of learner characteristics as predictors of four aspects of e-learning performance, including knowledge test performance, learning confidence, learning efficiency, and navigational effectiveness. We used both self reports and log file records to compute the relevant statistics. Regression analyses showed that both need for…

  13. Digging Deeper into Learners' Experiences in MOOCs: Participation in Social Networks outside of MOOCs, Notetaking and Contexts Surrounding Content Consumption

    ERIC Educational Resources Information Center

    Veletsianos, George; Collier, Amy; Schneider, Emily

    2015-01-01

    Researchers describe with increasing confidence "what" they observe participants doing in massive open online courses (MOOCs). However, our understanding of learner activities in open courses is limited by researchers' extensive dependence on log file analyses and clickstream data to make inferences about learner behaviors. Further, the…

  14. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  15. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  16. Making Sense of Students' Actions in an Open-Ended Virtual Laboratory Environment

    ERIC Educational Resources Information Center

    Gal, Ya'akov; Uzan, Oriel; Belford, Robert; Karabinos, Michael; Yaron, David

    2015-01-01

    A process for analyzing log files collected from open-ended learning environments is developed and tested on a virtual lab problem involving reaction stoichiometry. The process utilizes a set of visualization tools that, by grouping student actions in a hierarchical manner, helps experts make sense of the linear list of student actions recorded in…

  17. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis using both predisposing and triggering factors, combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from the DEM, the effective precipitations, runoff, lithology and land use. All these parameters can be supplied by the client from other WFS services or by uploading and processing the data on the server. The user can select the option of creating the first and second derivatives from the DEM automatically on the server or of uploading the data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using the NetCDF file format. The model is run in a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the records of the probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server with the log file.
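
    The monthly update-and-log cycle described above can be illustrated with a single-factor Bayesian update; the prior, likelihood values, and CSV log layout below are placeholders, not the published multi-factor Bayesian temporal network.

      # Monthly Bayesian update of a landslide probability with per-step logging.
      import csv

      def bayes_update(prior, likelihood_given_slide, likelihood_given_stable):
          evidence = likelihood_given_slide * prior + likelihood_given_stable * (1.0 - prior)
          return likelihood_given_slide * prior / evidence

      prior = 0.05
      with open("landslide_log.csv", "w", newline="") as f:
          writer = csv.writer(f)
          writer.writerow(["month", "prior", "posterior"])
          for month, (l_slide, l_stable) in enumerate([(0.7, 0.3), (0.4, 0.6), (0.8, 0.2)], start=1):
              posterior = bayes_update(prior, l_slide, l_stable)
              writer.writerow([month, round(prior, 4), round(posterior, 4)])
              prior = posterior            # posterior becomes next month's prior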

  18. The Spider Center Wide File System; From Concept to Reality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shipman, Galen M; Dillow, David A; Oral, H Sarp

    2009-01-01

    The Leadership Computing Facility (LCF) at Oak Ridge National Laboratory (ORNL) has a diverse portfolio of computational resources ranging from a petascale XT4/XT5 simulation system (Jaguar) to numerous other systems supporting development, visualization, and data analytics. In order to support the vastly different I/O needs of these systems, Spider, a Lustre-based center-wide file system, was designed and deployed to provide over 240 GB/s of aggregate throughput with over 10 petabytes of formatted capacity. A multi-stage InfiniBand network, dubbed the Scalable I/O Network (SION), with over 889 GB/s of bisectional bandwidth was deployed as part of Spider to provide connectivity to our simulation, development, visualization, and other platforms. To our knowledge, at the time of writing, Spider is the largest and fastest POSIX-compliant parallel file system in production. This paper details the overall architecture of the Spider system, the challenges in deploying and initially testing a file system of this scale, and novel solutions to these challenges, which offer key insights into future file system design.

  19. Application of a Snow Growth Model to Radar Remote Sensing

    NASA Astrophysics Data System (ADS)

    Erfani, E.; Mitchell, D. L.

    2014-12-01

    Microphysical growth processes of diffusion, aggregation and riming are incorporated analytically in a steady-state snow growth model (SGM) to solve the zeroth- and second-moment conservation equations with respect to mass. The SGM is initiated by radar reflectivity (Zw), supersaturation, temperature, and a vertical profile of the liquid water content (LWC), and it uses a gamma size distribution (SD) to predict the vertical evolution of size spectra. Aggregation seems to play an important role in the evolution of snowfall rates, and the snowfall rates produced by aggregation, diffusion and riming are considerably greater than those produced by diffusion and riming alone, demonstrating the strong interaction between aggregation and riming. The impact of ice particle shape on particle growth rates and fall speeds is represented in the SGM in terms of ice particle mass-dimension (m-D) power laws (m = αD^β). These growth rates are qualitatively consistent with empirical growth rates, with slower (faster) growth rates predicted for higher (lower) β values. In most models, β is treated as constant for a given ice particle habit, but it is well known that β is larger for the smaller crystals. Our recent work quantitatively calculates β and α for cirrus clouds as a function of D, where the m-D expression is a second-order polynomial in log-log space. By adapting this method to the SGM, the ice particle growth rates and fall speeds are predicted more accurately. Moreover, the size spectra predicted by the SGM are in good agreement with those from aircraft measurements during Lagrangian spiral descents through frontal clouds, indicating the successful modeling of microphysical processes. Since the lowest Zw over complex topography is often significantly above cloud base, the precipitation is often underestimated by radar quantitative precipitation estimates (QPE). Our SGM is capable of being initialized with Zw at the lowest reliable radar echo and consequently improves QPE at ground level.
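
    The mass-dimension relation m = αD^β with a size-dependent exponent can be evaluated directly when log10(m) is written as a second-order polynomial in log10(D), as the abstract describes; the coefficients below are placeholders, not the values from the cited work.

      # m-D relation with log10(m) = a0 + a1*log10(D) + a2*log10(D)**2, so that
      # the local exponent beta varies with size (larger for smaller D when a2 < 0).
      import math

      a0, a1, a2 = -2.1, 2.0, -0.05          # placeholder coefficients

      def mass(D_cm: float) -> float:
          x = math.log10(D_cm)
          return 10.0 ** (a0 + a1 * x + a2 * x * x)

      def local_beta(D_cm: float) -> float:
          """beta = d log10(m) / d log10(D) at the given size."""
          return a1 + 2.0 * a2 * math.log10(D_cm)

      for D in (0.01, 0.1, 1.0):             # maximum dimension in cm
          print(D, mass(D), round(local_beta(D), 2))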

  20. 75 FR 22874 - Claymore Exchange-Traded Fund Trust 3, et al.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ... SECURITIES AND EXCHANGE COMMISSION [Investment Company Act Release No. 29256; File No. 812-13534... the Investment Company Act of 1940 (``Act'') for an exemption from sections 2(a)(32), 5(a)(1), 22(d... management investment companies to issue shares (``Shares'') redeemable in large aggregations only...

  1. 12 CFR 621.7 - Rule of aggregation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... constitutes an independent credit risk and such determination is adequately documented in the loan file. (1) A loan shall be considered an independent credit risk if a substantial portion of the loan is guaranteed... credit risks if and so long as: (i) The primary sources of repayment are independent for each loan; (ii...

  2. 18 CFR 46.6 - Contents of the written statement and procedures for filing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... entity described in § 46.5(c), which produces or supplies electrical equipment for use of such public utility, such person shall provide the following information: (1) The aggregate amount of revenues received by such entity from producing or supplying electrical equipment to such public utility in the...

  3. 78 FR 3931 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... thresholds through a single MPID to avoid providing excessive encouragement to members to aggregate the... a particular venue to be excessive, or rebate opportunities available at other venues to be more... standards applicable to exchanges. These competitive forces help to ensure that NASDAQ's fees are reasonable...

  4. 78 FR 22580 - Millington Securities, Inc. and Millington Exchange Traded MAVINS Fund, LLC; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ... security for inclusion in the Fund's portfolio to have aggregate investment characteristics, fundamental... SECURITIES AND EXCHANGE COMMISSION [Investment Company Act Release No. 30459; File No. 812-13887... an order under section 6(c) of the Investment Company Act of 1940 (the ``Act'') for an exemption from...

  5. 76 FR 78610 - Notice of Intent To Reduce the Frequency of Rice and Potato Stocks Surveys and All Associated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-19

    ... the resulting publications. Abstract: The primary functions of the National Agricultural Statistics..., filing of petitions and applications and agency statements of organization and functions are examples..., 7 U.S.C. 2276, which requires USDA to afford strict confidentiality to non-aggregated data provided...

  6. 78 FR 48757 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ... market, with aggregate size (``BBO data,'' sometimes referred to as ``top-of-book data''). Data with...''). This new COB Feed will provide data regarding the Exchange's Complex Order Book (``COB'') and related... to prevent fraudulent and manipulative acts and practices, to promote just and equitable principles...

  7. Sexual Abuse of Vulnerable Young and Old Men

    ERIC Educational Resources Information Center

    Roberto, Karen A.; Teaster, Pamela B.; Nikzad, Katherina A.

    2007-01-01

    During a 4-year period, aggregated data from Adult Protective Services case files in Virginia revealed 17 cases of sexually abused young, middle-age, and old men. The most common types of sexual abuse across age groups involved instances of sexualized kissing and fondling and unwelcome sexual interest in the individual men's bodies. The majority…

  8. 78 FR 25496 - Self-Regulatory Organizations; National Securities Clearing Corporation; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... obligations to its Members in the event of the default of the Member or family of affiliated Members (``Affiliated Family'') that would generate the largest aggregate payment obligation to NSCC in stressed... Members and Affiliated Families that regularly incur the largest gross settlement debits over a settlement...

  9. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS-specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short-term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, perform manual operations, and display events in real time. The PPS-specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. For each observatory data file transmitted, MPISS generates and sends to the PPS a data notice containing the data start and stop times along with a checksum for the file. MPISS retrieves the PPS-generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS-specific protocols in parallel. The advantage of this capability is that it supports users who require the PPS protocol as well as those who do not require it. The system is highly configurable to accommodate the needs of future users.
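    As a rough illustration of the notice/receipt handshake described above, a data notice for one observatory file might be assembled as in the following Python sketch. The field names, layout, and checksum algorithm (MD5) are assumptions made for illustration; the actual GPM/PPS notice format is defined by the mission interface documents.

        import hashlib
        from pathlib import Path

        def build_data_notice(data_file: str, start_time: str, stop_time: str) -> str:
            """Assemble a plain-text data notice for one observatory data file.

            Field names and the MD5 checksum are illustrative assumptions, not
            the actual PPS message format.
            """
            digest = hashlib.md5(Path(data_file).read_bytes()).hexdigest()
            return "\n".join([
                f"FILE={Path(data_file).name}",
                f"DATA_START={start_time}",
                f"DATA_STOP={stop_time}",
                f"CHECKSUM={digest}",
            ]) + "\n"

        # Demo: create a small placeholder file, then build and print its notice.
        Path("obs_demo.dat").write_bytes(b"example payload")
        print(build_data_notice("obs_demo.dat", "2013-001T00:00:00", "2013-001T01:39:00"))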

  10. A methodological framework to assess the carbon balance of tropical managed forests.

    PubMed

    Piponiot, Camille; Cabon, Antoine; Descroix, Laurent; Dourdain, Aurélie; Mazzei, Lucas; Ouliac, Benjamin; Rutishauser, Ervan; Sist, Plinio; Hérault, Bruno

    2016-12-01

    Managed forests are a major component of tropical landscapes. Production forests as designated by national forest services cover up to 400 million ha, i.e. half of the forested area in the humid tropics. Forest management thus plays a major role in the global carbon budget, but a unified method for estimating carbon fluxes from tropical managed forests is lacking. In this study we propose a new time- and spatially-explicit methodology to estimate the above-ground carbon budget of selective logging at regional scale. The yearly balance of a logging unit, i.e. the elementary management unit of a forest estate, is modelled by aggregating three sub-models encompassing (i) emissions from extracted wood, (ii) emissions from logging damage and deforested areas and (iii) carbon storage from post-logging recovery. Models are parametrised and uncertainties are propagated through an MCMC algorithm. As a case study, we used 38 years of National Forest Inventories in French Guiana, northeastern Amazonia, to estimate the above-ground carbon balance (i.e. the net carbon exchange with the atmosphere) of selectively logged forests. Over this period, the net carbon balance of selective logging in the French Guianan Permanent Forest Estate is estimated to lie between 0.12 and 1.33 Tg C, with a median value of 0.64 Tg C. Uncertainties in the model could be reduced by improving the accuracy of both the logging damage and large woody necromass decay submodels. We propose an innovative carbon accounting framework relying upon basic logging statistics. This flexible tool allows the carbon budget of tropical managed forests to be estimated in a wide range of tropical regions.
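    In equation form, the yearly above-ground balance of a logging unit described above can be summarised as follows; the symbols and sign convention are ours, chosen only to make the structure of the three sub-models explicit:

        \[ \Delta C_{\text{unit}}(t) = E_{\text{extracted}}(t) + E_{\text{damage}}(t) - S_{\text{recovery}}(t) , \]

    where the two E terms are the emissions from extracted wood and from logging damage and deforested areas, S is the carbon stored through post-logging recovery, and the regional balance is obtained by summing over logging units and years.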

  11. Cyber indicators of compromise: a domain ontology for security information and event management

    DTIC Science & Technology

    2017-03-01

    COMPROMISE: A DOMAIN ONTOLOGY FOR SECURITY INFORMATION AND EVENT MANAGEMENT by Marsha D. Rowell March 2017 Thesis Co-Advisors: J. D...to automate this work is Security Information and Event Management (SIEM). In short, SIEM technology works by aggregating log information, and then...Distribution is unlimited. CYBER INDICATORS OF COMPROMISE: A DOMAIN ONTOLOGY FOR SECURITY INFORMATION AND EVENT MANAGEMENT Marsha D. Rowell

  12. High Performance Data Transfer for Distributed Data Intensive Sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Chin; Cottrell, R 'Les' A.; Hanushevsky, Andrew B.

    We report on the development of ZX software providing high performance data transfer and encryption. The design scales in: computation power, network interfaces, and IOPS while carefully balancing the available resources. Two U.S. patent-pending algorithms help tackle data sets containing lots of small files and very large files, and provide insensitivity to network latency. It has a cluster-oriented architecture, using peer-to-peer technologies to ease deployment, operation, usage, and resource discovery. Its unique optimizations enable effective use of flash memory. Using a pair of existing data transfer nodes at SLAC and NERSC, we compared its performance to that of bbcp and GridFTP and determined that they were comparable. With a proof of concept created using two four-node clusters with multiple distributed multi-core CPUs, network interfaces and flash memory, we achieved 155Gbps memory-to-memory over a 2x100Gbps link aggregated channel and 70Gbps file-to-file with encryption over a 5000 mile 100Gbps link.

  13. A Survey of Complex Object Technologies for Digital Libraries

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L.; Argue, Brad; Efron, Miles; Denn, Sheila; Pattuelli, Maria Cristina

    2001-01-01

    Many early web-based digital libraries (DLs) had implicit assumptions reflected in their architecture that the unit of focus in the DL (frequently "reports" or "e-prints") would only be manifested in a single, or at most a few, common file formats such as PDF or PostScript. DLs have now matured to the point where their contents are commonly no longer simple files. Complex objects in DLs have emerged in response to various requirements, including: simple aggregation of formats and supporting files, bundling additional information to aid digital preservation, creating opaque digital objects for e-commerce applications, and the incorporation of dynamic services with the traditional data files. We examine a representative (but not necessarily exhaustive) number of current and recent historical web-based complex object technologies and projects that are applicable to DLs: Aurora, Buckets, ComMentor, Cryptolopes, Digibox, Document Management Alliance, FEDORA, Kahn-Wilensky Framework Digital Objects, Metadata Encoding & Transmission Standard, Multivalent Documents, Open eBooks, VERS Encapsulated Objects, and the Warwick Framework.

  14. Early Warning Signals of Financial Crises with Multi-Scale Quantile Regressions of Log-Periodic Power Law Singularities.

    PubMed

    Zhang, Qun; Zhang, Qunzhi; Sornette, Didier

    2016-01-01

    We augment the existing literature using the Log-Periodic Power Law Singular (LPPLS) structures in the log-price dynamics to diagnose financial bubbles by providing three main innovations. First, we introduce the quantile regression to the LPPLS detection problem. This allows us to disentangle (at least partially) the genuine LPPLS signal and the a priori unknown complicated residuals. Second, we propose to combine the many quantile regressions with a multi-scale analysis, which aggregates and consolidates the obtained ensembles of scenarios. Third, we define and implement the so-called DS LPPLS Confidence™ and Trust™ indicators that considerably enrich the diagnosis of bubbles. Using a detailed study of the "S&P 500 1987" bubble and presenting analyses of 16 historical bubbles, we show that the quantile regression of LPPLS signals contributes useful early warning signals. The comparison between the constructed signals and the price development in these 16 historical bubbles demonstrates their significant predictive ability around the real critical time when the burst/rally occurs.
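    For context, the canonical LPPLS specification that such regressions are fitted to is usually written as below; this is the standard form from the LPPLS literature, not necessarily the exact parameterisation used in the paper:

        \[ \ln p(t) \simeq A + B\,(t_c - t)^{m} + C\,(t_c - t)^{m}\cos\!\big(\omega \ln(t_c - t) - \phi\big) , \]

    where t_c is the critical time of the bubble, 0 < m < 1 controls the power-law acceleration, and ω and φ set the frequency and phase of the log-periodic oscillations.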

  15. Toward Data-Driven Radiology Education-Early Experience Building Multi-Institutional Academic Trainee Interpretation Log Database (MATILDA).

    PubMed

    Chen, Po-Hao; Loehfelm, Thomas W; Kamer, Aaron P; Lemmon, Andrew B; Cook, Tessa S; Kohli, Marc D

    2016-12-01

    The residency review committee of the Accreditation Council of Graduate Medical Education (ACGME) collects data on resident exam volume and sets minimum requirements. However, this data is not made readily available, and the ACGME does not share their tools or methodology. It is therefore difficult to assess the integrity of the data and determine if it truly reflects relevant aspects of the resident experience. This manuscript describes our experience creating a multi-institutional case log, incorporating data from three American diagnostic radiology residency programs. Each of the three sites independently established automated query pipelines from the various radiology information systems in their respective hospital groups, thereby creating a resident-specific database. Then, the three institutional resident case log databases were aggregated into a single centralized database schema. Three hundred thirty residents and 2,905,923 radiologic examinations over a 4-year span were catalogued using 11 ACGME categories. Our experience highlights big data challenges including internal data heterogeneity and external data discrepancies faced by informatics researchers.
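    A centralized schema of the kind described above might, in much-simplified form, look like the sketch below. The table and column names are hypothetical; the abstract does not publish the actual MATILDA schema.

        import sqlite3

        # Minimal sketch of a shared trainee-interpretation log (hypothetical schema).
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE exam_log (
            institution    TEXT NOT NULL,   -- which of the three residency programs
            resident_id    TEXT NOT NULL,   -- de-identified trainee key
            exam_datetime  TEXT NOT NULL,   -- ISO-8601 timestamp from the local RIS
            acgme_category TEXT NOT NULL,   -- one of the 11 ACGME exam categories
            modality       TEXT,
            accession_id   TEXT             -- local accession, unique per institution
        );
        """)

        # Aggregate example: exam counts per resident per ACGME category.
        rows = conn.execute("""
            SELECT resident_id, acgme_category, COUNT(*) AS n_exams
            FROM exam_log
            GROUP BY resident_id, acgme_category
        """).fetchall()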

  16. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis

    PubMed Central

    Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
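    As a reminder of the mathematical object the formalism is named after, a Bernoulli differential equation has the generic form

        \[ \frac{dy}{dt} + P(t)\,y = Q(t)\,y^{n}, \qquad n \neq 0, 1 , \]

    which the substitution u = y^{1-n} reduces to the linear first-order equation \dot{u} + (1-n)P(t)\,u = (1-n)Q(t). In the Bernoulli Cell Formalism each transistor-capacitor cell contributes a state variable governed by an equation of this type; the circuit-specific coefficients are derived in the paper and are not reproduced here.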

  17. Neuromorphic log-domain silicon synapse circuits obey Bernoulli dynamics: a unifying tutorial analysis.

    PubMed

    Papadimitriou, Konstantinos I; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology.

  18. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

    This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.
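    To make the trade-off concrete, the sketch below contrasts option (1), decoding individual PNG files, with option (5), reading records from an LMDB store. This is generic Python, not the Caffe data-layer code used in the study, and the assumption that LMDB values hold encoded image bytes keyed by image ID is ours.

        import io

        import lmdb                  # pip install lmdb
        from PIL import Image        # pip install pillow

        def iter_png_files(paths):
            """Option (1): one open/decode per image file on the local file system."""
            for p in paths:
                with Image.open(p) as img:
                    yield img.convert("RGB")

        def iter_lmdb_records(db_path):
            """Option (5): sequential reads from a single memory-mapped LMDB store."""
            env = lmdb.open(db_path, readonly=True, lock=False)
            with env.begin() as txn:
                for _key, value in txn.cursor():   # values assumed to be encoded images
                    yield Image.open(io.BytesIO(value)).convert("RGB")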

  19. Toward a Real-Time (Day) Dreamcatcher: Sensor-Free Detection of Mind Wandering during Online Reading

    ERIC Educational Resources Information Center

    Mills, Caitlin; D'Mello, Sidney

    2015-01-01

    This paper reports the results from a sensor-free detector of mind wandering during an online reading task. Features consisted of reading behaviors (e.g., reading time) and textual features (e.g., level of difficulty) extracted from self-paced reading log files. Supervised machine learning was applied to two datasets in order to predict if…

  20. Real-Time Population Health Detector

    DTIC Science & Technology

    2004-11-01

    military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA...via the sequence of one-step-ahead forecast errors from the Kalman recursions: $e_t = y_t - H_t \mu_{t|t-1}$. The log-likelihood then follows by treating the... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files GDAIS received historic CHCS data from all

  1. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    ERIC Educational Resources Information Center

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  2. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  3. Sediment data collected in 2013 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Bernier, Julie C.; Flocks, James G.; Miselis, Jennifer L.; DeWitt, Nancy T.

    2014-01-01

    This data series serves as an archive of sediment data collected in July 2013 from the Chandeleur Islands sand berm and adjacent barrier-island environments. Data products include descriptive core logs, core photographs and x-radiographs, results of sediment grain-size analyses, sample location maps, and Geographic Information System data files with accompanying formal Federal Geographic Data Committee metadata.

  4. Well construction information, lithologic logs, water level data, and overview of research in Handcart Gulch, Colorado: an alpine watershed affected by metalliferous hydrothermal alteration

    USGS Publications Warehouse

    Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin

    2006-01-01

    Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.

  5. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: a library design module, a model construction module, a simulation module, and an experimentation and analysis module. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode-dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics, and includes the ability to compare log files.

  6. 17 CFR 240.13d-1 - Filing of Schedules 13D and 13G.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... regulatory scheme applicable to the equivalent U.S. institution; and (K) A group, provided that all the... influencing the control of the issuer, nor in connection with or as a participant in any transaction having... or control person, provided the aggregate amount held directly by the parent or control person, and...

  7. 17 CFR 240.13d-1 - Filing of Schedules 13D and 13G.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... regulatory scheme applicable to the equivalent U.S. institution; and (K) A group, provided that all the... control of the issuer, nor in connection with or as a participant in any transaction having such purpose... or control person, provided the aggregate amount held directly by the parent or control person, and...

  8. 77 FR 73089 - Cambria Investment Management, L.P. and Cambria ETF Trust; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-07

    ... Investing Fund's Advisory Group will not control (individually or in the aggregate) a Fund within the... group of investment companies as the series to acquire Shares. DATES: Filing Dates: The application was... Fund will (a) be advised by Cambria or an entity controlling, controlled by, or under common control...

  9. 77 FR 67431 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ..., and adds liquidity to the Exchange as a Supplemental Liquidity Provider (``SLP'') for all assigned SLP securities in the aggregate (including shares of both an SLP proprietary trading unit (``SLP- Prop'') and an SLP market maker (``SLMM'') of the same member organization) of more than 0.25% of NYSE CADV. The...

  10. Improvement of DHRA-DMDC Physical Access Software DBIDS Using Cloud Computing Technology: A Case Study

    DTIC Science & Technology

    2012-06-01

    technology originally developed on the Java platform. The Hibernate framework supports rapid development of a data access layer without requiring a...Database Design...protect from security threats; o Easy aggregate management operations via file tags; 2. Hibernate We recommend using Hibernate technology for object

  11. Pancreatic Ribonucleases Superfamily Dynamics

    DOE Data Explorer

    Agarwal, Pratul

    2016-01-01

    This data set consists of flexibility/dynamics profiles, derived from molecular dynamics simulations, for family members of pancreatic ribonucleases. The results are based on two independent 0.5 microsecond trajectories for each of the 23 members. The flexibility is computed as an aggregation of the first ten quasi-harmonic modes and reported in the temperature factor column of the PDB (Protein Data Bank) file format.

  12. 78 FR 25334 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-30

    ... aggregate size (``BBO data,'' sometimes referred to as ``top-of-book data''). Data with respect to executed... for complex strategies (e.g., spreads, straddles, buy-writes, etc.) (``Spread Data''), (iv) BBO data... approved redistributor (i.e., a market data vendor or an extranet service provider) and then distributes it...

  13. 75 FR 67417 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ... trading platform at a given time, not both. What trading platform an individual series trades on is... remove) series from the Hybrid Trading Platform we plan to revert back to the general approach of... quotes which represent the aggregate Market-Maker quoting interest in the series for the trading crowd...

  14. 76 FR 62872 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-11

    ... the calculation of the SQT Fees to incentivize trading in equity options, excluding currencies and... the Exchange's options floor and thereby provide liquidity for floor-brokered orders traded in-crowd... Fee based on the aggregate amount of equity options and index options traded by the SQTs in that...

  15. Fat Cat and Friends: Which University Pays Its General Staff the Best?

    ERIC Educational Resources Information Center

    Dobson, Ian R.

    2008-01-01

    Universities are required by legislation to provide annual staff unit record files to the Department of Education, Science and Training (DEST). Most universities seem to undertake this task with appropriate diligence. This paper examines the latest staff aggregated data set released by DEST (2005 data), in order to compare average salaries paid by…

  16. 78 FR 14617 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-06

    ..., with aggregate size and last sale information, subscribers to ToM will also receive: Opening imbalance... information. \\3\\ Where there is an imbalance at the price at which the maximum number of contracts can trade... Timer or Imbalance Timer expires if material conditions of the market (imbalance size, ABBO price or...

  17. 26 CFR 1.368-3 - Records to be kept and information to be filed with returns.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... such parties; (2) The date of the reorganization; (3) The aggregate fair market value and basis, determined immediately before the exchange, of the assets, stock or securities of the target corporation... all of the parties to the reorganization; (2) The date of the reorganization; and (3) The fair market...

  18. 78 FR 57191 - Self-Regulatory Organizations; BOX Options Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-17

    ... transactions in the BOX Price Improvement Period (``PIP'') on the BOX Market LLC (``BOX'') options facility... to a PIP auction, and the retention rates of Initiating Participants and those market makers who... believes that in the aggregate, the long term data trends demonstrate there has not been a decline in...

  19. 78 FR 25118 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-29

    ... not marked dark is already visible to DMMs. Similarly, aggregated information for interest not marked dark is visible to any market participant beyond the Floor via OpenBook.\\20\\ \\19\\ Exchange systems make... ``dark'' orders that are not visible to the DMM, which would prevent any communication about such...

  20. 77 FR 32640 - Designation of a Class of Employees for Addition to the Special Exposure Cohort

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... Special Exposure Cohort AGENCY: National Institute for Occupational Safety and Health (NIOSH), Centers for... number of work days aggregating at least 250 work days, occurring either solely under this employment or... Occupational Safety and Health. [FR Doc. 2012-13381 Filed 5-31-12; 8:45 am] BILLING CODE 4163-19-P ...

  1. 77 FR 9250 - Designation of a Class of Employees for Addition to the Special Exposure Cohort

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-16

    ... Special Exposure Cohort AGENCY: National Institute for Occupational Safety and Health (NIOSH), Centers for... January 1, 1953, through September 30, 1972, for a number of work days aggregating at least 250 work days... . John Howard, Director, National Institute for Occupational Safety and Health. [FR Doc. 2012-3645 Filed...

  2. 77 FR 58381 - Designation of a Class of Employees for Addition to the Special Exposure Cohort

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... Special Exposure Cohort AGENCY: National Institute for Occupational Safety and Health (NIOSH), Centers for... December 31, 1967, for a number of work days aggregating at least 250 work days, occurring either solely..., National Institute for Occupational Safety and Health. [FR Doc. 2012-23207 Filed 9-19-12; 8:45 am] BILLING...

  3. The matching law in and within groups of rats

    PubMed Central

    Graft, D. A.; Lea, S. E. G.; Whitworth, T. L.

    1977-01-01

    In each of the two experiments, a group of five rats lived in a complex maze containing four small single-lever operant chambers. In two of these chambers, food was available on variable-interval schedules of reinforcement. In Experiment I, nine combinations of variable intervals were used, and the aggregate lever-pressing rates (by the five rats together) were studied. The log ratio of the rates in the two chambers was linearly related to the log ratio of the reinforcement rates in them; this is an instance of Herrnstein's matching law, as generalized by Baum. Summing over the two food chambers, food consumption decreased, and response output increased, as the time required to earn each pellet increased. In Experiment II, the behavior of individual rats was observed by time-sampling on selected days, while different variable-interval schedules were arranged in the two chambers where food was available. Individual lever-pressing rates for the rats were obtained, and their median bore the same “matching” relationship to the reinforcement rates as the group aggregate in Experiment I. There were differences between the rats in their distribution of time and responses between the two food chambers; these differences were correlated with differences in the proportions of reinforcements the rats obtained from each chamber. PMID:16811975
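    The "matching" relationship referred to above is conventionally written in its generalized (Baum) form as

        \[ \log\frac{B_1}{B_2} = a\,\log\frac{R_1}{R_2} + \log b , \]

    where B1 and B2 are the response (lever-pressing) rates in the two chambers, R1 and R2 the obtained reinforcement rates, a the sensitivity exponent and b a bias term; strict matching corresponds to a = 1 and b = 1.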

  4. Seabird acoustic communication at sea: a new perspective using bio-logging devices.

    PubMed

    Thiebault, Andréa; Pistorius, Pierre; Mullers, Ralf; Tremblay, Yann

    2016-08-05

    Most seabirds are very noisy at their breeding colonies, when aggregated in high densities. Calls are used for individual recognition and also emitted during agonistic interactions. When at sea, many seabirds aggregate over patchily distributed resources and may benefit from foraging in groups. Because these aggregations are so common, it raises the question of whether seabirds use acoustic communication when foraging at sea. We deployed video cameras with built-in microphones on 36 Cape gannets (Morus capensis) during the breeding season of 2010-2011 at Bird Island (Algoa Bay, South Africa) to study their foraging behaviour and vocal activity at sea. Group formation was derived from the camera footage. During ~42 h, calls were recorded on 72 occasions from 16 birds. Vocalization exclusively took place in the presence of conspecifics, and mostly in feeding aggregations (81% of the vocalizations). From the observation of the behaviours of birds associated with the emission of calls, we suggest that the calls were emitted to avoid collisions between birds. Our observations show that at least some seabirds use acoustic communication when foraging at sea. These findings open up new perspectives for research on seabird foraging ecology and their interactions at sea.

  5. Seabird acoustic communication at sea: a new perspective using bio-logging devices

    PubMed Central

    Thiebault, Andréa; Pistorius, Pierre; Mullers, Ralf; Tremblay, Yann

    2016-01-01

    Most seabirds are very noisy at their breeding colonies, when aggregated in high densities. Calls are used for individual recognition and also emitted during agonistic interactions. When at sea, many seabirds aggregate over patchily distributed resources and may benefit from foraging in groups. Because these aggregations are so common, it raises the question of whether seabirds use acoustic communication when foraging at sea. We deployed video cameras with built-in microphones on 36 Cape gannets (Morus capensis) during the breeding season of 2010–2011 at Bird Island (Algoa Bay, South Africa) to study their foraging behaviour and vocal activity at sea. Group formation was derived from the camera footage. During ~42 h, calls were recorded on 72 occasions from 16 birds. Vocalization exclusively took place in the presence of conspecifics, and mostly in feeding aggregations (81% of the vocalizations). From the observation of the behaviours of birds associated with the emission of calls, we suggest that the calls were emitted to avoid collisions between birds. Our observations show that at least some seabirds use acoustic communication when foraging at sea. These findings open up new perspectives for research on seabird foraging ecology and their interactions at sea. PMID:27492779

  6. Effect of different aspirin doses on arterial thrombosis after canine carotid endarterectomy: a scanning electron microscope and indium-111-labeled platelet study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ercius, M.S.; Chandler, W.F.; Ford, J.W.

    Although it is widely accepted that aspirin inhibits platelet aggregation in arterial thrombosis, the appropriate dosage of aspirin remains quite controversial. The purpose of this study was to determine the effect of different doses of aspirin (0.5 mg/kg vs. 10 mg/kg) on mural thrombus formation after carotid endarterectomy. Eighteen hours after oral aspirin administration, 20 endarterectomies were performed on mongrel dogs with the use of the operating microscope. Blood flow was then restored for 3 hours and the vessels were prepared for investigation with the scanning electron microscope. Ten endarterectomies were also performed on unmedicated dogs as controls. Five minutes before vessel unclamping, autologous indium-111-labeled platelets were administered intravenously, and the endarterectomized portions of the vessels were studied with a gamma counter system after harvesting. Group 1, the control group, revealed extensive mural thrombus consisting of platelet aggregates, fibrin, red blood cells, and white blood cells. Six of the 10 vessels in Group 2, premedicated with 0.5 mg of aspirin per kg, demonstrated varying amounts of mural thrombus. Group 3 (10 vessels), premedicated with 10 mg of aspirin per kg, revealed a platelet monolayer completely covering the exposed vessel wall media, with scattered white blood cells and infrequent fine fibrin strands overlying the platelet surface. The mean (+/- SD) radioactivity per group expressed as counts/minute/mm2 was: Group 1--2055.3 +/- 1905.5, log 7.253 +/- 0.926; Group 2--1235.6 +/- 1234.3, log 6.785 +/- 0.817; Group 3--526 +/- 433.06, log 5.989 +/- 0.774.

  7. First Report of Mating Disruption With an Aggregation Pheromone: A Case Study With Tetropium fuscum (Coleoptera: Cerambycidae).

    PubMed

    Sweeney, Jon; Silk, Peter J; Rhainds, Marc; MacKay, Wayne; Hughes, Cory; Van Rooyen, Kate; MacKinnon, Wayne; Leclair, Gaetan; Holmes, Steve; Kettela, Edward G

    2017-06-01

    Tetropium fuscum (F.), native to Europe and established in Nova Scotia, Canada, since at least 1990, is considered a low-to-moderate threat to spruce (Picea spp.) forests in North America and regulated as a quarantine pest by the Canadian Food Inspection Agency. We tested broadcast applications of the aggregation pheromone racemic (5E)-6,10-dimethyl-5,9-undecadien-2-ol (fuscumol), formulated at 10% concentration in Hercon Bio-Flakes (Hercon International, Emigsville, PA), for efficacy in disrupting T. fuscum mating and suppressing populations. Two applications of 2.5-2.75 kg Bio-Flakes (250-275 g a.i.) per ha per season significantly reduced trap catches and mating success (2009, 2010, 2012): about 30% of females trapped in treated plots had mated compared with 60% of females trapped in untreated plots. Similar reductions in mating success were observed in 2011 with one or two 4.5 kg/ha applications of Bio-Flakes. Mean densities of T. fuscum colonizing sentinel bait logs or girdled trees were 36% lower in pheromone-treated plots than in untreated plots, but the difference was not statistically significant. Lack of population suppression may have been because mated females immigrated into treated plots or because populations were so high that despite a 50% reduction in mating success, absolute numbers of mated females were sufficient to infest our bait logs or trees. This is the first demonstration of insect mating disruption via broadcast application of an aggregation pheromone. Pheromone-mediated mating disruption has potential to slow the spread of invasive cerambycids by targeting low-density outlier populations near or beyond the leading edge of an infestation. © Crown copyright 2017.

  8. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
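    Neither RuleR nor LogScope syntax is reproduced here, but the flavour of rule-based, data-parameterized log checking they support can be conveyed by a small generic sketch in plain Python; the event format and the rule itself are invented for illustration only.

        import re

        # Toy rule: every COMMAND event must eventually be matched by a SUCCESS
        # event carrying the same id (data-parameterized, in the spirit of
        # runtime-verification rules; not RuleR or LogScope syntax).
        CMD = re.compile(r"COMMAND id=(\w+)")
        OK = re.compile(r"SUCCESS id=(\w+)")

        def check_log(lines):
            pending = set()                  # command ids still awaiting a SUCCESS
            for line in lines:
                if m := CMD.search(line):
                    pending.add(m.group(1))
                elif m := OK.search(line):
                    pending.discard(m.group(1))
            return pending                   # non-empty set = violated rule instances

        print(check_log(["COMMAND id=42", "SUCCESS id=42", "COMMAND id=43"]))  # {'43'}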

  9. Perceived Task-Difficulty Recognition from Log-File Information for the Use in Adaptive Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars

    2016-01-01

    Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…

  10. Scalable Trust of Next-Generation Management (STRONGMAN)

    DTIC Science & Technology

    2004-10-01

    remote logins might be policy-controlled to allow only strongly encrypted IPSec tunnels to log in remotely, to access selected files, etc. The...and Angelos D. Keromytis. Drop-in Security for Distributed and Portable Computing Elements. Emerald Journal of Internet Research. Electronic...Security and Privacy, pp. 17-31, May 1999. [2] S. M. Bellovin. Distributed Firewalls. ;login: magazine, special issue on security, November 1999. [3] M

  11. 77 FR 66608 - New England Hydropower Company, LLC; Notice of Preliminary Permit Application Accepted for Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Spillway Dike with an 8-foot-long stop-log slot; (2) an existing 31-foot-long, 42-inch-diameter low level penstock; (3) an existing 0.13 acre impoundment with a normal maximum water surface elevation of 66.3 feet... transmission line connected to the NSTAR regional grid. The project would have an estimated average annual...

  12. 76 FR 7838 - Claverack Creek, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...

  13. Evaluation of an interactive case simulation system in dermatology and venereology for medical students

    PubMed Central

    Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona

    2006-01-01

    Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972

  14. An Elegant Sufficiency: Load-Aware Differentiated Scheduling of Data Transfers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kettimuthu, Rajkumar; Vardoyan, Gayane; Agrawal, Gagan

    2015-11-15

    We investigate the file transfer scheduling problem, where transfers among different endpoints must be scheduled to maximize pertinent metrics. We propose two new algorithms that exploit the fact that the aggregate bandwidth obtained over a network or at a storage system tends to increase with the number of concurrent transfers—but only up to a certain limit. The first algorithm, SEAL, uses runtime information and data-driven models to approximate system load and adapt transfer schedules and concurrency so as to maximize performance while avoiding saturation. We implement this algorithm using GridFTP as the transfer protocol and evaluate it using real transfer logs in a production WAN environment. Results show that SEAL can improve average slowdowns and turnaround times by up to 25% and worst-case slowdown and turnaround times by up to 50%, compared with the best-performing baseline scheme. Our second algorithm, STEAL, further leverages user-supplied categorization of transfers as either “interactive” (requiring immediate processing) or “batch” (less time-critical). Results show that STEAL reduces the average slowdown of interactive transfers by 63% compared to the best-performing baseline and by 21% compared to SEAL. For batch transfers, compared to the best-performing baseline, STEAL improves by 18% the utilization of the bandwidth unused by interactive transfers. By elegantly ensuring a sufficient, but not excessive, allocation of concurrency to the right transfers, we significantly improve overall performance despite constraints.
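    The core idea behind both schedulers, increasing concurrency only while it still buys aggregate bandwidth, can be sketched as a simple feedback loop. This is illustrative logic only, not the SEAL or STEAL algorithms or the GridFTP control code from the paper; measure_throughput is a placeholder for real measurements.

        def tune_concurrency(measure_throughput, start=2, step=2, max_conc=64, tol=0.05):
            """Raise the number of concurrent transfers while aggregate throughput
            still improves by more than `tol` (fractional); stop before saturation."""
            best_n, best_bw = start, measure_throughput(start)
            n = start + step
            while n <= max_conc:
                bw = measure_throughput(n)
                if bw < best_bw * (1.0 + tol):   # no meaningful gain: stop here
                    break
                best_n, best_bw = n, bw
                n += step
            return best_n, best_bw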

  15. HS.Register - An Audit-Trail Tool to Respond to the General Data Protection Regulation (GDPR).

    PubMed

    Gonçalves-Ferreira, Duarte; Leite, Mariana; Santos-Pereira, Cátia; Correia, Manuel E; Antunes, Luis; Cruz-Correia, Ricardo

    2018-01-01

    Introduction The new General Data Protection Regulation (GDPR) compels health care institutions and their software providers to properly document all personal data processing and provide clear evidence that their systems are in line with the GDPR. All applications involved in personal data processing should therefore produce meaningful event logs that can later be used for the effective auditing of complex processes. Aim This paper aims to describe and evaluate HS.Register, a system created to collect and securely manage, at scale, audit logs and data produced by a large number of systems. Methods HS.Register creates a single audit log by collecting and aggregating all kinds of meaningful event logs and data (e.g. ActiveDirectory, syslog, log4j, web server logs, REST, SOAP and HL7 messages). It also includes specially built dashboards for easy auditing and monitoring of complex processes, crossing different systems in an integrated way, as well as providing tools that help with auditing and with diagnosing difficult problems, using a simple web application. HS.Register is currently installed at five large Portuguese Hospitals and is composed of the following open-source components: HAproxy, RabbitMQ, Elasticsearch, Logstash and Kibana. Results HS.Register currently collects and analyses an average of 93 million events per week and it is being used to document and audit HL7 communications. Discussion Auditing tools like HS.Register are likely to become mandatory in the near future to allow for traceability and detailed auditing for GDPR compliance.
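    As an illustration of the kind of normalized event HS.Register aggregates, a producing system could ship one audit record to an Elasticsearch-style REST endpoint as below. The field names, index name and endpoint are assumptions for the sketch; the real pipeline ingests via HAproxy, RabbitMQ and Logstash rather than by direct indexing.

        import datetime
        import json
        import urllib.request

        def ship_audit_event(source_system, action, user,
                             endpoint="http://localhost:9200/audit-log/_doc"):
            """POST one normalized audit event as JSON (hypothetical index/fields)."""
            event = {
                "@timestamp": datetime.datetime.utcnow().isoformat() + "Z",
                "source": source_system,   # e.g. "HL7-gateway", "ActiveDirectory"
                "action": action,          # e.g. "record-accessed"
                "user": user,
            }
            req = urllib.request.Request(
                endpoint,
                data=json.dumps(event).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return resp.status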

  16. No3CoGP: non-conserved and conserved coexpressed gene pairs.

    PubMed

    Mal, Chittabrata; Aftabuddin, Md; Kundu, Sudip

    2014-12-08

    Analyzing the microarray data of different conditions, one can identify the conserved and condition-specific genes and gene modules, and thus can infer the underlying cellular activities. All the available tools based on Bioconductor and R packages differ in how they extract differential coexpression and at what level they operate. There is a need for a user-friendly, flexible tool which can start analysis using raw or preprocessed microarray data and can report different levels of useful information. We present GUI software, No3CoGP: Non-Conserved and Conserved Coexpressed Gene Pairs, which takes Affymetrix microarray data (.CEL files or log2-normalized .txt files) along with an annotation file (.csv), a Chip Definition File (CDF) and a probe file as inputs, and uses the concepts of a network density cut-off and Fisher's z-test to extract biologically relevant information. It can identify four possible types of gene pairs based on their coexpression relationships. These are (i) a gene pair showing coexpression in one condition but not in the other, (ii) a gene pair that is positively coexpressed in one condition but negatively coexpressed in the other, and pairs that are (iii) positively or (iv) negatively coexpressed in both conditions. Further, it can generate modules of coexpressed genes. The easy-to-use GUI enables researchers without knowledge of the R language to use No3CoGP. Utilization of one or more CPU cores, depending on availability, speeds up the program. The output files stored in the respective directories under the user-defined project allow researchers to unravel condition-specific functionalities of genes, gene sets or modules.
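    The Fisher z-test mentioned above compares a gene pair's correlation across two conditions. A minimal version of that standard test (not No3CoGP's own implementation) is:

        import math

        from scipy.stats import norm   # pip install scipy

        def fisher_z_diff(r1, n1, r2, n2):
            """Two-sided test of whether correlations r1 (from n1 samples) and
            r2 (from n2 samples) differ, via Fisher's z-transformation."""
            z1, z2 = math.atanh(r1), math.atanh(r2)
            se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
            z = (z1 - z2) / se
            return z, 2.0 * norm.sf(abs(z))   # test statistic and p-value

        # Example: a pair strongly coexpressed in one condition only.
        print(fisher_z_diff(0.85, 40, 0.10, 40))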

  17. 77 FR 63406 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-16

    ... Liquidity Provider (``SLP'') for all assigned SLP securities in the aggregate (including shares of both a SLP proprietary trading unit (``SLP-Prop'') and a SLP market maker (``SLMM'') of the same member... per share price of $1.00 or more, if the SLP (i) meets the 10% average or more quoting requirement in...

  18. 75 FR 6750 - Self-Regulatory Organizations; The Options Clearing Corporation; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... Change Relating to Options for Which the Premium and Exercise Price Are Expressed as a Multiple of the... Rules to accommodate options for which the premium and exercise price are expressed on other than a per... (definition of ``Aggregate Exercise Price'') and OCC Rule 805(d)(2) to accommodate options for which the...

  19. 76 FR 35495 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... Exchange will aggregate routing and market making activity in the case of an ATP firm that has both a routing and a market making arm affiliated with its operation. For purposes of determining whether the routing and market making arm are ``affiliated'' with the ATP firm, the Exchange will apply a 70% common...

  20. 11 CFR 104.18 - Electronic filing of reports (2 U.S.C. 432(d) and 434(a)(11)).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... $50,000 in any calendar year; or (ii) The political committee or other person has made expenditures or has reason to expect to make expenditures aggregating in excess of $50,000 in any calendar year. (2... electronically all subsequent reports covering financial activity for the remainder of the calendar year. All...

  1. 76 FR 18591 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing of Proposed Rule Change Relating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ...'s Public Reference Room, on the Commission's Web site at http://www.sec.gov , and on http://www.nyse... periods or (iii) such Founding Firm's entry into certain economic arrangements as further detailed below... Class A Common Interests, the product of (w) the Aggregate Class A Economic Allocation multiplied by (x...

  2. Hey Big Spender! An Analysis of Australian Universities and How Much They Pay Their General Staff

    ERIC Educational Resources Information Center

    Dobson, Ian R.

    2009-01-01

    Analysis of aggregated data files on staff sent by all Australian universities to DEST in 2007 and of salary schedules posted on university websites reveals a considerable variation between salaries paid to general staff at each salary level and the relative seniority of those staff. This paper outlines the differences in staffing structures and…

  3. 78 FR 25102 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing of Proposed Rule Change To Amend...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-29

    ... selling interest that is not marked dark is already visible to DMMs. Similarly, aggregated information for interest not marked dark is visible to any market participant beyond the Floor via OpenBook.\\21\\ \\20... or completely ``dark'' orders that are not visible to the DMM, which would prevent any communication...

  4. 77 FR 58382 - Designation of a Class of Employees for Addition to the Special Exposure Cohort

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... Special Exposure Cohort AGENCY: National Institute for Occupational Safety and Health (NIOSH), Centers for... December 31, 1961, for a number of work days aggregating at least 250 work days, occurring either solely... Institute for Occupational Safety and Health. [FR Doc. 2012-23272 Filed 9-19-12; 8:45 am] BILLING CODE 4163...

  5. 77 FR 38835 - Final Effect of Designation of a Class of Employees for Addition to the Special Exposure Cohort

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ... Addition to the Special Exposure Cohort AGENCY: National Institute for Occupational Safety and Health... aggregating at least 250 work days, occurring either solely under this employment, or in combination with work... Occupational Safety and Health. [FR Doc. 2012-15968 Filed 6-28-12; 8:45 am] BILLING CODE 4163-19-P ...

  6. Survey Available Computer Software for Automated Production Planning and Inventory Control, and Software and Hardware for Data Logging and Monitoring Shop Floor Activities

    DTIC Science & Technology

    1993-08-01

    pricing and sales, order processing, and purchasing. The class of manufacturing planning functions include aggregate production planning, materials...level. Depending on the application, each control level will have a number of functions associated with it. For instance, order processing, purchasing...include accounting, sales forecasting, product costing, pricing and sales, order processing, and purchasing. The class of manufacturing planning functions

  7. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
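
    The archive layout described above is straightforward to illustrate. Below is a minimal sketch, assuming only Python's standard zipfile and xml.etree modules, of packing a single model file plus a manifest into a ZIP container; the manifest element and namespace strings follow the OMEX description summarized here but should be verified against the published specification, and the file names and format identifier are hypothetical.

        # Minimal sketch of a COMBINE-style archive writer (standard library only).
        # The manifest schema details below are assumptions to be checked against
        # the OMEX specification; file names are illustrative.
        import zipfile
        from xml.etree import ElementTree as ET

        def write_omex(archive_path, model_path, model_format):
            ns = "http://identifiers.org/combine.specifications/omex-manifest"
            manifest = ET.Element("omexManifest", xmlns=ns)
            # The manifest lists every file in the archive, including itself.
            ET.SubElement(manifest, "content", location="./manifest.xml", format=ns)
            ET.SubElement(manifest, "content", location="./" + model_path,
                          format=model_format)
            with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
                zf.writestr("manifest.xml", ET.tostring(manifest, encoding="unicode"))
                zf.write(model_path, arcname=model_path)

        # Example (hypothetical names):
        # write_omex("experiment.omex", "model.xml",
        #            "http://identifiers.org/combine.specifications/sbml")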

  8. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g., interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.
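
    The point about searchable master logs can be made concrete. The following is a minimal sketch, not the POIC tooling, that merges per-shift console log files for one position into a single time-ordered master log that can then be searched with any text tool; the file-naming pattern, the position name, and the assumption that each entry begins with a sortable timestamp are illustrative only.

        # Minimal sketch (not the POIC tooling): merge per-shift console logs for
        # one position into a single, time-ordered, searchable master log.
        # The glob pattern and timestamp-first entry format are assumptions.
        import glob

        def build_master_log(position,
                             pattern="logs/{pos}_*.txt",
                             out="master_{pos}.txt"):
            entries = []
            for path in sorted(glob.glob(pattern.format(pos=position))):
                with open(path, encoding="utf-8") as f:
                    entries.extend(line.rstrip("\n") for line in f if line.strip())
            # Assuming each entry starts "YYYY-DDD HH:MM:SS ...", a plain sort
            # is also a chronological sort.
            entries.sort()
            with open(out.format(pos=position), "w", encoding="utf-8") as f:
                f.write("\n".join(entries) + "\n")

        # Usage (hypothetical position name), then search with any text tool:
        #   build_master_log("DMC")
        #   grep -i "command uplink" master_DMC.txt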

  9. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at https://github.com/pnnl/fqc. Contact: joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
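
    To illustrate the kind of aggregation the abstract describes (this is not the FQC implementation itself), the sketch below collects the PASS/WARN/FAIL flags from several FastQC summary.txt files and writes them to one CSV table, one row per sample; the directory layout and output file name are assumptions.

        # Minimal sketch of aggregating FastQC module flags into one CSV table.
        # Assumes the usual summary.txt layout of "STATUS<TAB>Module<TAB>Filename";
        # the glob pattern and output path are illustrative.
        import csv
        import glob
        import os

        def aggregate_fastqc(summary_glob="fastqc_out/*_fastqc/summary.txt",
                             out_csv="qc_summary.csv"):
            rows, modules = {}, []
            for path in sorted(glob.glob(summary_glob)):
                sample = os.path.basename(os.path.dirname(path)).replace("_fastqc", "")
                with open(path, encoding="utf-8") as f:
                    for line in f:
                        status, module, _ = line.rstrip("\n").split("\t")
                        rows.setdefault(sample, {})[module] = status
                        if module not in modules:
                            modules.append(module)
            with open(out_csv, "w", newline="", encoding="utf-8") as f:
                writer = csv.writer(f)
                writer.writerow(["sample"] + modules)
                for sample, flags in rows.items():
                    writer.writerow([sample] + [flags.get(m, "") for m in modules])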

  10. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
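
    For readers who want to inspect the archived trace data programmatically rather than through SU or the cited USGS viewer, the following is a minimal sketch that reads the two SEG-Y file headers with Python's standard library only; the byte offsets follow the classic SEG-Y layout (a 3200-byte EBCDIC textual header followed by a 400-byte binary header) and should be checked against this archive's metadata, and the file name is hypothetical.

        # Minimal sketch, independent of any USGS or SU tooling: read the textual
        # and binary file headers of a SEG-Y file. Byte offsets follow the classic
        # SEG-Y layout; verify against the archive's FGDC metadata before relying
        # on them.
        import struct

        def read_segy_headers(path):
            with open(path, "rb") as f:
                text_header = f.read(3200).decode("cp500", errors="replace")  # EBCDIC
                binary_header = f.read(400)
            sample_interval_us, = struct.unpack(">H", binary_header[16:18])
            samples_per_trace, = struct.unpack(">H", binary_header[20:22])
            format_code, = struct.unpack(">H", binary_header[24:26])
            return {
                "text_header": text_header,
                "sample_interval_us": sample_interval_us,
                "samples_per_trace": samples_per_trace,
                "format_code": format_code,
            }

        # Example (hypothetical file name):
        # info = read_segy_headers("06FSH01_line01.sgy")
        # print(info["samples_per_trace"], info["sample_interval_us"])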

  11. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  12. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  13. Early Warning Signals of Financial Crises with Multi-Scale Quantile Regressions of Log-Periodic Power Law Singularities

    PubMed Central

    Zhang, Qun; Zhang, Qunzhi; Sornette, Didier

    2016-01-01

    We augment the existing literature using the Log-Periodic Power Law Singular (LPPLS) structures in the log-price dynamics to diagnose financial bubbles by providing three main innovations. First, we introduce the quantile regression to the LPPLS detection problem. This allows us to disentangle (at least partially) the genuine LPPLS signal and the a priori unknown complicated residuals. Second, we propose to combine the many quantile regressions with a multi-scale analysis, which aggregates and consolidates the obtained ensembles of scenarios. Third, we define and implement the so-called DS LPPLS Confidence™ and Trust™ indicators that enrich considerably the diagnostic of bubbles. Using a detailed study of the “S&P 500 1987” bubble and presenting analyses of 16 historical bubbles, we show that the quantile regression of LPPLS signals contributes useful early warning signals. The comparison between the constructed signals and the price development in these 16 historical bubbles demonstrates their significant predictive ability around the real critical time when the burst/rally occurs. PMID:27806093
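
    For reference, the LPPLS structure mentioned in this abstract is usually written as follows; this is the standard parameterization commonly used in this literature, and the cited paper's exact notation may differ.

        % Standard LPPLS parameterization (the cited paper's notation may differ):
        \[
          \ln \mathbb{E}[p(t)] \;=\; A + B\,(t_c - t)^{m}
          + C\,(t_c - t)^{m}\cos\!\bigl(\omega \ln(t_c - t) - \phi\bigr),
        \]
        % where t_c is the critical time, 0 < m < 1, B < 0 for a positive bubble,
        % and \omega, \phi set the frequency and phase of the log-periodic
        % oscillations that the quantile regressions fit at multiple scales.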

  14. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  15. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  16. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  17. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  18. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  19. 105-KE Isolation Barrier Leak Rate Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCracken, K.J.

    1995-06-14

    This Acceptance Test Report (ATR) contains the completed and signed Acceptance Test Procedure (ATP) for the 105-KE Isolation Barrier Leak Rate Test. The Test Engineer's log, the completed sections of the ATP in the Appendix for Repeat Testing (Appendix K), the approved WHC J-7s (Appendix H), the data logger files (Appendices T and U), and the post-test calibration checks (Appendix V) are included.

  20. Gigabit Network Communications Research

    DTIC Science & Technology

    1992-12-31

    additional BPF channels, raw bytesync support for video codecs, and others. All source file modifications were logged with RCS. Source and object trees were..." (RFCs). 20 RFCs were published this quarter: RFC 1366: Gerich, E., "Guidelines for Management of IP Address Space", Merit, October 1992. RFC 1367...Topolcic, C., "Schedule for IP Address Space Management Guidelines", CNRI, October 1992. RFC 1368: McMaster, D. (Synoptics Communications, Inc.), K
