Sample records for log file analysis

  1. Who Goes There? Measuring Library Web Site Usage.

    ERIC Educational Resources Information Center

    Bauer, Kathleen

    2000-01-01

    Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)

  2. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  3. SU-E-T-142: Automatic Linac Log File: Analysis and Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainey, M; Rothe, T

    Purpose: End-to-end QA for IMRT/VMAT is time consuming. Automated linac log file analysis, with recalculation of the daily recorded fluence and hence dose distribution, brings this closer. Methods: MATLAB (R2014b, MathWorks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA 10.0.28, VMS). Results: All leaf positions are within ±0.10 mm: 57% within ±0.01 mm and 89% within ±0.05 mm. The mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned and treated dose distributions, the latter derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated-fluence errors to be identified quickly. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.

  4. SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, C; Mason, B; Kirsner, S

    2015-06-15

    Purpose: Ion chamber and film (ICAF) measurement is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown to be an alternative to measurement based QA. In this study, we delivered VMAT plans with and without errors to determine whether ICAF and/or log file based QA was able to detect the errors. Methods: For two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, and MLC errors. The gantry and collimator errors were off by 4° for one of the two arcs. The MLC error introduced was one in which the opening aperture didn't move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. The passing criteria used to evaluate the plans were an ion chamber difference of less than 5% and 90% of film pixels passing 3%/3 mm gamma analysis (GA). For the log file analysis, the criteria were 90% of voxels passing 3%/3 mm 3D GA and beam parameters matching the plan. Results: The two original plans were delivered and passed both ICAF and log file based QA. Both ICAF and log file QA met the dosimetric criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). In the log file analysis, all 12 erroneous plans flagged a mismatch between the delivery and the plan. The 8 plans that didn't meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using ICAF, and that both methods were able to detect larger delivery errors.

  5. Catching errors with patient-specific pretreatment machine log file analysis.

    PubMed

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file QA analyses were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
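
    The core of a Dynalog-style check is a planned-versus-actual comparison per leaf per snapshot. The sketch below is illustrative only: the column layout, file name, and leaf count are hypothetical assumptions, not the actual Varian Dynalog specification, which should be consulted before parsing real files.

```python
# Hedged sketch of a planned-vs-actual MLC comparison from a log file.
# ASSUMPTION: a CSV-like file where each row is one snapshot and planned/actual
# positions alternate per leaf. Real Dynalog files have a documented header
# and different field ordering; adapt before use.
import csv
import math

def leaf_errors(path, n_leaves=60):
    """Return planned-minus-actual position errors over all snapshots and leaves."""
    errors = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            vals = [float(x) for x in row]
            for leaf in range(n_leaves):
                planned, actual = vals[2 * leaf], vals[2 * leaf + 1]
                errors.append(planned - actual)
    return errors

errs = leaf_errors("A_patient_field.dlg")          # hypothetical file name
rms = math.sqrt(sum(e * e for e in errs) / len(errs))
print(f"RMS leaf error: {rms:.3f} (file units)")
```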

  6. SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osman, A; American University of Beirut Medical Center, Beirut; Maalej, N

    2016-06-15

    Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during intensity modulated radiotherapy (IMRT) treatment, and to construct relative fluence maps and perform gamma analysis to compare the planned and executed MLC movement. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated during sliding window IMRT delivery. The program extracts the planned and executed (actual or delivered) MLC movement, then calculates and compares the relative planned and executed fluences. The fluence maps were used to perform gamma analysis (with 3% dose difference and 3 mm distance to agreement) for 3 IMRT patients. We compared our gamma analysis results with those obtained from the portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For the 3 IMRT patient treatments, the maximum difference between the planned and the executed MLC positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference in the number of pixels passing the gamma criteria (3%/3 mm) was 0.19% with respect to the portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patient-specific IMRT QA based on MLC movement log files gives results similar to EPID dosimetry. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT QA workload. The authors would like to thank King Fahd University of Petroleum and Minerals for the support.
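
    Several records in this list evaluate fluence maps with a 3%/3 mm gamma criterion. A brute-force global 2D gamma sketch is given below; clinical tools use optimized geometric searches and interpolation, so treat this as a reference implementation of the definition only (all names and the 10% low-dose cutoff are illustrative choices).

```python
import numpy as np

def gamma_pass_rate(ref, test, pixel_mm, dd=0.03, dta_mm=3.0, cutoff=0.10):
    """Fraction (%) of reference pixels with gamma <= 1 (global criterion)."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    norm = dd * ref.max()                    # global dose-difference normalization
    r = int(np.ceil(dta_mm / pixel_mm))      # DTA search radius in pixels
    passing = total = 0
    for i in range(ref.shape[0]):
        for j in range(ref.shape[1]):
            if ref[i, j] < cutoff * ref.max():
                continue                     # skip the low-dose region
            total += 1
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < test.shape[0] and 0 <= jj < test.shape[1]:
                        d2 = (di * di + dj * dj) * pixel_mm ** 2 / dta_mm ** 2
                        g2 = (test[ii, jj] - ref[i, j]) ** 2 / norm ** 2
                        best = min(best, d2 + g2)
            passing += best <= 1.0
    return 100.0 * passing / max(total, 1)
```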

  7. Monte Carlo based, patient-specific RapidArc QA using Linac log files.

    PubMed

    Teke, Tony; Bergman, Alanah M; Kwa, William; Gill, Bradford; Duzenli, Cheryl; Popescu, I Antoniu

    2010-01-01

    A Monte Carlo (MC) based QA process to validate the dynamic beam delivery accuracy for Varian RapidArc (Varian Medical Systems, Palo Alto, CA) using Linac delivery log files (DynaLog) is presented. Using DynaLog file analysis and MC simulations, the goals of this article are (a) to confirm that adequate sampling is used in the RapidArc optimization algorithm (177 static gantry angles) and (b) to assess the physical machine performance [gantry angle and monitor unit (MU) delivery accuracy]. Ten clinically acceptable RapidArc treatment plans were generated for various tumor sites and delivered to a water-equivalent cylindrical phantom on the treatment unit. Three Monte Carlo simulations were performed to calculate dose to the CT phantom image set: (a) one using a series of static gantry angles defined by 177 control points with treatment planning system (TPS) MLC control files (planning files), (b) one using continuous gantry rotation with TPS generated MLC control files, and (c) one using continuous gantry rotation with actual Linac delivery log files. Monte Carlo simulated dose distributions were compared to both ionization chamber point measurements and RapidArc TPS calculated doses. The 3D dose distributions were compared using a 3D gamma-factor analysis, employing a 3%/3 mm distance-to-agreement criterion. The dose difference between MC simulations, TPS, and ionization chamber point measurements was less than 2.1%. For all plans, the MC calculated 3D dose distributions agreed well with the TPS calculated doses (gamma-factor values were less than 1 for more than 95% of the points considered). Machine performance QA was supplemented with an extensive DynaLog file analysis, which showed that leaf position errors were less than 1 mm for 94% of the time, with no leaf errors greater than 2.5 mm. The mean standard deviations in MU and gantry angle were 0.052 MU and 0.355 degrees, respectively, for the ten cases analyzed. The accuracy and flexibility of the Monte Carlo based RapidArc QA system were demonstrated. Good machine performance and accurate dose distribution delivery of RapidArc plans were observed. The sampling used in the TPS optimization algorithm was found to be adequate.

  8. Linking log files with dosimetric accuracy--A multi-institutional study on quality assurance of volumetric modulated arc therapy.

    PubMed

    Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar

    2015-12-01

    To systematically evaluate machine-specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files, by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. For dosimetric measurements an ionization chamber array was used. Relative dose deviations were assessed by mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1 SD) was 0.3±0.2 mm for all linacs. Large MLC maximum errors of up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated generally accurate plan reproducibility, with γ(mean)(±1 SD)=0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations, which corresponded well to the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeMarco, J; McCloskey, S; Low, D

    Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001 mm and 0.066±0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
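
    The cross-comparison above needs the "expected" values from the DICOM-RT plan. A minimal sketch using the pydicom library is shown below; the DICOM attribute names are standard, but the file name is a placeholder and the pairing of control points with log snapshots is application-specific.

```python
# Read expected gantry angles and MLC apertures from a DICOM-RT plan.
import pydicom

plan = pydicom.dcmread("RTPLAN.dcm")          # placeholder path
for beam in plan.BeamSequence:
    for cp in beam.ControlPointSequence:
        # GantryAngle and MLC positions are only present in control points
        # where they change; carry the last seen value forward in real code.
        gantry = getattr(cp, "GantryAngle", None)
        leaves = None
        for bld in getattr(cp, "BeamLimitingDevicePositionSequence", []):
            if bld.RTBeamLimitingDeviceType.startswith("MLC"):
                leaves = bld.LeafJawPositions  # bank A then bank B, in mm
        print(beam.BeamName, cp.ControlPointIndex, gantry,
              None if leaves is None else len(leaves))
```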

  10. SU-E-T-184: Clinical VMAT QA Practice Using LINAC Delivery Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, H; Jacobson, T; Gu, X

    2015-06-15

    Purpose: To evaluate the accuracy of volumetric modulated arc therapy (VMAT) treatment delivery dose clouds by comparing linac log data to doses measured using an ionization chamber and film. Methods: A commercial IMRT quality assurance (QA) process utilizing a DICOM-RT framework was tested for clinical practice using 30 prostate and 30 head and neck VMAT plans. Delivered 3D VMAT dose distributions were independently checked using a PinPoint ionization chamber and radiographic film in a solid water phantom. DICOM RT coordinates were used to extract the corresponding point and planar doses from 3D log file dose distributions. Point doses were evaluated by computing the percent error between log file and chamber measured values. A planar dose evaluation was performed for each plan using a 2D gamma analysis with 3% global dose difference and 3 mm isodose point distance criteria. The same analysis was performed to compare treatment planning system (TPS) doses to measured values to establish a baseline assessment of agreement. Results: The mean percent error between log file and ionization chamber dose was 1.0%±2.1% for prostate VMAT plans and −0.2%±1.4% for head and neck plans. The corresponding TPS calculated and measured ionization chamber values agree within 1.7%±1.6%. The average 2D gamma passing rates for the log file comparison to film are 98.8%±1.0% and 96.2%±4.2% for the prostate and head and neck plans, respectively. The corresponding passing rates for the TPS comparison to film are 99.4%±0.5% and 93.9%±5.1%. Overall, the point dose and film data indicate that log file determined doses are in excellent agreement with measured values. Conclusion: Clinical VMAT QA practice using LINAC treatment log files is a fast and reliable method for patient-specific plan evaluation.

  11. Automating linear accelerator quality assurance.

    PubMed

    Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M

    2015-10-01

    The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac, including jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.

  12. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
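
    The abstract's decision rule reduces to two thresholds on quantities read from the pump log files. The sketch below is an illustration of that published rule only, not clinical software; the function name is an assumption, and the abstract does not state the time base of the rate.

```python
def favorable_tpa_outlook(percent_expected_power, power_rise_rate):
    """Illustration of the thresholds reported in the abstract: medical therapy
    succeeded in ~78-81% of events when the rate of power increase was < 1.25
    and the percent of expected power was < 200%."""
    return power_rise_rate < 1.25 and percent_expected_power < 200.0

print(favorable_tpa_outlook(130.9, 0.61))   # True: typical successful case
print(favorable_tpa_outlook(196.1, 2.87))   # False: typical unsuccessful case
```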

  13. Web usage data mining agent

    NASA Astrophysics Data System (ADS)

    Madiraju, Praveen; Zhang, Yanqing

    2002-03-01

    When a user logs in to a website, behind the scenes the user leaves impressions, usage patterns, and access patterns in the web server's log file. A web usage mining agent can analyze these web logs to help web developers improve the organization and presentation of their websites, and can help system administrators improve system performance. Web logs also provide invaluable help in creating adaptive web sites and in analyzing network traffic. This paper presents the design and implementation of a Web usage mining agent for digging into web log files.
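
    Most of the web-log studies in this list start from the NCSA common log format mentioned in the first record. A minimal parsing sketch is below; the regular expression covers the standard fields, while the log file name and the "top pages" report are illustrative.

```python
import re
from collections import Counter

# NCSA Common Log Format, e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) (?P<size>\S+)'
)

hits = Counter()
with open("access.log") as f:                 # placeholder path
    for line in f:
        m = CLF.match(line)
        if m and m.group("status").startswith("2"):
            hits[m.group("path")] += 1        # count successful requests per page

for path, n in hits.most_common(10):
    print(n, path)
```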

  14. INSPIRE and SPIRES Log File Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Cole; /Wheaton Coll. /SLAC

    2012-08-31

    SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE, it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.

  15. WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.

    PubMed

    Poels, K; Depuydt, T; Verellen, D; De Ridder, M

    2012-06-01

    To use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the Vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the field centroid from the target center (fiducial marker detection). Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application, we calculated the gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and the decoupling of gimbals rotation from gantry rotation. Finally, complementary cine EPID and log file tracking errors were calculated to evaluate the clinical accuracy of dynamic tracking. A strong correlation was found between the log file and cine EPID tracking error distributions during concurrent measurements (R=0.98). The gimbals log files were sensitive enough to detect a systematic tracking error down to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and a high degree of decoupling of gimbals rotation from gantry rotation during dynamic arc dynamic tracking. A submillimetric agreement between the clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file and x-ray verification images with complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.

  16. Users' information-seeking behavior on a medical library Website

    PubMed Central

    Rozic-Hristovski, Anamarija; Hristovski, Dimitar; Todorovski, Ljupco

    2002-01-01

    The Central Medical Library (CMK) at the Faculty of Medicine, University of Ljubljana, Slovenia, started to build a library Website that included a guide to library services and resources in 1997. The evaluation of Website usage plays an important role in its maintenance and development. Analyzing and exploring regularities in the visitors' behavior can be used to enhance the quality and facilitate delivery of information services, identify visitors' interests, and improve the server's performance. The analysis of the CMK Website users' navigational behavior was carried out by analyzing the Web server log files. These files contained information on all user accesses to the Website and provided a great opportunity to learn more about the behavior of visitors to the Website. The majority of the available tools for Web log file analysis provide a predefined set of reports showing the access count and the transferred bytes grouped along several dimensions. In addition to the reports mentioned above, the authors wanted to be able to perform interactive exploration and ad hoc analysis and discover trends in a user-friendly way. Because of that, we developed our own solution for exploring and analyzing the Web logs based on data warehousing and online analytical processing technologies. The analytical solution we developed proved successful, so it may find further application in the field of Web log file analysis. We will apply the findings of the analysis to restructuring the CMK Website. PMID:11999179

  17. SU-E-T-325: The New Evaluation Method of the VMAT Plan Delivery Using Varian DynaLog Files and Modulation Complexity Score (MCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tateoka, K; Graduate School of Medicine, Sapporo Medical University, Sapporo, JP; Fujimoto, K

    2014-06-01

    Purpose: The aim of this study was to evaluate the use of Varian DynaLog files to verify VMAT plan delivery, together with the modulation complexity score (MCS) of the VMAT plans. Methods: Delivery accuracy of machine performance was quantified by multileaf collimator (MLC) position errors, gantry angle errors and fluence delivery accuracy for volumetric modulated arc therapy (VMAT). The relationship between machine performance and plan complexity was also investigated using the modulation complexity score (MCS). Planned and actual MLC positions, gantry angles and delivered fractions of monitor units were extracted from Varian DynaLog files; these factors were taken from the record-and-verify system's MLC control file. Planned and delivered beam data were compared to determine leaf position errors and gantry angle errors. Analysis was also performed on planned and actual fluence maps reconstructed from the DynaLog files. This analysis was performed for all treatment fractions of 5 prostate VMAT plans. The analysis of DynaLog files was carried out with in-house software written in Visual C++. Results: The root mean square values of the leaf position and gantry angle errors were about 0.12 and 0.15, respectively. The gamma passing rate between planned and actual fluence maps at the 3%/3 mm criterion was about 99.21%. The leaf position errors were not directly related to plan complexity as determined by the MCS, whereas the gantry angle errors were directly related to plan complexity as determined by the MCS. Conclusion: This study shows that Varian DynaLog files can be used to diagnose VMAT delivery errors that are not detectable with phantom-based quality assurance. Furthermore, the MCS of a VMAT plan can be used to evaluate delivery accuracy for patients receiving VMAT. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.

  18. TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stanhope, C; Liang, J; Drake, D

    2016-06-15

    Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta's Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40 ms. Five VMAT plans [4 H&N, 1 Pulsed Brain] comprised of 2 arcs each were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4 mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2 mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). Comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05%±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, significantly less noisy. Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2 mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following the aforementioned reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. I am funded by an Elekta Research Grant.

  19. The Feasibility of Using Cluster Analysis to Examine Log Data from Educational Video Games. CRESST Report 790

    ERIC Educational Resources Information Center

    Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.

    2011-01-01

    Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…
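
    As a concrete illustration of the clustering approach the report describes, the sketch below groups students by log-derived features with scikit-learn. The feature set and values are hypothetical; interpretation in terms of the game and subject area, the paper's real focus, still has to be done by the analyst.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student features extracted from game log files:
# [actions per minute, error rate, hints requested, levels completed]
X = np.array([
    [12.0, 0.30, 5, 2],
    [25.0, 0.05, 0, 9],
    [11.5, 0.28, 6, 3],
    [24.0, 0.08, 1, 8],
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X)   # scale so no single feature dominates
)
print(labels)   # two behavioral clusters, e.g. [1 0 1 0]
```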

  20. Study of the IMRT interplay effect using a 4DCT Monte Carlo dose calculation.

    PubMed

    Jensen, Michael D; Abdellatif, Ady; Chen, Jeff; Wong, Eugene

    2012-04-21

    Respiratory motion may lead to dose errors when treating thoracic and abdominal tumours with radiotherapy. The interplay between complex multileaf collimator patterns and patient respiratory motion could result in unintuitive dose changes. We have developed a treatment reconstruction simulation computer code that accounts for interplay effects by combining multileaf collimator controller log files, respiratory trace log files, 4DCT images and a Monte Carlo dose calculator. Two three-dimensional (3D) IMRT step-and-shoot plans, a concave target and integrated boost were delivered to a 1D rigid motion phantom. Three sets of experiments were performed with 100%, 50% and 25% duty cycle gating. The log files were collected, and five simulation types were performed on each data set: continuous isocentre shift, discrete isocentre shift, 4DCT, 4DCT delivery average and 4DCT plan average. Analysis was performed using 3D gamma analysis with passing criteria of 2%, 2 mm. The simulation framework was able to demonstrate that a single fraction of the integrated boost plan was more sensitive to interplay effects than the concave target. Gating was shown to reduce the interplay effects. We have developed a 4DCT Monte Carlo simulation method that accounts for IMRT interplay effects with respiratory motion by utilizing delivery log files.

  21. Analysis of the request patterns to the NSSDC on-line archive

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore

    1994-01-01

    NASA missions, both for earth science and for space science, collect huge amounts of data, and the rate at which data is being gathered is increasing. For example, the EOSDIS project is expected to collect petabytes per year. In addition, these archives are being made available to remote users over the Internet. The ability to manage the growth in the size and request activity of scientific archives depends on an understanding of the access patterns of scientific users. The National Space Science Data Center (NSSDC) of NASA Goddard Space Flight Center has run its on-line mass storage archive of space data, the National Data Archive and Distribution Service (NDADS), since November 1991. A large world-wide space research community makes use of NSSDC, requesting more than 20,000 files per month. Since the initiation of the service, they have maintained log files which record all accesses to the archive. In this report, we present an analysis of the NDADS log files and discuss several issues, including caching, reference patterns, clustering, and system loading.
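
    A first-cut caching question for a trace like this is how a given cache size would have performed on the recorded requests. The following is a small, self-contained LRU replay sketch (the trace and capacity are toy values; NDADS-specific staging policies are not modeled).

```python
from collections import OrderedDict

def lru_hit_rate(requests, capacity):
    """Replay a file-request trace through an LRU cache holding `capacity` files."""
    cache, hits = OrderedDict(), 0
    for f in requests:
        if f in cache:
            hits += 1
            cache.move_to_end(f)             # mark as most recently used
        else:
            cache[f] = True
            if len(cache) > capacity:
                cache.popitem(last=False)    # evict least recently used
    return hits / len(requests)

trace = ["a", "b", "a", "c", "a", "b", "d", "a"]   # toy request trace
print(f"hit rate: {lru_hit_rate(trace, capacity=2):.2f}")
```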

  22. SU-F-T-233: Evaluation of Treatment Delivery Parameters Using High Resolution ELEKTA Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabat, C; Defoor, D; Alexandrian, A

    2016-06-15

    Purpose: As modern linacs have become more technologically advanced with the implementation of IGRT and IMRT with HDMLCs, the need for more elaborate tracking techniques to monitor component integrity has become paramount. ElektaLog files are generated every 40 milliseconds and can be analyzed to track subtle changes, providing another aspect of quality assurance. This allows constant monitoring of fraction consistency in addition to machine reliability. With this in mind, the aim of the study was to evaluate whether ElektaLog files can be utilized for linac consistency QA. Methods: ElektaLogs were reviewed for 16 IMRT patient plans with >16 fractions. Logs were analyzed by creating fluence maps from recorded values of MLC locations, jaw locations, and dose per unit time. Fluence maps were then utilized to calculate a 2D gamma index with a 2%/2 mm criterion for each fraction. ElektaLogs were also used to analyze positional errors for MLC leaves and jaws, which were used to compute an overall error for the MLC banks, Y-jaws, and X-jaws by taking the root-mean-square value of the individual recorded errors during treatment. Additionally, beam-on time was calculated using the number of ElektaLog file entries within the file. Results: The average 2D gamma for all 16 patient plans was found to be 98.0±2.0%. Recorded gamma index values showed an acceptable correlation between fractions. Average RMS values for the MLC leaves and the jaws resulted in a leaf variation of roughly 0.3±0.08 mm and a jaw variation of about 0.15±0.04 mm, both of which fall within clinical tolerances. Conclusion: The use of ElektaLog files for day-to-day evaluation of linac integrity and patient QA can be utilized to allow for reliable analysis of system accuracy and performance.

  23. Log file-based patient dose calculations of double-arc VMAT for head-and-neck radiotherapy.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Majima, Kazuhiro; Matsushita, Haruo; Takeda, Ken; Jingu, Keiichi

    2018-04-01

    The log file-based method cannot capture dosimetric changes caused by linac component miscalibration, because log files are insensitive to such miscalibration. The purpose of this study was to quantify the dosimetric changes in log file-based patient dose calculations for double-arc volumetric-modulated arc therapy (VMAT) in head-and-neck cases. Fifteen head-and-neck cases were included in this study. For each case, treatment planning system (TPS) doses were produced by double-arc and single-arc VMAT. Miscalibration-simulated log files were generated by inducing a leaf miscalibration of ±0.5 mm in the log files that were acquired during VMAT irradiation. Subsequently, patient doses were estimated using the miscalibration-simulated log files. For double-arc VMAT, for the planning target volume (PTV), the change from the TPS dose to the miscalibration-simulated log file dose in Dmean was 0.9 Gy, and that for tumor control probability was 1.4%. For organs at risk (OARs), the change in Dmean was <0.7 Gy and in normal tissue complication probability was <1.8%. A comparison between double-arc and single-arc VMAT for the PTV showed statistically significant differences in the changes evaluated by Dmean and the radiobiological metrics (P < 0.01), even though the magnitude of these differences was small. Similarly, for OARs, the magnitude of these changes was found to be small. For the PTV and OARs, the log file-based estimate of patient dose for double-arc VMAT has accuracy comparable to that for single-arc VMAT. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  24. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, R; Kamima, T; Tachibana, H

    2016-06-15

    Purpose: To investigate the effect of using trajectory log files from the linear accelerator for Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software package (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive: a systematic error of 0.2 mm produced a 0.7% dose deviation on average. The MLC random errors did not affect the dose error. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for independent verification. The tolerance level for a secondary check using the trajectory file may be similar to that of verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors at a level corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
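
    The error-injection step above is easy to reproduce in outline. The sketch below shows one way to perturb logged MLC positions before re-running a dose check; it is only an illustration of the idea (the study's exact definitions of the systematic and random errors are not given in the abstract, and all names here are hypothetical).

```python
import numpy as np

def inject_mlc_errors(actual_mm, systematic_mm=0.5, random_sd_mm=0.0, seed=0):
    """Return logged MLC positions with a bank-wide systematic shift plus
    optional per-leaf Gaussian noise, for testing error-detection capability."""
    rng = np.random.default_rng(seed)
    pos = np.asarray(actual_mm, dtype=float)
    return pos + systematic_mm + rng.normal(0.0, random_sd_mm, pos.shape)

logged = np.array([12.3, 12.1, 11.8, 12.0])        # toy leaf positions (mm)
print(inject_mlc_errors(logged, systematic_mm=0.2))
```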

  25. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  26. Clinical impact of dosimetric changes for volumetric modulated arc therapy in log file-based patient dose calculations.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2017-10-01

    A log file-based method cannot detect dosimetric changes caused by linac component miscalibration, because log files are insensitive to miscalibration. Herein, the clinical impact of such dosimetric changes on a log file-based method was determined. Five head-and-neck and five prostate plans were used. Miscalibration-simulated log files were generated by inducing a linac component miscalibration in the log file. Miscalibration magnitudes for leaf, gantry, and collimator at the general tolerance level were ±0.5 mm, ±1°, and ±1°, respectively, and at a tighter tolerance level achievable on current linacs were ±0.3 mm, ±0.5°, and ±0.5°, respectively. Re-calculations were performed on the patient anatomy using the log file data. The change in tumor control probability/normal tissue complication probability from the treatment planning system dose to the re-calculated dose at the general tolerance level was 1.8% for the planning target volume (PTV) and 2.4% for organs at risk (OARs) across both plan types. At the tighter tolerance level, these changes improved to 1.0% for the PTV and to 1.5% for OARs, a statistically significant difference. We determined the clinical impact of dosimetric changes on a log file-based method using a general tolerance level and a tighter tolerance level for linac miscalibration, and found that the tighter tolerance level significantly improved the accuracy of the log file-based method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  27. Cloud object store for checkpoints of high performance computing applications using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-04-19

    Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  28. Comparing Web and Touch Screen Transaction Log Files

    PubMed Central

    Huntington, Paul; Williams, Peter

    2001-01-01

    Background Digital health information is available on a wide variety of platforms including PC-access of the Internet, Wireless Application Protocol phones, CD-ROMs, and touch screen public kiosks. All these platforms record details of user sessions in transaction log files, and there is a growing body of research into the evaluation of this data. However, there is very little research that has examined the problems of comparing the transaction log files of kiosks and the Internet. Objectives To provide a first step towards examining the problems of comparing the transaction log files of kiosks and the Internet. Methods We studied two platforms: touch screen kiosks and a comparable Web site. For both of these platforms, we examined the menu structure (which affects transaction log file data), the log-file structure, and the metrics derived from log-file records. Results We found substantial differences between the generated metrics. Conclusions None of the metrics discussed can be regarded as an effective way of comparing the use of kiosks and Web sites. Two metrics stand out as potentially comparable and valuable: the number of user sessions per hour and user penetration of pages. PMID:11720960
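
    Of the two metrics the authors single out, "user sessions per hour" requires reconstructing sessions from raw records. A common web-log convention is an idle timeout; the sketch below uses 30 minutes (kiosk logs may delimit sessions differently, as the paper discusses, so the timeout and event format are assumptions).

```python
from datetime import datetime, timedelta

def sessions_per_hour(events, timeout_min=30):
    """events: (user, timestamp) pairs sorted by time. A new session starts
    when a user has been idle longer than timeout_min."""
    last_seen, sessions = {}, 0
    for user, t in events:
        prev = last_seen.get(user)
        if prev is None or t - prev > timedelta(minutes=timeout_min):
            sessions += 1
        last_seen[user] = t
    span_h = (events[-1][1] - events[0][1]).total_seconds() / 3600
    return sessions / max(span_h, 1e-9)

evts = [("u1", datetime(2001, 1, 1, 9, 0)),
        ("u1", datetime(2001, 1, 1, 9, 5)),
        ("u2", datetime(2001, 1, 1, 9, 50)),
        ("u1", datetime(2001, 1, 1, 10, 30))]
print(sessions_per_hour(evts))   # 3 sessions over 1.5 h -> 2.0
```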

  29. Cloud object store for archive storage of high performance computing data using decoupling middleware

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-06-30

    Cloud object storage is enabled for archived data, such as checkpoints and results, of high performance computing applications using a middleware process. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.

  30. Beyond Logging of Fingertip Actions: Analysis of Collaborative Learning Using Multiple Sources of Data

    ERIC Educational Resources Information Center

    Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.

    2007-01-01

    In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…

  31. Use patterns of health information exchange through a multidimensional lens: conceptual framework and empirical validation.

    PubMed

    Politi, Liran; Codish, Shlomi; Sagy, Iftach; Fink, Lior

    2014-12-01

    Insights about patterns of system use are often gained through the analysis of system log files, which record the actual behavior of users. In a clinical context, however, few attempts have been made to typify system use through log file analysis. The present study offers a framework for identifying, describing, and discerning among patterns of use of a clinical information retrieval system. We use the session attributes of volume, diversity, granularity, duration, and content to define a multidimensional space in which each specific session can be positioned. We also describe an analytical method for identifying the common archetypes of system use in this multidimensional space. We demonstrate the value of the proposed framework with a log file of the use of a health information exchange (HIE) system by physicians in an emergency department (ED) of a large Israeli hospital. The analysis reveals five distinct patterns of system use, which have yet to be described in the relevant literature. The results of this study have the potential to inform the design of HIE systems for efficient and effective use, thus increasing their contribution to the clinical decision-making process. Copyright © 2014 Elsevier Inc. All rights reserved.

  32. Analysis of the Impact of Data Normalization on Cyber Event Correlation Query Performance

    DTIC Science & Technology

    2012-03-01

    2003). Organizations use it in planning, target marketing, decision-making, data analysis, and customer services (Shin, 2003). Organizations that...Following this IP address is a router message sequence number. This is a globally unique number for each router terminal and can range from...Appendix G, invokes the PERL parser for the log files from a particular USAF base, and invokes the CTL file that loads the resultant CSV file into the

  33. Workload Characterization and Performance Implications of Large-Scale Blog Servers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Kim, Youngjae; Hwang, Jeaho

    With the ever-increasing popularity of social network services (SNSs), an understanding of the characteristics of these services and their effects on the behavior of their host servers is critical. However, there has been a lack of research on the workload characterization of servers running SNS applications such as blog services. To fill this void, we empirically characterized real-world web server logs collected from one of the largest South Korean blog hosting sites for 12 consecutive days. The logs consist of more than 96 million HTTP requests and 4.7 TB of network traffic. Our analysis reveals the following: (i) the transfer size of non-multimedia files and blog articles can be modeled using a truncated Pareto distribution and a log-normal distribution, respectively; (ii) user access for blog articles does not show temporal locality, but is strongly biased towards those posted with image or audio files. We additionally discuss the potential performance improvement through clustering of small files on a blog page into contiguous disk blocks, which benefits from the observed file access patterns. Trace-driven simulations show that, on average, the suggested approach achieves 60.6% better system throughput and reduces the processing time for file access by 30.8% compared to the best performance of the Ext4 file system.
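
    Fitting the distributions named in finding (i) is straightforward with SciPy, shown below on synthetic data since the paper's trace is not public. The parameters are illustrative; note that a truncated Pareto class (scipy.stats.truncpareto) only exists in newer SciPy releases, so older environments need a custom maximum-likelihood fit.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=9.0, sigma=1.2, size=5000)   # synthetic "article sizes" (bytes)

# Log-normal fit, as reported for blog-article transfer sizes.
shape, loc, scale = stats.lognorm.fit(sizes, floc=0)    # fix loc=0 for a pure log-normal
print(f"fitted sigma = {shape:.2f}, median = {scale:.0f} bytes")
```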

  34. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe; a typical use is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
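
    To make the instrumentation pattern concrete, here is a minimal imitation of NetLogger-style event logging: timestamped key=value records written at critical points. This is not the real NetLogger API; the field names are modeled on published NetLogger text-format examples, and the event names are invented.

```python
import sys
from datetime import datetime, timezone

def log_event(event, **fields):
    """Emit one timestamped, ULM-like key=value event line."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    kv = " ".join(f"{k}={v}" for k, v in fields.items())
    sys.stderr.write(f"DATE={ts} HOST=demo.host PROG=demo NL.EVNT={event} {kv}\n")

log_event("transfer.start", file="data.bin", bytes=1048576)
# ... the instrumented operation runs here ...
log_event("transfer.end", file="data.bin")
```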

  35. Teaching an Old Log New Tricks with Machine Learning.

    PubMed

    Schnell, Krista; Puri, Colin; Mahler, Paul; Dukatz, Carl

    2014-03-01

    To most people, the log file would not be considered an exciting area in technology today. However, these relatively benign, slowly growing data sources can drive large business transformations when combined with modern-day analytics. Accenture Technology Labs has built a new framework that helps to expand existing vendor solutions to create new methods of gaining insights from these benevolent information springs. This framework provides a systematic and effective machine-learning mechanism to understand, analyze, and visualize heterogeneous log files. These techniques enable an automated approach to analyzing log content in real time, learning relevant behaviors, and creating actionable insights applicable in traditionally reactive situations. Using this approach, companies can now tap into a wealth of knowledge residing in log file data that is currently being collected but underutilized because of its overwhelming variety and volume. By using log files as an important data input into the larger enterprise data supply chain, businesses have the opportunity to enhance their current operational log management solution and generate entirely new business insights: no longer limited to the realm of reactive IT management, but extending from proactive product improvement to defense from attacks. As we will discuss, this solution has immediate relevance in the telecommunications and security industries. However, the most forward-looking companies can take it even further. How? By thinking beyond the log file and applying the same machine-learning framework to other log file use cases (including logistics, social media, and consumer behavior) and any other transactional data source.

  16. Visual behavior characterization for intrusion and misuse detection

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah

    2001-05-01

    As computer and network intrusions become more and more of a concern, the need for better capabilities to assist in the detection and analysis of intrusions also increases. System administrators typically rely on log files to analyze usage and detect misuse. However, as a consequence of the amount of data collected by each machine, multiplied by the tens or hundreds of machines under the system administrator's auspices, the entirety of the data available is neither collected nor analyzed. This is compounded by the need to analyze network traffic data as well. We propose a methodology for analyzing network and computer log information visually based on the analysis of the behavior of the users. Each user's behavior is the key to determining their intent and overriding activity, whether they attempt to hide their actions or not. Proficient hackers will attempt to hide their ultimate activities, which hinders the reliability of log file analysis. Visually analyzing the users' behavior, however, is much more adaptable and difficult to counteract.

  17. Understanding Academic Information Seeking Habits through Analysis of Web Server Log Files: The Case of the Teachers College Library Website

    ERIC Educational Resources Information Center

    Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary

    2009-01-01

    Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)
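    The scripts used in the study are not reproduced in the abstract; a minimal sketch of the underlying step (extracting per-page request counts from web server transaction logs in Common Log Format) is shown below. The regular expression and file name are assumptions for illustration.

```python
import re
from collections import Counter

# Common Log Format: host ident user [time] "request" status bytes
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[([^\]]+)\] "(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3}) \S+'
)


def page_counts(path):
    """Count successful (2xx) requests per URL path in an access log."""
    counts = Counter()
    with open(path) as fh:
        for line in fh:
            m = LOG_RE.match(line)
            if m and m.group(3).startswith("2"):
                counts[m.group(2)] += 1
    return counts


if __name__ == "__main__":
    for url, n in page_counts("access.log").most_common(10):
        print(f"{n:6d}  {url}")
```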

  18. The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching

    ERIC Educational Resources Information Center

    Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix

    2007-01-01

    The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…

  19. Analyzing Log Files to Predict Students' Problem Solving Performance in a Computer-Based Physics Tutor

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2015-01-01

    This study investigates whether information saved in the log files of a computer-based tutor can be used to predict the problem solving performance of students. The log files of a computer-based physics tutoring environment called Andes Physics Tutor was analyzed to build a logistic regression model that predicted success and failure of students'…
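    The predictors used with the Andes logs are not listed in the truncated abstract; the sketch below only illustrates the general approach of fitting a logistic regression model to features derived from tutor log files. The feature names, toy data, and labels are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features extracted from tutor log files:
# [hints_requested, incorrect_entries, minutes_on_problem]
X = np.array([
    [0, 1, 4], [1, 2, 6], [5, 7, 15], [2, 3, 8],
    [0, 0, 3], [6, 9, 20], [1, 1, 5], [4, 6, 12],
])
# 1 = problem eventually solved, 0 = not solved
y = np.array([1, 1, 0, 1, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_)
print("P(success) for a new student:", model.predict_proba([[2, 2, 7]])[0, 1])
```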

  20. SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    2015-06-15

    Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as the reference, and all records for subsequent days were compared against the reference. An in-house MATLAB software code was used for the comparisons. Each MLC log file was converted to a fluence map (FM), and a gamma index (γ) analysis was used for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2 mm distance to agreement, while points with a signal of 10% or less of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (range, 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and at the same time reduce the time for processing the images. We found that the comparison of images with the highest resolution (768×1024) yielded on average a lower γ (99.1%) than the ones with low resolution (192×256) (γ = 99.5%). Conclusion: We developed in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such a tool can be valuable to assess the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.
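    The in-house MATLAB code is not published in the abstract; the snippet below is a simplified, brute-force 2D gamma comparison in NumPy under stated assumptions (uniform pixel spacing, global dose normalization, a small local search window, and a 10% low-signal cutoff), matching the 2%/2 mm criterion described but not the actual implementation.

```python
import numpy as np


def gamma_pass_rate(ref, ev, spacing_mm=1.0, dd=0.02, dta_mm=2.0, cutoff=0.10):
    """Simplified global 2D gamma pass rate (dd fraction of max dose, DTA in mm)."""
    norm = dd * ref.max()
    reach = int(np.ceil(2 * dta_mm / spacing_mm))   # search window half-width
    rows, cols = ref.shape
    passed, total = 0, 0
    for i in range(rows):
        for j in range(cols):
            if ref[i, j] < cutoff * ref.max():      # exclude low-signal points
                continue
            total += 1
            best = np.inf                           # minimum gamma^2 found so far
            for di in range(-reach, reach + 1):
                for dj in range(-reach, reach + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        dose_term = (ev[ii, jj] - ref[i, j]) / norm
                        dist_term = spacing_mm * np.hypot(di, dj) / dta_mm
                        best = min(best, dose_term ** 2 + dist_term ** 2)
            passed += best <= 1.0                   # gamma <= 1 passes
    return 100.0 * passed / max(total, 1)


# Tiny synthetic example: evaluated fluence shifted one pixel from the reference.
ref = np.zeros((40, 40))
ref[10:30, 10:30] = 100.0
ev = np.roll(ref, 1, axis=1)
print(f"gamma pass rate: {gamma_pass_rate(ref, ev):.1f}%")
```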

  1. Contamination Analysis Tools

    NASA Technical Reports Server (NTRS)

    Brieda, Lubos

    2015-01-01

    This talk presents three different tools developed recently for contamination analysis: an HTML QCM analyzer, which runs in a web browser and allows for data analysis of QCM log files; a Java RGA extractor, which can load in multiple SRS .ana files and extract pressure vs. time data; and a C++ contamination simulation code, a 3D particle tracing code for modeling transport of dust particulates and molecules. The simulation code uses residence time to determine if molecules stick; particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.

  2. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  3. 46 CFR Appendix A to Part 530 - Instructions for the Filing of Service Contracts

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... file service contracts. BTCL will direct OIRM to provide approved filers with a log-on ID and password. Filers who wish a third party (publisher) to file their service contracts must so indicate on Form FMC-83... home page, http://www.fmc.gov. A. Registration, Log-on ID and Password To register for filing, a...

  4. Coastal bathymetry data collected in 2011 from the Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    DeWitt, Nancy T.; Pfeiffer, William R.; Bernier, Julie C.; Buster, Noreen A.; Miselis, Jennifer L.; Flocks, James G.; Reynolds, Billy J.; Wiese, Dana S.; Kelso, Kyle W.

    2014-01-01

    This report serves as an archive of processed interferometric swath and single-beam bathymetry data. Geographic Information System data products include a 50-meter cell-size interpolated bathymetry grid surface, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  5. Online Courses Assessment through Measuring and Archetyping of Usage Data

    ERIC Educational Resources Information Center

    Kazanidis, Ioannis; Theodosiou, Theodosios; Petasakis, Ioannis; Valsamidis, Stavros

    2016-01-01

    Database files and additional log files of Learning Management Systems (LMSs) contain an enormous volume of data which usually remain unexploited. A new methodology is proposed in order to analyse these data on the level of both the courses and the learners. Specifically, "regression analysis" is proposed as a first step in the…

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Ho, M; Chen, C

    Purpose: The use of log files to perform patient-specific quality assurance for both protons and IMRT has been established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to generate a system that would permit gross errors to be found within 3 fractions, prior to direct measurements. This approach could eventually replace direct measurements. Methods: Spot scanning protons pass through multi-wire ionization chambers which provide information about the charge, location, and size of each delivered spot. We have generated a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has 3 different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (Raystation). While the dual Gaussian model often gave better agreement, overall, the ASTROID model gave the most consistent results. Using a 5%–3 mm gamma with a 90% passing criterion and excluding doses below 20% of prescription, all patient samples passed. However, the degree of agreement of the log file approach was slightly worse than that of the chamber array measurement approach. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA in an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter and will be investigated further.
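    The log file format and spot models above are proprietary and not specified in the abstract; the sketch below only illustrates the simplest of the three named models (a single Gaussian lateral spot profile) by accumulating a 2D plane from a list of spots. The spot tuple layout, grid size, and units are assumptions.

```python
import numpy as np


def dose_plane_from_spots(spots, grid_mm=200.0, step_mm=1.0):
    """Accumulate a 2D fluence/dose plane from scanned proton spots.

    Each spot is (x_mm, y_mm, sigma_mm, mu); a single-Gaussian lateral
    profile is assumed, the simplest of the three models named above.
    """
    axis = np.arange(-grid_mm / 2, grid_mm / 2 + step_mm, step_mm)
    xx, yy = np.meshgrid(axis, axis)
    plane = np.zeros_like(xx)
    for x0, y0, sigma, mu in spots:
        r2 = (xx - x0) ** 2 + (yy - y0) ** 2
        plane += mu * np.exp(-r2 / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return axis, plane


# Three illustrative spots, e.g. parsed from a delivery log.
axis, plane = dose_plane_from_spots(
    [(0, 0, 5.0, 1.0), (10, 0, 5.0, 1.2), (0, 10, 5.0, 0.8)]
)
print("peak value:", plane.max())
```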

  7. Building analytical platform with Big Data solutions for log files of PanDA infrastructure

    NASA Astrophysics Data System (ADS)

    Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.

    2018-05-01

    The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), responsible for the workload management of on the order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes the data and exports it to Elasticsearch. ES is responsible for centralized data storage. Accumulated data in ES can be viewed using Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors describe all the components and their configuration tuning for the current tasks and the scale of the actual system, and give several real-life examples of how this centralized log processing and storage service is used, to showcase the advantages for daily operations.
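    The deployment details of the PanDA ELK stack are not given in the abstract; the sketch below only shows the generic pattern of querying an Elasticsearch index through its standard REST _search endpoint to find recent error entries. The endpoint URL, index pattern, and field names are hypothetical.

```python
import requests

ES_URL = "http://localhost:9200"     # assumed Elasticsearch endpoint
INDEX = "panda-logs-*"               # hypothetical index pattern

query = {
    "size": 5,
    "query": {"bool": {"must": [
        {"match": {"loglevel": "ERROR"}},                # hypothetical field
        {"range": {"@timestamp": {"gte": "now-1h"}}},
    ]}},
    "sort": [{"@timestamp": {"order": "desc"}}],
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=query, timeout=10)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"].get("@timestamp"), hit["_source"].get("message"))
```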

  8. An EXCEL macro for importing log ASCII standard (LAS) files into EXCEL worksheets

    NASA Astrophysics Data System (ADS)

    Özkaya, Sait Ismail

    1996-02-01

    An EXCEL 5.0 macro is presented for converting a LAS text file into an EXCEL worksheet. Although EXCEL has commands for importing text files and parsing text lines, LAS files must be decoded line-by-line because three different delimiters are used to separate fields of differing length. The macro is intended to eliminate manual decoding of LAS version 2.0. LAS is a floppy disk format for storage and transfer of log data as text files. LAS was proposed by the Canadian Well Logging Society. The present EXCEL macro decodes different sections of a LAS file, separates, and places the fields into different columns of an EXCEL worksheet. To import a LAS file into EXCEL without errors, the file must not contain any unrecognized symbols, and the data section must be the last section. The program does not check for the presence of mandatory sections or fields as required by LAS rules. Once a file is incorporated into EXCEL, mandatory sections and fields may be inspected visually.
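    The macro itself is not reproduced in the abstract; the snippet below sketches the same line-by-line decoding idea in Python for the curve (~C) and data (~A) sections of an unwrapped LAS 2.0 file. Like the macro, it assumes a well-formed file with the data section last and does no validation of mandatory sections.

```python
def read_las(path):
    """Very small LAS 2.0 reader: returns (curve mnemonics, rows of floats)."""
    curves, rows, section = [], [], ""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("~"):
                # Section flag: V, W, C, P, O, or A (data).
                section = line[1].upper() if len(line) > 1 else ""
                continue
            if section == "C":
                # Curve lines look like "DEPT.M  : depth"; mnemonic is before the dot.
                curves.append(line.split(".", 1)[0].strip())
            elif section == "A":
                rows.append([float(v) for v in line.split()])
    return curves, rows


curves, rows = read_las("example.las")   # hypothetical input file
print(curves)
print(rows[:3])
```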

  9. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application for image processing of electron micrographs. Ordinarily, Eos works only with character user interfaces (CUI) under operating systems (OS) such as OS X or Linux, which is not user-friendly. Users of Eos therefore need to be experts in image processing of electron micrographs and also need some knowledge of computer science; however, not everyone who requires Eos is an expert with a CUI. We therefore extended Eos to an OS-independent web system with a graphical user interface (GUI) by integrating a web browser. The advantage of using a web browser is not only to extend Eos with a GUI, but also to let Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, and the usage of each command differs. Since the beginning of its development, Eos has managed its user interface through an interface definition file, the "OptionControlFile", written in CSV (comma-separated value) format; that is, each command has an "OptionControlFile" that records the information needed to generate its interface and usage. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also accesses the "OptionControlFile" and produces a web user interface automatically, because this mechanism is mature and convenient. The basic actions of the client-side system were implemented properly and support auto-generation of a web form with functions for execution, image preview, and file upload to a web server. Thus the system can execute Eos commands with options unique to each command and carry out image analysis. Problems remain concerning the image file format for visualization and the workspace for analysis: image file format information is useful to check whether an input/output file is correct, and a common workspace for analysis is also needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. Furthermore, to solve the workspace problem, we developed two types of system. The first uses only local environments: the user runs a web server provided by Eos, accesses a web client through a web browser, and manipulates local files with the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform under development that works in a heterogeneous distributed environment. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write a PIONE rule definition that defines a workflow of image processing. PIONE runs each image-processing step on suitable computers, following the defined rule. PIONE supports interactive manipulation, and a user is able to try a command with various setting values. In this situation, we contribute auto-generation of a GUI for a PIONE workflow. As an advanced function, we developed a module to log user actions. The logs include information such as setting values in image processing and the sequence of commands. If the logs are used effectively, many advantages can be gained. For example, when an expert discovers some know-how of image processing, other users can share the logs including that know-how, and by analyzing the logs we may obtain recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have also developed the system infrastructure.
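    The actual OptionControlFile schema is not documented in the abstract, so the column names below are invented; the sketch only illustrates the general idea of generating a web form automatically from a CSV interface definition.

```python
import csv
import io

# Hypothetical interface definition in the spirit of an "OptionControlFile":
# one row per command option, with a name, a widget type, a default and a help text.
OPTION_CSV = """\
name,type,default,help
input,file,,input image file
sigma,number,1.5,smoothing width
mode,choice,fast|accurate,processing mode
"""


def csv_to_html_form(text):
    """Render a very simple HTML form from the CSV option definition."""
    fields = []
    for row in csv.DictReader(io.StringIO(text)):
        if row["type"] == "choice":
            opts = "".join(f"<option>{o}</option>" for o in row["default"].split("|"))
            widget = f"<select name='{row['name']}'>{opts}</select>"
        else:
            widget = f"<input name='{row['name']}' value='{row['default']}'>"
        fields.append(f"<label>{row['name']} ({row['help']}): {widget}</label>")
    return "<form>" + "<br>".join(fields) + "</form>"


print(csv_to_html_form(OPTION_CSV))
```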

  10. SU-F-T-177: Impacts of Gantry Angle Dependent Scanning Beam Properties for Proton Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Y; Clasie, B; Lu, H

    Purpose: In pencil beam scanning (PBS), the delivered spot MU, position and size are slightly different at different gantry angles. We investigated the level of delivery uncertainty at different gantry angles through a log file analysis. Methods: 34 PBS fields covering the full 360-degree gantry angle spread were collected retrospectively from 28 patients treated at our institution. All fields were delivered at zero gantry angle and the prescribed gantry angle, and measured at isocenter with the MatriXX 2D array detector at the prescribed gantry angle. The machine log files were analyzed to extract the delivered MU per spot and the beam position from the strip ionization chambers in the treatment nozzle. The beam size was separately measured as a function of gantry angle and beam energy. Using this information, the dose was calculated in a water phantom at both gantry angles and compared to the measurement using the 3D γ-index at 2mm/2%. Results: The spot-by-spot difference between the beam positions in the log files from the deliveries at the two gantry angles has a mean of 0.3 and 0.4 mm and a standard deviation of 0.6 and 0.7 mm for the x and y directions, respectively. Similarly, the spot-by-spot difference between the MU in the log files from the deliveries at the two gantry angles has a mean of 0.01% and a standard deviation of 0.7%. These small deviations lead to an excellent agreement in dose calculations, with an average γ pass rate for all fields of approximately 99.7%. When each calculation is compared to the measurement, a high correlation in γ was also found. Conclusion: Using machine log files, we verified that the deviations of PBS beam delivery at different gantry angles from the planned spot positions and MU are sufficiently small. This study brings us one step closer to simplifying our patient-specific QA.

  11. SU-F-T-469: A Clinically Observed Discrepancy Between Image-Based and Log- Based MLC Position

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, B; Ahmed, M; Siebers, J

    2016-06-15

    Purpose: To present a clinical case which challenges the base assumption of log-file based QA, by showing that the actual position of an MLC leaf can suddenly deviate from its programmed and logged position by >1 mm as observed with real-time imaging. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used in cine mode to capture portal images during treatment. Visual monitoring identified an anomalous MLC leaf pair gap not otherwise detected by the automatic position verification. The position of the errant leaf was measured on EPID images, and log files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3±0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusion: It has been clinically observed that log-file derived leaf positions can differ from their actual positions by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log file records. Intra-treatment EPID imaging provides a method to capture departures from MLC planned positions. Work was supported in part by Varian Medical Systems.

  12. SU-E-T-144: Effective Analysis of VMAT QA Generated Trajectory Log Files for Medical Accelerator Predictive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    Purpose: To determine the effectiveness of SPC analysis for a model predictive maintenance process that uses accelerator generated parameter and performance data contained in trajectory log files. Methods: Each trajectory file is decoded and a total of 131 axis positions are recorded (collimator jaw position, gantry angle, each MLC, etc.). This raw data is processed and either axis positions are extracted at critical points during the delivery or positional change over time is used to determine axis velocity. The focus of our analysis is the accuracy, reproducibility and fidelity of each axis. A reference positional trace of the gantry and each MLC is used as a motion baseline for cross correlation (CC) analysis. A total of 494 parameters (482 MLC related) were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and parameter/system specifications. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: TG-142 and published analysis of VMAT delivery accuracy. Results: All errors introduced were detected. Synthetic positional errors of 2mm for collimator jaw and MLC carriage exceeded the chart limits. Gantry speed and each MLC speed are analyzed at two different points in the delivery. Simulated Gantry speed error (0.2 deg/sec) and MLC speed error (0.1 cm/sec) exceeded the speed chart limits. Gantry position error of 0.2 deg was detected by the CC maximum value charts. The MLC position error of 0.1 cm was detected by the CC maximum value location charts for every MLC. Conclusion: SPC I/MR evaluation of trajectory log file parameters may be effective in providing an early warning of performance degradation or component failure for medical accelerator systems.
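    The exact hybrid limit scheme is not spelled out in the abstract; the sketch below computes the conventional Individuals/Moving-Range (I/MR) control limits (the standard 2.66 and 3.267 constants for subgroups of size two) for one logged axis parameter. The data are synthetic and only illustrate the technique.

```python
import numpy as np


def imr_limits(values):
    """Standard Individuals / Moving-Range (I/MR) control limits."""
    values = np.asarray(values, dtype=float)
    mr = np.abs(np.diff(values))          # moving ranges of consecutive points
    mr_bar = mr.mean()
    centre = values.mean()
    return {
        "I_UCL": centre + 2.66 * mr_bar,  # 3 / d2 with d2 = 1.128
        "I_LCL": centre - 2.66 * mr_bar,
        "MR_UCL": 3.267 * mr_bar,         # D4 for n = 2
    }


# Synthetic daily gantry-speed samples (deg/s) extracted from trajectory logs.
rng = np.random.default_rng(1)
samples = 4.8 + 0.02 * rng.standard_normal(30)
limits = imr_limits(samples)
violations = (samples > limits["I_UCL"]) | (samples < limits["I_LCL"])
print(limits, "violations:", int(violations.sum()))
```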

  13. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software that accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, display of well log curves, etc.), but can also be run on different operating systems and devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work, the resulting software product interface is described.

  14. Log-less metadata management on metadata server for parallel file systems.

    PubMed

    Liao, Jianwei; Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service, while also improving performance in metadata processing. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead incurred by a metadata server that adopts logging or journaling to yield a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or has exceptionally entered a nonoperational state.

  15. Log-Less Metadata Management on Metadata Server for Parallel File Systems

    PubMed Central

    Xiao, Guoqiang; Peng, Xiaoning

    2014-01-01

    This paper presents a novel metadata management mechanism on the metadata server (MDS) for parallel and distributed file systems. In this technique, the client file system backs up the sent metadata requests, which have been handled by the metadata server, so that the MDS does not need to log metadata changes to nonvolatile storage to achieve a highly available metadata service, while also improving performance in metadata processing. Because the client file system backs up certain sent metadata requests in its memory, the overhead of handling these backup requests is much smaller than the overhead incurred by a metadata server that adopts logging or journaling to yield a highly available metadata service. The experimental results show that this newly proposed mechanism can significantly improve the speed of metadata processing and deliver better I/O data throughput than conventional metadata management schemes, that is, logging or journaling on the MDS. Moreover, a complete metadata recovery can be achieved by replaying the backup logs cached by all involved clients when the metadata server has crashed or has exceptionally entered a nonoperational state. PMID:24892093

  16. Geophysical log database for the Floridan aquifer system and southeastern Coastal Plain aquifer system in Florida and parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Williams, Lester J.; Raines, Jessica E.; Lanning, Amanda E.

    2013-04-04

    A database of borehole geophysical logs and other types of data files was compiled as part of ongoing studies of water availability and assessment of brackish- and saline-water resources. The database contains 4,883 logs from 1,248 wells in Florida, Georgia, Alabama, and South Carolina, and from a limited number of offshore wells of the eastern Gulf of Mexico and the Atlantic Ocean. The logs can be accessed through a download directory organized by state and county for onshore wells and in a single directory for the offshore wells. A flat file database is provided that lists the wells, their coordinates, and the file listings.

  17. Index map of cross sections through parts of the Appalachian basin (Kentucky, New York, Ohio, Pennsylvania, Tennessee, Virginia, West Virginia): Chapter E.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.

  18. A clinically observed discrepancy between image-based and log-based MLC positions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Brian, E-mail: bpn2p@virginia.edu; Ahmed, Mahmoud; Kathuria, Kunal

    2016-06-15

    Purpose: To present a clinical case in which real-time intratreatment imaging identified a multileaf collimator (MLC) leaf to be consistently deviating from its programmed and logged position by >1 mm. Methods: An EPID-based exit-fluence dosimetry system designed to prevent gross delivery errors was used to capture cine images during treatment. The author serendipitously visually identified a suspected MLC leaf displacement that was not otherwise detected. The leaf position as recorded on the EPID images was measured, and log-files were analyzed for the treatment in question, the prior day’s treatment, and for daily MLC test patterns acquired on those treatment days. Additional standard test patterns were used to quantify the leaf position. Results: Whereas the log-file reported no difference between planned and recorded positions, image-based measurements showed the leaf to be 1.3 ± 0.1 mm medial from the planned position. This offset was confirmed with the test pattern irradiations. Conclusions: It has been clinically observed that log-file derived leaf positions can differ from their actual position by >1 mm, and therefore cannot be considered to be the actual leaf positions. This cautions the use of log-based methods for MLC or patient quality assurance without independent confirmation of log integrity. Frequent verification of MLC positions through independent means is a necessary precondition to trusting log-file records. Intratreatment EPID imaging provides a method to capture departures from MLC planned positions.

  19. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

    The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5 mm in opposite directions and systematic leaf shifts: ±1.0 mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5 mm, the residual dose estimation errors, quantified from the slope of the linear regression of gEUD changes between the non-modified and modified log file doses per unit leaf gap, are 1.32±0.27% and 0.82±0.17 Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14 Gy, and 0.45±0.08 Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determine the residual dose estimation errors for VMAT delivery using the log file-based patient dose calculation according to the MLC calibration accuracy.

  20. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the process of probing huge amounts of information in an attempt to uncover unseen patterns. Through big data analytics applications, organizations in both the public and private sectors have made a strategic determination to turn big data into competitive advantage. The primary task of extracting value from big data gives rise to a process applied to pull information from multiple different sources; this process is known as extract, transform, and load. The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and document summarization across several sources. The work also helps in better understanding basic Hadoop concepts and improves the user experience for research. In this paper, we propose an approach for analysing log files using Hadoop to find concise information that is useful and time saving. The proposed approach will be applied to research papers in a specific domain to obtain summarized content for further improvement and for creating new content.
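    The paper's Hadoop workflow is not shown in the abstract; the snippet below is a generic Hadoop Streaming style mapper/reducer pair that counts log lines per severity level, runnable locally through a shell pipe or submitted with the hadoop-streaming jar. The script name and assumed "date time LEVEL message" layout are illustrative.

```python
#!/usr/bin/env python3
"""Minimal Hadoop Streaming job: count log lines per severity level.

Local test:  cat app.log | python3 loglevel_count.py map | sort | python3 loglevel_count.py reduce
"""
import sys


def mapper():
    for line in sys.stdin:
        parts = line.split()
        if len(parts) >= 3:                   # assumed layout: date time LEVEL message
            print(f"{parts[2]}\t1")


def reducer():
    current, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")


if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```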

  1. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
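    The abstract above describes the core data structure: an index entry of (logical offset, physical offset, length) per stored data portion, with holes restored at read time. The sketch below illustrates that idea in plain Python; it is a stand-in, not the patented file system implementation.

```python
from dataclasses import dataclass


@dataclass
class IndexEntry:
    logical_offset: int   # where the data belongs in the sparse file
    physical_offset: int  # where it was actually written in the packed log
    length: int


def write_sparse(data_portions):
    """Pack data portions end-to-end and build index entries for them."""
    packed, index, phys = bytearray(), [], 0
    for logical_offset, chunk in data_portions:
        index.append(IndexEntry(logical_offset, phys, len(chunk)))
        packed += chunk
        phys += len(chunk)
    return bytes(packed), index


def read_sparse(packed, index, total_size):
    """Restore the holes: rebuild the full sparse file with zero-filled gaps."""
    out = bytearray(total_size)
    for e in index:
        out[e.logical_offset:e.logical_offset + e.length] = \
            packed[e.physical_offset:e.physical_offset + e.length]
    return bytes(out)


packed, index = write_sparse([(0, b"head"), (1 << 10, b"tail")])
restored = read_sparse(packed, index, (1 << 10) + 4)
print(len(packed), len(restored))   # 8 bytes stored, 1028-byte sparse file restored
```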

  2. A Varian DynaLog file-based procedure for patient dose-volume histogram-based IMRT QA.

    PubMed

    Calvo-Ortega, Juan F; Teke, Tony; Moragues, Sandra; Pozo, Miquel; Casals-Farran, Joan

    2014-03-06

    In the present study, we describe a method based on the analysis of the dynamic MLC log files (DynaLog) generated by the controller of a Varian linear accelerator in order to perform patient-specific IMRT QA. The DynaLog files of a Varian Millennium MLC, recorded during an IMRT treatment, can be processed using a MATLAB-based code in order to generate the actual fluence for each beam and so recalculate the actual patient dose distribution using the Eclipse treatment planning system. The accuracy of the DynaLog-based dose reconstruction procedure was assessed by introducing ten intended errors to perturb the fluence of the beams of a reference plan such that ten subsequent erroneous plans were generated. In-phantom measurements with an ionization chamber (ion chamber) and planar dose measurements using an EPID system were performed to investigate the correlation between the measured dose changes and the expected ones detected by the reconstructed plans for the ten intended erroneous cases. Moreover, the method was applied to 20 cases of clinical plans for different locations (prostate, lung, breast, and head and neck). A dose-volume histogram (DVH) metric was used to evaluate the impact of the delivery errors in terms of dose to the patient. The ionometric measurements revealed a significant positive correlation (R² = 0.9993) between the variations of the dose induced in the erroneous plans with respect to the reference plan and the corresponding changes indicated by the DynaLog-based reconstructed plans. The EPID measurements showed that the accuracy of the DynaLog-based method to reconstruct the beam fluence was comparable with the dosimetric resolution of the portal dosimetry used in this work (3%/3 mm). The DynaLog-based reconstruction method described in this study is a suitable tool to perform a patient-specific IMRT QA. This method allows us to perform patient-specific IMRT QA by evaluating the result based on the DVH metric of the planning CT image (patient DVH-based IMRT QA).

  3. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  4. 46 CFR 97.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... voyage is completed, the master or person in charge shall file the logbook with the Officer in Charge.... Such logs or records are not filed with the Officer in Charge, Marine Inspection, but must be kept... logs for the period of validity of the vessel's certificate of inspection. [CGD 95-027, 61 FR 26007...

  5. SU-E-J-182: Reproducibility of Tumor Motion Probability Distribution Function in Stereotactic Body Radiation Therapy of Lung Using Real-Time Tumor-Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Park, S

    2015-06-15

    Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next-generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of the tumor. The purposes of this study were: 1. To develop a respiratory motion analysis tool using log files. 2. To evaluate the reproducibility of the tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumors. Methods: Seven patients with fiducial markers implanted close to the lung tumor were enrolled in this study. The positions of the fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co., JP) and recorded as two types of log files during the course of SBRT. For each patient, the tumor motion range and tumor motion PDFs in the left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as the Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient’s mean tumor motion range (Am). The change of Rm during the course of SBRT was also evaluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in the LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in the LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in the AP and SI directions. It also decreased slightly through the course of RT in the SI direction. Conclusion: We developed a respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.
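    The in-house software is not shown in the abstract; the sketch below only illustrates the reproducibility metric it describes, a Kullback-Leibler divergence between the first-fraction motion PDF and a later-fraction PDF built from binned marker positions. The synthetic positions, bin width, and divergence direction are assumptions.

```python
import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """KL(P || Q) for two discrete PDFs defined over the same bins."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))


# Synthetic SI marker positions (mm) for fraction 1 and a later fraction n.
rng = np.random.default_rng(0)
frac1 = rng.normal(0.0, 3.0, 20000)
fracn = rng.normal(0.5, 3.5, 20000)

bins = np.linspace(-15, 15, 61)                  # 0.5 mm bins, an assumption
pdf1, _ = np.histogram(frac1, bins=bins, density=True)
pdfn, _ = np.histogram(fracn, bins=bins, density=True)
print(f"R_n = KL(PDF1 || PDFn) = {kl_divergence(pdf1, pdfn):.3f}")
```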

  6. Analysis of Student Activity in Web-Supported Courses as a Tool for Predicting Dropout

    ERIC Educational Resources Information Center

    Cohen, Anat

    2017-01-01

    Persistence in learning processes is perceived as a central value; therefore, dropouts from studies are a prime concern for educators. This study focuses on the quantitative analysis of data accumulated on 362 students in three academic course website log files in the disciplines of mathematics and statistics, in order to examine whether student…

  7. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).

  8. 20 CFR 401.85 - Exempt systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... subsection (k)(2) of the Privacy Act: (A) The General Criminal Investigation Files, SSA; (B) The Criminal Investigations File, SSA; and, (C) The Program Integrity Case Files, SSA. (D) Civil and Administrative Investigative Files of the Inspector General, SSA/OIG. (E) Complaint Files and Log. SSA/OGC. (iii) Pursuant to...

  9. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2010-10-01 2010-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  10. 47 CFR 76.1706 - Signal leakage logs and repair records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the probable cause of the leakage. The log shall be kept on file for a period of two years and shall... 47 Telecommunication 4 2011-10-01 2011-10-01 false Signal leakage logs and repair records. 76.1706... leakage logs and repair records. Cable operators shall maintain a log showing the date and location of...

  11. SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Shimizu, E; Matsunaga, K

    2014-06-01

    Purpose: A successful VMAT plan delivery includes precise modulation of the dose rate, gantry rotation, and multi-leaf collimator (MLC) shapes. One of the main problems in plan quality assurance is that dosimetric errors associated with leaf-positional errors are difficult to analyze because they vary with the MU delivered and the leaf number. In this study, we calculated an integrated fluence error image (IFEI) from log files and evaluated plan quality over the area scanned by all MLC leaves and by individual leaves. Methods: The log file reported the expected and actual positions of the inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT on an Elekta Synergy. These data were imported into in-house software developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and the dose fraction at each time point. The IFEI was obtained by summing the absolute values of the differences between corresponding expected and actual fluence images. Results: In the area scanned by all MLC leaves, the average and root mean square (rms) of the IFEI were 2.5 and 3.6 MU, the areas with errors below 10, 5 and 3 MU were 98.5, 86.7 and 68.1%, and 95% of the area was covered with an error of less than 7.1 MU. In the areas scanned by individual MLC leaves, the average and rms values were 2.1–3.0 and 3.1–4.0 MU, the areas with errors below 10, 5 and 3 MU were 97.6–99.5, 81.7–89.5 and 51.2–72.8%, and 95% of the area was covered with an error of less than 6.6–8.2 MU. Conclusion: Analysis of the IFEI reconstituted from log files provided detailed information about the delivery over the area scanned by all MLC leaves and by individual leaves.
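    The in-house software is not reproduced; the sketch below only illustrates the IFEI construction described above: build expected and actual fluence rows from opposing leaf positions, weight the absolute difference by the dose delivered in each log sample, and accumulate. Array shapes, resolution, and the synthetic 1 mm lag are assumptions.

```python
import numpy as np


def fluence_from_leaves(left, right, width_mm=400, step_mm=1.0):
    """Binary open/closed fluence row per leaf pair from opposing leaf positions."""
    x = np.arange(-width_mm / 2, width_mm / 2, step_mm)
    return ((x[None, :] > left[:, None]) & (x[None, :] < right[:, None])).astype(float)


def integrated_fluence_error(samples):
    """Sum |expected - actual| fluence, weighted by the MU delivered per sample."""
    ifei = None
    for exp_l, exp_r, act_l, act_r, mu in samples:
        diff = np.abs(
            fluence_from_leaves(exp_l, exp_r) - fluence_from_leaves(act_l, act_r)
        ) * mu
        ifei = diff if ifei is None else ifei + diff
    return ifei


# One synthetic log sample for 20 leaf pairs: actual leaves lag the expected by 1 mm.
exp_l, exp_r = np.full(20, -50.0), np.full(20, 50.0)
samples = [(exp_l, exp_r, exp_l + 1.0, exp_r + 1.0, 2.0)]
ifei = integrated_fluence_error(samples)
print("max integrated fluence error (MU):", ifei.max())
```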

  12. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.

  13. A Query Analysis of Consumer Health Information Retrieval

    PubMed Central

    Hong, Yi; de la Cruz, Norberto; Barnas, Gary; Early, Eileen; Gillis, Rick

    2002-01-01

    The log files of MCW HealthLink web site were analyzed to study users' needs for consumer health information and get a better understanding of the health topics users are searching for, the paths users usually take to find consumer health information and the way to improve search effectiveness.

  14. SU-F-T-230: A Simple Method to Assess Accuracy of Dynamic Wave Arc Irradiation Using An Electronic Portal Imaging Device and Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirashima, H; Miyabe, Y; Yokota, K

    2016-06-15

    Purpose: The Dynamic Wave Arc (DWA) technique, in which the multi-leaf collimator (MLC) and gantry/ring move simultaneously along a predefined non-coplanar trajectory, has been developed on the Vero4DRT. The aim of this study is to develop a simple method for quality assurance of DWA delivery using electronic portal imaging device (EPID) measurements and log file analysis. Methods: The Vero4DRT has an EPID on the beam axis, the resolution of which is 0.18 mm/pixel at the isocenter plane. EPID images were acquired automatically. To verify the detection accuracy of the MLC position from EPID images, the MLC position with intentional errors was assessed. Tests were designed considering three factors: (1) accuracy of the MLC position; (2) dose output consistency with variable dose rate (160–400 MU/min), gantry speed (2.4–6°/s), and ring speed (0.5–2.5°/s); and (3) MLC speed (1.6–4.2 cm/s). All the patterns were delivered to the EPID and compared with those obtained with a stationary radiation beam at a 0° gantry angle. The irradiation log, including the MLC position and gantry/ring angle, was recorded simultaneously. To perform independent checks of the machine accuracy, the MLC position and gantry/ring angle were assessed using log files. Results: An intentional error of 0.1 mm could be detected by the EPID, which is smaller than the EPID pixel size. The dose outputs under different conditions of dose rate, gantry/ring speed, and MLC speed showed good agreement, with a root mean square (RMS) error of 0.76%. The RMS error between the detected and recorded data was 0.1 mm for the MLC position, 0.12° for the gantry angle, and 0.07° for the ring angle. Conclusion: The MLC position and dose outputs under variable conditions during DWA irradiation can be easily verified using EPID measurements and log file analysis. The proposed method is useful for routine verification. This research is (partially) supported by the Practical Research for Innovative Cancer Control (15Ack0106151h0001) from the Japan Agency for Medical Research and Development, AMED. Authors Takashi Mizowaki and Masahiro Hiraoka have a consultancy agreement with Mitsubishi Heavy Industries, Ltd., Japan.

  15. Replication in the Harp File System

    DTIC Science & Technology

    1981-07-01


  16. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  17. 17 CFR 274.402 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 274.402 Section 274.402 Commodity and Securities Exchanges... Forms for Electronic Filing § 274.402 Form ID, uniform application for access codes to file on EDGAR..., filing agent or training agent to log on to the EDGAR system, submit filings, and change its CCC. (d...

  18. Paleomagnetic dating: Methods, MATLAB software, example

    NASA Astrophysics Data System (ADS)

    Hnatyshin, Danny; Kravchinsky, Vadim A.

    2014-09-01

    A MATLAB software tool has been developed to provide an easy-to-use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user-defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an Excel file. The data can also be plotted using a variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding, and metamorphism, and is in the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed the timing of large-scale thermal or chemical events to be determined. Paleomagnetic analysis of gold-mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The paleomagnetic direction obtained from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major and semi-minor axes of the 95% confidence ellipse being 16.6° and 15.9°, respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2 +32.0/−31.0 Ma and is the youngest well-defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and likely had a role in determining the present-day state of mineralization seen at the deposit.

  19. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file

    NASA Astrophysics Data System (ADS)

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-01

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.

  20. Verification of respiratory-gated radiotherapy with new real-time tumour-tracking radiotherapy system using cine EPID images and a log file.

    PubMed

    Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko

    2017-02-21

    A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were  ⩽3 mm and  ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.

  1. A high-speed scintillation-based electronic portal imaging device to quantitatively characterize IMRT delivery.

    PubMed

    Ranade, Manisha K; Lynch, Bart D; Li, Jonathan G; Dempsey, James F

    2006-01-01

    We have developed an electronic portal imaging device (EPID) employing a fast scintillator and a high-speed camera. The device is designed to accurately and independently characterize the fluence delivered by a linear accelerator during intensity modulated radiation therapy (IMRT) with either step-and-shoot or dynamic multileaf collimator (MLC) delivery. Our aim is to accurately obtain the beam shape and fluence of all segments delivered during IMRT, in order to study the nature of discrepancies between the plan and the delivered doses. A commercial high-speed camera was combined with a terbium-doped gadolinium-oxy-sulfide (Gd2O2S:Tb) scintillator to form an EPID for the unaliased capture of two-dimensional fluence distributions of each beam in an IMRT delivery. The high speed EPID was synchronized to the accelerator pulse-forming network and gated to capture every possible pulse emitted from the accelerator, with an approximate frame rate of 360 frames-per-second (fps). A 62-segment beam from a head-and-neck IMRT treatment plan requiring 68 s to deliver was recorded with our high speed EPID producing approximately 6 Gbytes of imaging data. The EPID data were compared with the MLC instruction files and the MLC controller log files. The frames were binned to provide a frame rate of 72 fps with a signal-to-noise ratio that was sufficient to resolve leaf positions and segment fluence. The fractional fluence from the log files and EPID data agreed well. An ambiguity in the motion of the MLC during beam on was resolved. The log files reported leaf motions at the end of 33 of the 42 segments, while the EPID observed leaf motions in only 7 of the 42 segments. The static IMRT segment shapes observed by the high speed EPID were in good agreement with the shapes reported in the log files. The leaf motions observed during beam-on for step-and-shoot delivery were not temporally resolved by the log files.
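
    The record above reports binning the roughly 360 fps raw frames down to 72 fps to recover an adequate signal-to-noise ratio, which amounts to summing groups of five consecutive frames. A minimal NumPy sketch of that binning step follows; the array shapes and data are hypothetical, not the authors' acquisition pipeline.

      import numpy as np

      def bin_frames(frames: np.ndarray, factor: int = 5) -> np.ndarray:
          """Sum consecutive groups of `factor` frames, trading frame rate for SNR.

          frames: shape (n_frames, rows, cols), e.g. raw ~360 fps EPID frames.
          Returns shape (n_frames // factor, rows, cols), e.g. ~72 fps output.
          """
          n = (frames.shape[0] // factor) * factor          # drop incomplete trailing group
          return frames[:n].reshape(-1, factor, *frames.shape[1:]).sum(axis=1)

      # Synthetic stand-in for one second of acquisition.
      raw = np.random.poisson(lam=5.0, size=(360, 64, 64)).astype(np.float32)
      print(bin_frames(raw, factor=5).shape)                # (72, 64, 64)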

  2. Geologic cross section E-E' through the Appalachian basin from the Findlay arch, Wood County, Ohio, to the Valley and Ridge province, Pendleton County, West Virginia: Chapter E.4.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.

  3. VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.

    2014-02-01

    The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y]=[log ε(X)−log ε(Y)]star − [log ε(X)−log ε(Y)]⊙ with log ε(X)=12+log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H]=log N(Li)star−log N(Li)⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li for which we adopt our derived values: log ε(Li)⊙=1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).
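
    For readability, the bracket notation quoted in the record can be typeset as the standard abundance-ratio definitions; this is only a restatement of the formulas above, not additional catalogue content.

      \[
        \log\epsilon(\mathrm{X}) = 12 + \log\frac{N(\mathrm{X})}{N(\mathrm{H})},
        \qquad
        [\mathrm{X}/\mathrm{Y}] =
          \bigl[\log\epsilon(\mathrm{X})-\log\epsilon(\mathrm{Y})\bigr]_{\star}
          - \bigl[\log\epsilon(\mathrm{X})-\log\epsilon(\mathrm{Y})\bigr]_{\odot},
        \qquad
        [\mathrm{Li}/\mathrm{H}] = \log N(\mathrm{Li})_{\star} - \log N(\mathrm{Li})_{\odot}.
      \]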

  4. 78 FR 40474 - Sustaining Power Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  5. 78 FR 34371 - Longfellow Wind, LLC: Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-07

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  6. The new idea of transporting tailings-logs in tailings slurry pipeline and the innovation of technology of mining waste-fill method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin Yu; Wang Fuji; Tao Yan

    2000-07-01

    This paper introduced a new idea of transporting mine tailings-logs in mine tailings-slurry pipeline and a new technology of mine cemented filing of tailings-logs with tailings-slurry. The hydraulic principles, the compaction of tailings-logs and the mechanic function of fillbody of tailings-logs cemented by tailings-slurry have been discussed.

  7. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from 1. digitized suspended-sediment-concentration traces, 2. linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and 3. nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.
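
    Option 2 above, linear interpolation between log-transformed instantaneous concentrations recorded at unequal time intervals, is easy to illustrate. The sketch below is a hypothetical stand-in, not SEDCALC itself; the times, concentrations, and output interval are invented.

      import numpy as np

      def interp_log_concentration(t_obs, c_obs, t_new):
          """Interpolate suspended-sediment concentration linearly in log space.

          t_obs: sample times (unequal intervals), c_obs: concentrations (mg/L, > 0),
          t_new: equally spaced output times. Interpolating log10(c) and back-transforming
          makes the concentration vary exponentially between samples.
          """
          log_c = np.interp(t_new, np.asarray(t_obs, float), np.log10(np.asarray(c_obs, float)))
          return 10.0 ** log_c

      t_obs = [0.0, 2.5, 7.0, 12.0]                 # hours
      c_obs = [40.0, 150.0, 900.0, 120.0]           # mg/L
      t_new = np.arange(0.0, 12.1, 1.0)             # equal-interval output times
      print(interp_log_concentration(t_obs, c_obs, t_new).round(1))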

  8. TraceContract

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Barringer, Howard

    2012-01-01

    TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
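
    TraceContract itself is a Scala DSL, so the following is only a language-neutral illustration of the underlying idea (a data-parameterized monitor that consumes a sequence of events and reports violations), written as a hypothetical Python sketch rather than the actual API.

      # Hypothetical illustration of trace monitoring (not the TraceContract API):
      # every "open" event must be matched by a "close" before the trace ends.
      def monitor_open_close(trace):
          open_resources, violations = set(), []
          for kind, resource in trace:                  # events as (name, data) pairs
              if kind == "open":
                  open_resources.add(resource)
              elif kind == "close":
                  if resource not in open_resources:
                      violations.append(f"close without open: {resource}")
                  open_resources.discard(resource)
          violations.extend(f"never closed: {r}" for r in sorted(open_resources))
          return violations

      trace = [("open", "a.log"), ("open", "b.log"), ("close", "a.log")]
      print(monitor_open_close(trace))                  # ['never closed: b.log']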

  9. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, the GDAL Python bindings, and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from the DEM, effective precipitation, runoff, lithology, and land use. All these parameters can be supplied by the client from other WFS services or by uploading and processing the data on the server. The user can choose to have the first and second derivatives created from the DEM automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using the NetCDF file format. The model is run in a monthly time step, and for each time step all the parameter values and the a priori, conditional, and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
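
    As a loose illustration of the monthly loop described above (a posterior probability computed at each time step, everything written to a log file, and the a priori probabilities refreshed whenever a landslide is positively identified), here is a hypothetical Python sketch. It is not the authors' ArcGIS/ArcPy service, and the likelihood-ratio inputs are invented.

      # Hypothetical monthly Bayesian update loop (not the authors' service):
      # prior odds are multiplied by that month's evidence likelihood ratio,
      # the result is logged, and a positively identified landslide promotes
      # the posterior to the next step's a priori probability.
      def run_monthly_model(prior, monthly_evidence, log_path="model_run.log"):
          with open(log_path, "w") as log:
              for month, (likelihood_ratio, landslide_observed) in enumerate(monthly_evidence, 1):
                  odds = (prior / (1.0 - prior)) * likelihood_ratio
                  posterior = odds / (1.0 + odds)
                  log.write(f"month={month} prior={prior:.3f} posterior={posterior:.3f}\n")
                  if landslide_observed:                # validation against an observed event
                      prior = posterior                 # new a priori probability recorded
          return prior

      # (likelihood ratio for the month's factors, landslide positively identified?)
      print(run_monthly_model(0.05, [(1.5, False), (4.0, True), (0.8, False)]))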

  10. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  11. 78 FR 54888 - Guzman Power Markets, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ... the eFiling link to log on and submit the intervention or protests. Persons unable to file... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... electronic service, persons with Internet access who will eFile a document and/or be listed as a contact for...

  12. 17 CFR 239.63 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for access codes to file on EDGAR. 239.63 Section 239.63 Commodity and Securities Exchanges SECURITIES... Statements § 239.63 Form ID, uniform application for access codes to file on EDGAR. Form ID must be filed by... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification...

  13. 78 FR 28835 - Salton Sea Power Generation Company; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    .... Select the eFiling link to log on and submit the intervention or protests. Persons unable to file... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... facilitate electronic service, persons with Internet access who will eFile a document and/or be listed as a...

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wijesooriya, K; Seitter, K; Desai, V

    Purpose: To present our single-institution experience of catching errors with trajectory log file analysis. The reported causes of failures, probability of occurrence (O), severity of effects (S), and probability of the failures remaining undetected (D) could be added to guidelines for FMEA analysis. Methods: From March 2013 to March 2014, 19569 patient treatment fields/arcs were analyzed. This work includes checking all 131 treatment delivery parameters for all patients, all treatment sites, and all treatment delivery fractions. TrueBeam trajectory log files for all treatment field types as well as all imaging types were accessed, read in every 20 ms, and every control point (a total of 37 million parameters) was compared to the physician-approved plan in the planning system. Results: Couch angle outlier occurrence: N = 327, range = −1.7 – 1.2 deg; gantry angle outlier occurrence: N = 59, range = 0.09 – 5.61 deg; collimator angle outlier occurrence: N = 13, range = −0.2 – 0.2 deg. VMAT cases have slightly larger variations in mechanical parameters. MLC: 3D single-control-point fields have a maximum deviation of 0.04 mm, 39 step-and-shoot IMRT cases have MLC deviations of −0.3 – 0.5 mm, and all (1286) VMAT cases have deviations of −0.9 – 0.7 mm. Two possible serious errors were found: 1) a 4 cm isocenter shift for the PA beam of an AP-PA pair, under-dosing a portion of the PTV by 25%; 2) delivery with MLC leaves abutted behind the jaws as opposed to at the midline as planned, leading to an under-dosing of a small volume of the PTV by 25% by just the boost plan. Owing to their origin, neither of these errors could have been detected by pre-treatment verification. Conclusion: Performing trajectory log file analysis can catch typically undetected errors and avoid potentially adverse incidents.
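
    The comparison at the heart of the workflow above is straightforward to sketch: every logged sample of each delivery parameter is compared against its planned value and flagged when the deviation exceeds a tolerance. The snippet below is a hypothetical simplification (a static field, with invented parameter names and tolerances), not the authors' software.

      # Hypothetical tolerance check on logged delivery parameters (illustrative
      # field names and tolerances; not the authors' 131-parameter implementation).
      TOLERANCES = {"gantry_deg": 1.0, "collimator_deg": 1.0, "couch_deg": 1.0, "mlc_mm": 0.5}

      def find_outliers(delivered, planned, tolerances=TOLERANCES):
          """delivered: list of per-sample dicts (e.g. one per 20 ms); planned: dict of planned values."""
          outliers = []
          for i, sample in enumerate(delivered):
              for name, tol in tolerances.items():
                  deviation = sample[name] - planned[name]
                  if abs(deviation) > tol:
                      outliers.append((i, name, round(deviation, 2)))
          return outliers

      planned = {"gantry_deg": 180.0, "collimator_deg": 30.0, "couch_deg": 0.0, "mlc_mm": 12.0}
      delivered = [{"gantry_deg": 180.1, "collimator_deg": 30.0, "couch_deg": -1.7, "mlc_mm": 12.0}]
      print(find_outliers(delivered, planned))          # [(0, 'couch_deg', -1.7)]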

  15. 77 FR 55817 - Delek Crude Logistics, LLC; Notice of Petition for Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests... number. eFiling is encouraged. More detailed information relating to filing requirements, interventions...'') grant a temporary waiver of the filing and reporting requirements of sections 6 and 201 of the...

  16. Geologic cross section D-D' through the Appalachian basin from the Findlay arch, Sandusky County, Ohio, to the Valley and Ridge province, Hardy County, West Virginia: Chapter E.4.1 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.

  17. Parallel checksumming of data chunks of a shared data object using a log-structured file system

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-09-06

    Checksum values are generated and used to verify the data integrity. A client executing in a parallel computing system stores a data chunk to a shared data object on a storage node in the parallel computing system. The client determines a checksum value for the data chunk; and provides the checksum value with the data chunk to the storage node that stores the shared object. The data chunk can be stored on the storage node with the corresponding checksum value as part of the shared object. The storage node may be part of a Parallel Log-Structured File System (PLFS), and the client may comprise, for example, a Log-Structured File System client on a compute node or burst buffer. The checksum value can be evaluated when the data chunk is read from the storage node to verify the integrity of the data that is read.
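
    The write/read pattern described above can be sketched in a few lines. The sketch below is hypothetical and file-based, not the patented PLFS implementation, and the ".meta" sidecar convention is invented for illustration.

      import hashlib, json

      # Hypothetical sketch: store each data chunk together with its checksum,
      # and verify the checksum when the chunk is read back.
      def write_chunk(path, chunk):
          digest = hashlib.sha256(chunk).hexdigest()
          with open(path, "wb") as f:
              f.write(chunk)
          with open(path + ".meta", "w") as f:          # invented sidecar convention
              json.dump({"sha256": digest, "length": len(chunk)}, f)

      def read_chunk(path):
          with open(path, "rb") as f:
              chunk = f.read()
          with open(path + ".meta") as f:
              meta = json.load(f)
          if hashlib.sha256(chunk).hexdigest() != meta["sha256"]:
              raise IOError(f"checksum mismatch for {path}")
          return chunk

      write_chunk("chunk0.dat", b"example data chunk")
      print(len(read_chunk("chunk0.dat")))              # 18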

  18. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  19. 15 CFR 762.3 - Records exempt from recordkeeping requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (2) Special export file list; (3) Vessel log from freight forwarder; (4) Inspection certificate; (5... form; (12) Financial hold form; (13) Export parts shipping problem form; (14) Draft number log; (15) Expense invoice mailing log; (16) Financial status report; (17) Bank release of guarantees; (18) Cash...

  20. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm displays a higher total parallel access probability than those of other algorithms by approximately 10-15% and that the performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
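
    The first step described above, deriving an access-correlation matrix from historical log information, can be sketched as a co-access count over sessions; the heuristic placement algorithm itself is not reproduced here, and the session contents and tile names are invented.

      from collections import defaultdict
      from itertools import combinations

      # Hypothetical co-access count from an access log: each session lists the
      # image tiles requested together; frequently co-accessed tiles are the
      # candidates for placement on different storage nodes (parallel I/O).
      def access_correlation(sessions):
          corr = defaultdict(int)
          for tiles in sessions:
              for a, b in combinations(sorted(set(tiles)), 2):
                  corr[(a, b)] += 1
          return corr

      sessions = [["t1", "t2", "t3"], ["t2", "t3"], ["t1", "t3"]]
      matrix = access_correlation(sessions)
      print(matrix[("t2", "t3")])                       # 2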

  1. An analysis of image storage systems for scalable training of deep neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Young, Steven R; Patton, Robert M

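    This study presents a principled empirical evaluation of image storage systems for training deep neural networks. We employ the Caffe deep learning framework to train neural network models for three different data sets, MNIST, CIFAR-10, and ImageNet. While training the models, we evaluate five different options to retrieve training image data: (1) PNG-formatted image files on local file system; (2) pushing pixel arrays from image files into a single HDF5 file on local file system; (3) in-memory arrays to hold the pixel arrays in Python and C++; (4) loading the training data into LevelDB, a log-structured merge tree based key-value storage; and (5) loading the training data into LMDB, a B+tree based key-value storage. The experimental results quantitatively highlight the disadvantage of using normal image files on local file systems to train deep neural networks and demonstrate reliable performance with key-value storage based storage systems. When training a model on the ImageNet dataset, the image file option was more than 17 times slower than the key-value storage option. Along with measurements on training time, this study provides in-depth analysis on the cause of performance advantages/disadvantages of each back-end to train deep neural networks. We envision the provided measurements and analysis will shed light on the optimal way to architect systems for training neural networks in a scalable manner.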

  2. Cooperative storage of shared files in a parallel computing system with dynamic block size

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2015-11-10

    Improved techniques are provided for parallel writing of data to a shared object in a parallel computing system. A method is provided for storing data generated by a plurality of parallel processes to a shared object in a parallel computing system. The method is performed by at least one of the processes and comprises: dynamically determining a block size for storing the data; exchanging a determined amount of the data with at least one additional process to achieve a block of the data having the dynamically determined block size; and writing the block of the data having the dynamically determined block size to a file system. The determined block size comprises, e.g., a total amount of the data to be stored divided by the number of parallel processes. The file system comprises, for example, a log structured virtual parallel file system, such as a Parallel Log-Structured File System (PLFS).
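
    The block-size rule quoted above, the total amount of data to be stored divided by the number of parallel processes, amounts to a one-line computation; the byte counts and process count in this sketch are hypothetical.

      # Minimal arithmetic sketch of the dynamic block-size rule stated above.
      total_bytes = 7 * 1024**2 + 10                # data produced collectively (hypothetical)
      num_procs = 4
      block_size = total_bytes // num_procs         # 1835010 bytes exchanged/written per process
      remainder = total_bytes - block_size * num_procs
      print(block_size, remainder)                  # a designated writer would absorb the remainder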

  3. VizieR Online Data Catalog: GAMA. Stellar mass budget (Moffett+, 2016)

    NASA Astrophysics Data System (ADS)

    Moffett, A. J.; Lange, R.; Driver, S. P.; Robotham, A. S. G.; Kelvin, L. S.; Alpaslan, M.; Andrews, S. K.; Bland-Hawthorn, J.; Brough, S.; Cluver, M. E.; Colless, M.; Davies, L. J. M.; Holwerda, B. W.; Hopkins, A. M.; Kafle, P. R.; Liske, J.; Meyer, M.

    2018-04-01

    Using the recently expanded Galaxy and Mass Assembly (GAMA) survey phase II visual morphology sample and the large-scale bulge and disc decomposition analysis of Lange et al. (2016MNRAS.462.1470L), we derive new stellar mass function fits to galaxy spheroid and disc populations down to log(M*/M⊙)=8. (1 data file).

  4. Learner Characteristics Predict Performance and Confidence in E-Learning: An Analysis of User Behavior and Self-Evaluation

    ERIC Educational Resources Information Center

    Jeske, Debora; Roßnagell, Christian Stamov; Backhaus, Joy

    2014-01-01

    We examined the role of learner characteristics as predictors of four aspects of e-learning performance, including knowledge test performance, learning confidence, learning efficiency, and navigational effectiveness. We used both self reports and log file records to compute the relevant statistics. Regression analyses showed that both need for…

  5. 78 FR 70299 - Capacity Markets Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  6. 78 FR 59923 - Buffalo Dunes Wind Project, LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  7. 78 FR 28833 - Lighthouse Energy Group, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  8. 78 FR 29366 - Wheelabrator Baltimore, LP; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  9. 77 FR 64978 - Sunbury Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  10. 78 FR 62300 - Burgess Biopower LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  11. 78 FR 75561 - South Bay Energy Corp.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 78 FR 28833 - Ebensburg Power Company; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 78 FR 72673 - Yellow Jacket Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-03

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 78 FR 44557 - Guttman Energy Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 78 FR 68052 - Covanta Haverhill Association, LP; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  16. 78 FR 49506 - Source Power & Gas LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 77 FR 64980 - Noble Americas Energy Solutions LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE...://www.ferc.gov . To facilitate electronic service, persons with Internet access who will eFile a... using the eRegistration link. Select the eFiling link to log on and submit the intervention or protests...

  18. 78 FR 46939 - DWP Energy Holdings, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  19. 78 FR 28833 - CE Leathers Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 78 FR 59014 - Lakeswind Power Partners, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-25

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street, NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  1. 78 FR 75560 - Green Current Solutions, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  2. 77 FR 64980 - Collegiate Clean Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  3. 77 FR 64977 - Frontier Utilities New York LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  4. 78 FR 62299 - West Deptford Energy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-15

    ... protest should file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  5. 78 FR 52913 - Allegany Generating Station LLC; Supplemental Notice That Initial Market-Based Rate Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  6. SedMob: A mobile application for creating sedimentary logs in the field

    NASA Astrophysics Data System (ADS)

    Wolniewicz, Pawel

    2014-05-01

    SedMob is an open-source, mobile software package for creating sedimentary logs, targeted for use in tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in the log as well as export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog: a free multiplatform package for drawing graphic logs that runs on PC computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.

  7. Model Analyst’s Toolkit User Guide, Version 7.1.0

    DTIC Science & Technology

    2015-08-01

    Help > About)  Environment details ( operating system )  metronome.log file, located in your MAT 7.1.0 installation folder  Any log file that...requirements to run the Model Analyst’s Toolkit:  Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed...application icon on your desktop  Create a Quick Launch icon – Creates a MAT application icon on the taskbar for operating systems released

  8. 18 CFR 270.304 - Tight formation gas.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... determination that natural gas is tight formation gas must file with the jurisdictional agency an application... formation; (d) A complete copy of the well log, including the log heading identifying the designated tight...

  9. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files are input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters are grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
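
    The grouping step described above, merging x-ray pulses with similar exposure parameters so that one Monte Carlo run covers each group, can be sketched as follows. The field names, bin widths, and pulse values are illustrative assumptions, not the authors' MATLAB implementation.

      from collections import defaultdict

      # Hypothetical grouping of logged x-ray pulses by similar geometry/technique so
      # that only one Monte Carlo run per group is needed, with the group's summed mAs.
      def group_pulses(pulses, angle_bin=5.0, kvp_bin=5.0):
          groups = defaultdict(lambda: {"n": 0, "mas": 0.0})
          for p in pulses:        # p: dict with primary_angle (deg), kvp, mas, field_size_cm
              key = (round(p["primary_angle"] / angle_bin) * angle_bin,
                     round(p["kvp"] / kvp_bin) * kvp_bin,
                     p["field_size_cm"])
              groups[key]["n"] += 1
              groups[key]["mas"] += p["mas"]
          return groups           # one definition file would then be written per key

      pulses = [{"primary_angle": 1.0, "kvp": 79, "mas": 0.8, "field_size_cm": 20},
                {"primary_angle": 2.0, "kvp": 81, "mas": 0.9, "field_size_cm": 20}]
      print(group_pulses(pulses))                       # both pulses fall into one group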

  10. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java, and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA's next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.

  11. 78 FR 28834 - Salton Sea Power L.L.C.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  12. 78 FR 28835 - Del Ranch Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  13. 78 FR 28835 - Patua Project LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  14. 78 FR 75561 - Great Bay Energy V, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  15. 77 FR 64981 - Homer City Generation, L.P.; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  16. 77 FR 69819 - Cirrus Wind 1, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  17. 77 FR 64979 - Great Bay Energy IV, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 77 FR 53195 - H.A. Wagner LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  19. 78 FR 59923 - Mammoth Three LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  20. 78 FR 61945 - Tuscola Wind II, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-07

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  1. 77 FR 69819 - QC Power Strategies Fund LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  2. 78 FR 75561 - Astral Energy LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... file with the Federal Energy Regulatory Commission, 888 First Street NE., Washington, DC 20426, in... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  3. An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files

    DOE PAGES

    Chan, Anthony; Gropp, William; Lusk, Ewing

    2008-01-01

    A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the trace files at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
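
    The essential access pattern described above, locating an arbitrary time window at a cost roughly proportional to the events it contains, can be illustrated with a simple sorted index and binary search. This hypothetical sketch is not the hierarchical file format itself.

      import bisect

      # Hypothetical interval lookup over trace records: a sorted (timestamp, payload)
      # index lets a viewer find all events in a time window by binary search
      # instead of scanning the whole file.
      class TraceIndex:
          def __init__(self, records):              # records: (timestamp, payload) pairs
              self.records = sorted(records)
              self.times = [t for t, _ in self.records]

          def window(self, t0, t1):
              lo = bisect.bisect_left(self.times, t0)
              hi = bisect.bisect_right(self.times, t1)
              return self.records[lo:hi]            # cost ~ log N plus events returned

      idx = TraceIndex([(0.1, "send"), (0.4, "recv"), (2.0, "barrier"), (3.5, "send")])
      print(idx.window(0.3, 2.5))                   # [(0.4, 'recv'), (2.0, 'barrier')]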

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stathakis, S; Defoor, D; Linden, P

    Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The hyperterminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the hyperterminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square errors (RMS) and cumulative MLC travel distance per motor. An in-house MATLAB code was used to analyze all dynalog files, record RMS errors and calculate the distance each MLC traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 included MLC leaf motor changes, 39 T-nut replacements, and 84 MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 recorded 73% of all interventions, while the same leaves in the B bank accounted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500 m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: MLC dynalog file analysis software was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC travels, the RMS, and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and, with proper and timely intervention, reduce the downtime of the linac during clinical hours.
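
    The two per-leaf statistics mentioned above, RMS position error and cumulative travel distance, are simple to compute from dynalog-style position arrays. The sketch below uses synthetic data and is not the authors' MATLAB tool.

      import numpy as np

      # Hypothetical per-leaf statistics from dynalog-style arrays: RMS of the
      # planned-vs-actual position error, and cumulative travel distance per leaf.
      def leaf_statistics(planned_mm, actual_mm):
          """planned_mm, actual_mm: arrays of shape (n_samples, n_leaves), positions in mm."""
          rms_error = np.sqrt(np.mean((planned_mm - actual_mm) ** 2, axis=0))   # per-leaf RMS
          travel = np.sum(np.abs(np.diff(actual_mm, axis=0)), axis=0)           # mm travelled per leaf
          return rms_error, travel

      rng = np.random.default_rng(0)
      planned = np.cumsum(rng.normal(0.0, 0.2, size=(500, 120)), axis=0)        # synthetic trajectories
      actual = planned + rng.normal(0.0, 0.05, size=planned.shape)
      rms, travel = leaf_statistics(planned, actual)
      print(round(float(rms.max()), 3), round(float(travel.mean()), 1))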

  5. Sight Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-01

    Enables applications to emit log information into an output file and produce a structured visual summary of the log data, as well as various statistical analyses of it. This makes it easier for developers to understand the behavior of their applications.

  6. 75 FR 60122 - Notice of Public Information Collection(s) Being Reviewed by the Federal Communications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... the respondents, including the use of automated collection techniques or other forms of information...: OMB Control Number: 3060-0360. Title: Section 80.409, Station Logs. Form No.: N/A. Type of Review... for filing suits upon such claims. Section 80.409(d), Ship Radiotelegraph Logs: Logs of ship stations...

  7. 78 FR 28834 - Elmore Company; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  8. 78 FR 49507 - OriGen Energy LLC ; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... securities and assumptions of liability. Any person desiring to intervene or to protest should file with the... with Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log...

  9. 78 FR 49507 - ORNI 47 LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  10. 77 FR 64981 - BITHENERGY, Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-24

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  11. 78 FR 40473 - eBay Inc.; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for Blanket...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-05

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  12. 78 FR 28832 - CalEnergy, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-16

    ... assumptions of liability. Any person desiring to intervene or to protest should file with the Federal Energy... access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and submit...

  13. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and at www.fdsys.gov. ...

  14. 17 CFR 259.602 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...)—allows a filer, filing agent or training agent to log on to the EDGAR system, submit filings, and change... agent to change its Password. [69 FR 22710, Apr. 26, 2004] Editorial Note: For Federal Register... section of the printed volume and on GPO Access. ...

  15. Logs Perl Module

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owen, R. K.

    2007-04-04

    A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
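
    The Perl module itself is not shown in the record, so as a rough analogue, the hypothetical Python sketch below parses PBS accounting-log lines, assuming the common "date;record-type;id;key=value ..." layout, and filters on record type in the spirit of the filtering the module is described as providing.

      from datetime import datetime

      # Hypothetical parser for PBS accounting-log lines (assumed layout), filtered
      # by record type; not the Perl module described above.
      def parse_pbs_records(lines, record_types=("E",)):        # "E": job-end records
          for line in lines:
              stamp, rtype, ident, text = line.rstrip("\n").split(";", 3)
              if rtype not in record_types:
                  continue
              fields = dict(kv.split("=", 1) for kv in text.split() if "=" in kv)
              yield {"time": datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S"),
                     "type": rtype, "id": ident, **fields}

      sample = ['04/04/2007 10:00:00;E;123.server;user=alice resources_used.walltime=00:10:00']
      for rec in parse_pbs_records(sample):
          print(rec["id"], rec["resources_used.walltime"])      # 123.server 00:10:00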

  16. Expansion of the roadway reference log : KYSPR-99-201.

    DOT National Transportation Integrated Search

    2000-05-01

    The objectives of this study were to: 1) expand the current route log to include milepoints for all intersections on state maintained roads and 2) recommend a procedure for establishing milepoints and maintaining the file with up-to-date information....

  17. 78 FR 52524 - Sunoco Pipeline LP; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... link to log on and submit the intervention or protests. Persons unable to file electronically should... described in their petition. Any person desiring to intervene or to protest in this proceedings must file in... service, persons with Internet access who will eFile a document and/or be listed as a contact for an...

  18. 78 FR 62349 - Sunoco Pipeline L.P.; Notice of Petition for Declaratory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-18

    ... to log on and submit the intervention or protests. Persons unable to file electronically should... petition. Any person desiring to intervene or to protest in this proceeding must file in accordance with..., persons with Internet access who will eFile a document and/or be listed as a contact for an intervenor...

  19. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and on GPO Access. ...

  20. 78 FR 77155 - Grant Program To Assess, Evaluate, and Promote Development of Tribal Energy and Mineral Resources

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... through DEMD's in-house databases; Well log interpretation, including correlation of formation tops.... Files must have descriptive file names to help DEMD quickly locate specific components of the proposal...

  1. An analysis of technology usage for streaming digital video in support of a preclinical curriculum.

    PubMed

    Dev, P; Rindfleisch, T C; Kush, S J; Stringer, J R

    2000-01-01

    Usage of streaming digital video of lectures in preclinical courses was measured by analysis of the data in the log file maintained on the web server. We observed that students use the video when it is available. They do not use it to replace classroom attendance, but rather for review before examinations or when a class has been missed. Usage of video has not increased significantly for any course within the 18-month duration of this project.
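
    The record gives only the findings, not the analysis code; as a minimal sketch, assuming the server wrote standard Common Log Format entries and that lecture videos were served under a single URL prefix (both assumptions, with a hypothetical /video/ prefix), monthly video requests could be tallied like this:

      import re
      from collections import Counter

      # Common Log Format: host ident user [date] "METHOD /path HTTP/x.y" status bytes
      CLF = re.compile(r'\[(?P<day>[^:]+):[^\]]+\] "(?:GET|HEAD) (?P<path>\S+)')

      def monthly_video_hits(log_path, video_prefix="/video/"):
          """Count requests for streaming-video URLs per month (hypothetical prefix)."""
          hits = Counter()
          with open(log_path) as fh:
              for line in fh:
                  m = CLF.search(line)
                  if m and m.group("path").startswith(video_prefix):
                      day, month, year = m.group("day").split("/")  # e.g. 01/Jan/2000
                      hits[f"{year}-{month}"] += 1
          return hits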

  2. 20 CFR 658.414 - Referral of non-JS-related complaints.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... applicable, were referred on the complaint log specified in § 658.410(c)(1). The JS official shall also prepare and keep the file specified in § 658.410(c)(3) for the complaints filed pursuant to paragraph (a...

  3. Web-based analysis and publication of flow cytometry experiments.

    PubMed

    Kotecha, Nikesh; Krutzik, Peter O; Irish, Jonathan M

    2010-07-01

    Cytobank is a Web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a Web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permission, from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at http://www.cytobank.org. (c) 2010 by John Wiley & Sons, Inc.

  4. Web-Based Analysis and Publication of Flow Cytometry Experiments

    PubMed Central

    Kotecha, Nikesh; Krutzik, Peter O.; Irish, Jonathan M.

    2014-01-01

    Cytobank is a web-based application for storage, analysis, and sharing of flow cytometry experiments. Researchers use a web browser to log in and use a wide range of tools developed for basic and advanced flow cytometry. In addition to providing access to standard cytometry tools from any computer, Cytobank creates a platform and community for developing new analysis and publication tools. Figure layouts created on Cytobank are designed to allow transparent access to the underlying experiment annotation and data processing steps. Since all flow cytometry files and analysis data are stored on a central server, experiments and figures can be viewed or edited by anyone with the proper permissions from any computer with Internet access. Once a primary researcher has performed the initial analysis of the data, collaborators can engage in experiment analysis and make their own figure layouts using the gated, compensated experiment files. Cytobank is available to the scientific community at www.cytobank.org PMID:20578106

  5. 78 FR 49506 - E.ON Global Commodities North America LLC; Supplemental Notice That Initial Market-Based Rate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-14

    ... intervene or to protest should file with the Federal Energy Regulatory Commission, 888 First Street NE... . To facilitate electronic service, persons with Internet access who will eFile a document and/or be...Registration link. Select the eFiling link to log on and submit the intervention or protests. Persons unable to...

  6. 78 FR 63977 - Enable Bakken Crude Services, LLC; Notice of Request For Waiver

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-25

    ... person desiring to intervene or to protest in this proceedings must file in accordance with Rules 211 and... Internet access who will eFile a document and/or be listed as a contact for an intervenor must create and validate an eRegistration account using the eRegistration link. Select the eFiling link to log on and...

  7. 17 CFR 249.446 - Form ID, uniform application for access codes to file on EDGAR.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... log on to the EDGAR system, submit filings, and change its CCC. (d) Password Modification Authorization Code (PMAC)—allows a filer, filing agent or training agent to change its Password. [69 FR 22710... Sections Affected, which appears in the Finding Aids section of the printed volume and at www.fdsys.gov. ...

  8. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  9. SU-G-JeP1-08: Dual Modality Verification for Respiratory Gating Using New Real- Time Tumor Tracking Radiotherapy System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiinoki, T; Hanazawa, H; Shibuya, K

    Purpose: A respiratory gating system combining the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, a fiducial marker implanted near the tumor was tracked and used as the internal surrogate for respiratory gating. The purpose of this study was to develop a verification technique for respiratory gating with the new RTRT using cine electronic portal imaging device (EPID) images from the TrueBeam and log files from the RTRT. Methods: A patient who underwent respiratory-gated SBRT of the lung using the RTRT was enrolled in this study. Log files of the three-dimensional coordinates of the fiducial marker used as the internal surrogate were acquired with the RTRT. Simultaneously, cine EPID images were acquired during respiratory-gated radiotherapy. Data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated from the log files (E_log). The fiducial marker in the cine EPID images was automatically extracted by in-house software based on a template-matching algorithm, and the differences between the marker positions in the cine EPID images and the digitally reconstructed radiograph were calculated (E_EPID). Results: Marker detection on the EPID images using the in-house software was influenced by low image contrast. For one field during the course of SBRT, respiratory gating with the RTRT showed mean ± S.D. 95th-percentile E_EPID of 1.3 ± 0.3 mm and 1.1 ± 0.5 mm, and E_log of 1.5 ± 0.2 mm and 1.1 ± 0.2 mm, in the LR and SI directions, respectively. Conclusion: We have developed a verification method for respiratory gating combining the TrueBeam and the new real-time tumor-tracking radiotherapy system using EPID images and log files.

  10. [Investigation of Elekta linac characteristics for VMAT].

    PubMed

    Luo, Guangwen; Zhang, Kunyi

    2012-01-01

    The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates, gantry speed, and MLC leaf speeds were analyzed from the log files. Results showed that the dose rate varied among the six available dose rates, and that gantry and MLC leaf speeds varied dynamically during delivery. VMAT requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from the log files. Quality assurance procedures should therefore be carried out for the VMAT-related parameters.
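
    The Elekta log format is not described in the record; as an illustrative sketch, assuming the decoded log yields per-sample arrays of time, gantry angle, and leaf positions (hypothetical inputs), the dynamic gantry and MLC leaf speeds can be checked with finite differences:

      import numpy as np

      def axis_speeds(times_s, gantry_deg, leaf_pos_mm):
          """Finite-difference speeds from decoded log samples.

          times_s     : (n,) sample times in seconds
          gantry_deg  : (n,) gantry angle at each sample
          leaf_pos_mm : (n, n_leaves) leaf positions at each sample
          Returns gantry speed (deg/s) and per-leaf speed (mm/s) between samples.
          """
          dt = np.diff(times_s)
          gantry_speed = np.diff(gantry_deg) / dt
          leaf_speed = np.diff(leaf_pos_mm, axis=0) / dt[:, None]
          return gantry_speed, leaf_speed

      # Made-up samples: 3 time points, 2 leaves
      t = np.array([0.0, 0.5, 1.0])
      g = np.array([179.0, 181.0, 184.0])
      leaves = np.array([[10.0, -10.0], [11.0, -9.0], [12.5, -7.5]])
      gs, ls = axis_speeds(t, g, leaves)
      print(gs)   # [4. 6.] deg/s between consecutive samples
      print(ls)   # per-leaf mm/s between consecutive samples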

  11. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mishra, P; Patankar, A; Etmektzoglou, A

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify or add control points. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages versus standalone software are: i) no software installation or maintenance needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points, and kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.

  12. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  13. 18 CFR 401.110 - Fees.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... include staff time associated with: (A) Processing FOIA requests; (B) Locating and reviewing files; (C) Monitoring file reviews; (D) Generating computer records (electronic print-outs); and (E) Preparing logs of..., black and white copies. The charge for copying standard sized, black and white public records shall be...

  14. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  15. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  16. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  17. 46 CFR 131.610 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OFFSHORE SUPPLY VESSELS OPERATIONS Logs § 131... them. (d) When a voyage is completed, or after a specified time has elapsed, the master shall file the... alternative log or record for making entries required by law, including regulations in this subchapter. This...

  18. 9 CFR 327.10 - Samples; inspection of consignments; refusal of entry; marking.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... import establishment and approved by the Director, Import Inspection Division, is on file at the import... (iv) That the establishment will maintain a daily stamping log containing the following information... covering the product to be inspected. The daily stamping log must be retained by the establishment in...

  19. 46 CFR 78.37-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... completed, the master or person in charge shall file the logbook with the Officer in Charge, Marine... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  20. Comparison of fracture and deformation in the rotary endodontic instruments: Protaper versus K-3 system.

    PubMed

    Nagi, Sana Ehsen; Khan, Farhan Raza; Rahman, Munawar

    2016-03-01

    This experimental study was done on extracted human teeth to compare fracture and deformation in two rotary endodontic file systems, namely K-3 and Protaper. It was conducted at the dental clinics of the Aga Khan University Hospital, Karachi. A log of file deformation or fracture during root canal preparation was kept. The location of fracture was noted along with the identity of the canal in which the fracture took place. Fracture in the two rotary systems was compared, and SPSS 20 was used for data analysis. Of the 172 (80.4%) teeth with more than 15 degrees of curvature, fracture occurred in 7 (4.1%) cases and deformation in 10 (5.8%). Of the 42 (19.6%) teeth with less than 15 degrees of curvature, fracture occurred in none, while deformation was seen in 1 (2.4%). There was no difference between K-3 and Protaper files with respect to file deformation and fracture. Most of the fractures occurred in the mesiobuccal canals of maxillary molars, n=3 (21.4%). The likelihood of file fracture increased 5.65-fold when the same file was used more than 3 times. Irrespective of the rotary system, the apical third of the root canal space was the most common site of file fracture.

  1. 40 CFR 60.288a - Reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test to generate a submission package file, which documents performance test data. You must then submit the file generated by the ERT through the EPA's Compliance and Emissions Data Reporting Interface (CEDRI), which can be accessed by logging in to the EPA's Central Data Exchange (CDX) (https://cdx.epa...

  2. Developing a Complete and Effective ACT-R Architecture

    DTIC Science & Technology

    2008-01-01

    of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an

  3. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
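
    The record describes aggregation and write buffering inside the I/O forwarding layer; the sketch below is only a single-node illustration of the buffering idea (coalescing many small trace writes into large, infrequent flushes), not the IOFSL/Vampir implementation:

      class BufferedLogWriter:
          """Coalesce many small trace-event writes into large, infrequent flushes.

          Illustrative only: the real system performs this aggregation in the I/O
          forwarding layer, across nodes, before data reaches the parallel file system.
          """

          def __init__(self, path, flush_bytes=4 * 1024 * 1024):
              self._fh = open(path, "ab")
              self._buf = bytearray()
              self._flush_bytes = flush_bytes

          def write_event(self, event: bytes) -> None:
              self._buf.extend(event)
              if len(self._buf) >= self._flush_bytes:
                  self.flush()

          def flush(self) -> None:
              if self._buf:
                  self._fh.write(self._buf)   # one large write instead of many small ones
                  self._buf.clear()

          def close(self) -> None:
              self.flush()
              self._fh.close()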

  5. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  6. 46 CFR 196.35-3 - Logbooks and records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... form CG-706 or in the owner's format for an official logbook. Such logs must be kept available for a... master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection. (b) The... of making entries therein as required by law or regulations in this subchapter. Such logs or records...

  7. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  8. 29 CFR 1960.28 - Employee reports of unsafe or unhealthful working conditions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... report of an existing or potential unsafe or unhealthful working condition should be recorded on a log maintained at the establishment. If an agency finds it inappropriate to maintain a log of written reports at... sequentially numbered case file, coded for identification, should be assigned for purposes of maintaining an...

  9. 20 CFR 658.422 - Handling of non-JS-related complaints by the Regional Administrator.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... non-JS-related complaints alleging violations of employment related laws shall be logged. The... which the complainant (or complaint) was referred on a complaint log, similar to the one described in § 658.410(c)(1). The appropriate regional official shall also prepare and keep the file specified in...

  10. 46 CFR 35.07-5 - Logbooks and records-TB/ALL.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., the master or person in charge shall file the logbook with the Officer in Charge, Marine Inspection... purposes of making entries therein as required by law or regulations in this subchapter. Such logs or... records of tests and inspections of fire fighting equipment must be maintained with the vessel's logs for...

  11. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
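
    LogScope's rule-based specification language is not reproduced in the record; the toy monitor below, with made-up event names, only illustrates the general idea of checking a log against an "every trigger must be answered" requirement:

      def check_response(log, trigger, response):
          """Toy monitor: every `trigger` event must eventually be followed by a
          `response` event. `log` is a list of (event_name, parameters) tuples;
          the return value lists the unanswered triggers as (index, parameters)."""
          pending = []                      # open obligations, oldest first
          for i, (name, params) in enumerate(log):
              if name == trigger:
                  pending.append((i, params))
              elif name == response and pending:
                  pending.pop(0)            # oldest obligation is discharged
          return pending

      # Made-up event names, loosely in the spirit of command/completion checking
      events = [("CMD_DISPATCH", {"cmd": "PICTURE"}),
                ("CMD_COMPLETE", {"cmd": "PICTURE"}),
                ("CMD_DISPATCH", {"cmd": "DRIVE"})]
      print(check_response(events, "CMD_DISPATCH", "CMD_COMPLETE"))
      # -> [(2, {'cmd': 'DRIVE'})] : the DRIVE dispatch was never completed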

  12. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    NASA Astrophysics Data System (ADS)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality assurance (QA) for a medical linear accelerator (linac) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming, often requiring dedicated software along with specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of linac QA, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, multileaf collimator (MLC) log file analysis, and verification of light and radiation field coincidence.

  13. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    DTIC Science & Technology

    2001-11-01

    that there were no target misses. The Hellfire missile does not have a depleted uranium head. 2.2.2.3 Tank movement During the test, the...guide other users through the use of this complicated program. The input data files for NOISEMAP consist of a root file name with several extensions...SOURCES subdirectory. This file will have the root file name followed by an accession number, then the .bps extension. The user must check the *.log

  14. 25 CFR 215.23 - Cooperation between superintendent and district mining supervisor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... notices, reports, drill logs, maps, and records, and all other information relating to mining operations required by said regulations to be submitted by lessees, and shall maintain a file thereof for the superintendent. (b) The files of the Geological Survey supervisor relating to lead and zinc leases of Quapaw...

  15. Agentless Cloud-Wide Monitoring of Virtual Disk State

    DTIC Science & Technology

    2015-10-01

    packages include Apache, MySQL, PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL, PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many

  16. Military Standard Common APSE (Ada Programming Support Environment) Interface Set (CAIS).

    DTIC Science & Technology

    1985-01-01

    QUEUEASE. LAST-KEY (QUEENAME) . LASTREI.TIONI(QUEUE-NAME). FILE-NODE. PORN . ATTRIBUTTES. ACCESSCONTROL. LEVEL); CLOSE (QUEUE BASE); CLOSE(FILE NODE...logs procedure zTERT (ITERATOR: out NODE ITERATON; MAMIE: NAME STRING.KIND: NODE KID : KEY : RELATIONSHIP KEY PA1TTE1 :R

  17. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  18. 49 CFR Appendix A to Part 225 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... $1,000 $2,000 225.11Reports of accidents/ incidents 2,500 5,000 225.12(a): Failure to file Railroad... noncompliance: (1) a missing or incomplete log entry for a particular employee's injury or illness; or (2) a missing or incomplete log record for a particular rail equipment accident or incident. Each day a...

  19. 47 CFR 76.1704 - Proof-of-performance test data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-performance test data. (a) The proof of performance tests required by § 76.601 shall be maintained on file at... subscribers, subject to the requirements of § 76.601(d). Note to § 76.1704: If a signal leakage log is being... log must be retained for the period specified in § 76.601(d). ...

  20. A Clustering Methodology of Web Log Data for Learning Management Systems

    ERIC Educational Resources Information Center

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  1. Consistency of Students' Pace in Online Learning

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2009-01-01

    The purpose of this study is to investigate the consistency of students' behavior regarding their pace of actions over sessions within an online course. Pace in a session is defined as the number of logged actions divided by session length (in minutes). Log files of 6,112 students were collected, and datasets were constructed for examining pace…

  2. SU-C-BRD-03: Analysis of Accelerator Generated Text Logs for Preemptive Maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, CM; Baydush, AH; Nguyen, C

    2014-06-15

    Purpose: To develop a model to analyze medical accelerator generated parameter and performance data that will provide an early warning of performance degradation and impending component failure. Methods: A robust 6 MV VMAT quality assurance treatment delivery was used to test the constancy of accelerator performance. The generated text log files were decoded and analyzed using statistical process control (SPC) methodology. The text file data is a single snapshot of energy-specific and overall system parameters. A total of 36 system parameters were monitored, including RF generation, electron gun control, energy control, beam uniformity control, DC voltage generation, and cooling systems. The parameters were analyzed using Individual and Moving Range (I/MR) charts. The chart limits were calculated using a hybrid technique that included the use of the standard 3σ limits and the parameter/system specification. Synthetic errors/changes were introduced to determine the initial effectiveness of I/MR charts in detecting relevant changes in operating parameters. The magnitude of the synthetic errors/changes was based on: the value of 1 standard deviation from the mean operating parameter of 483 TB systems, a small fraction (≤ 5%) of the operating range, or a fraction of the minor fault deviation. Results: There were 34 parameters in which synthetic errors were introduced. There were 2 parameters (radial position steering coil and positive 24 V DC) in which the errors did not exceed the limit of the I/MR chart. The I chart limit was exceeded for all of the remaining parameters (94.2%). The MR chart limit was exceeded in 29 of the 32 parameters (85.3%) in which the I chart limit was exceeded. Conclusion: Statistical process control I/MR evaluation of text log file parameters may be effective in providing an early warning of performance degradation or component failure for digital medical accelerator systems. Research is supported by Varian Medical Systems, Inc.
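
    The hybrid specification-based limits used in the study are not given in the record; the sketch below computes only the standard textbook Individuals/Moving Range (I/MR) limits for a span-2 moving range (constants 2.66 and 3.267), with made-up parameter values:

      import numpy as np

      def imr_limits(x):
          """Standard Individuals / Moving Range (I/MR) chart limits (span-2 moving range)."""
          x = np.asarray(x, dtype=float)
          mr = np.abs(np.diff(x))          # moving ranges |x_i - x_{i-1}|
          x_bar, mr_bar = x.mean(), mr.mean()
          return {
              "I":  (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar),   # 2.66 = 3 / d2, d2 = 1.128
              "MR": (0.0, 3.267 * mr_bar),                            # D4 = 3.267 for n = 2
          }

      # Daily snapshots of one monitored parameter (made-up values)
      param = [11.98, 12.01, 12.00, 12.03, 11.97, 12.64]
      limits = imr_limits(param)
      print(limits["I"], limits["MR"])   # flag points falling outside these limits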

  3. Sawmill: A Logging File System for a High-Performance RAID Disk Array

    DTIC Science & Technology

    1995-01-01

    from limiting disk performance, new controller architectures connect the disks directly to the network so that data movement bypasses the file server...These developments raise two questions for file systems: how to get the best performance from a RAID, and how to use such a controller architecture ...the RAID-II storage system; this architecture provides a fast data path that moves data rapidly among the disks, high-speed controller memory, and the

  4. 32 CFR 776.80 - Initial screening and Rules Counsel.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Director, JA Division, HQMC, to JAR. (b) JAG(13) and JAR shall log all complaints received and will ensure... within 30 days of the date of its return, the Rules Counsel may close the file without further action... action to close the file. (2) Complaints that comply with the requirements shall be further reviewed by...

  5. Predicting Correctness of Problem Solving from Low-Level Log Data in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Cetintas, Suleyman; Si, Luo; Xin, Yan Ping; Hord, Casey

    2009-01-01

    This paper proposes a learning based method that can automatically determine how likely a student is to give a correct answer to a problem in an intelligent tutoring system. Only log files that record students' actions with the system are used to train the model, therefore the modeling process doesn't require expert knowledge for identifying…

  6. 9 CFR 381.204 - Marking of poultry products offered for entry; official import inspection marks and devices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., Import Inspection Division, is on file at the import inspection facility where the inspection is to be... stamping log containing the following information for each lot of product: the date of inspection, the... container marks, and the MP-410 number covering the product to be inspected. The daily stamping log must be...

  7. From Log Files to Assessment Metrics: Measuring Students' Science Inquiry Skills Using Educational Data Mining

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael; Raziuddin, Juelaila; Baker, Ryan S.

    2013-01-01

    We present a method for assessing science inquiry performance, specifically for the inquiry skill of designing and conducting experiments, using educational data mining on students' log data from online microworlds in the Inq-ITS system (Inquiry Intelligent Tutoring System; www.inq-its.org). In our approach, we use a 2-step process: First we use…

  8. Oracle Applications Patch Administration Tool (PAT) Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-01-04

    PAT is a Patch Administration Tool that provides analysis, tracking, and management of Oracle Application patches. Its capabilities are outlined below.
    Patch Analysis & Management Tool:
    - Administration: patch data maintenance -- track which Oracle Application patches are applied to which database instance and machine.
    - Patch analysis: capture text files (readme.txt and driver files); form comparison detail; report comparison detail; PL/SQL package comparison detail; SQL scripts detail; JSP module comparison detail; parse and load the current applptch.txt (10.7) or load patch data from the Oracle Application database patch tables (11i).
    - Display analysis: compare the patch to be applied with the currently installed Oracle Application Appl_top code versions; patch detail; module comparison detail; analyze and display one Oracle Application module patch.
    - Patch management: automatic queue and execution of patches.
    Administration:
    - Parameter maintenance -- settings for the directory structure of the Oracle Application appl_top.
    - Validation data maintenance -- machine names and instances to patch.
    Operation:
    - Patch data maintenance: schedule a patch (queue for later execution); run a patch (queue for immediate execution); review the patch logs.
    - Patch management reports.

  9. Parallel file system with metadata distributed across partitioned key-value store c

    DOEpatents

    Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron

    2017-09-19

    Improved techniques are provided for storing metadata associated with a plurality of sub-files associated with a single shared file in a parallel file system. The shared file is generated by a plurality of applications executing on a plurality of compute nodes. A compute node implements a Parallel Log Structured File System (PLFS) library to store at least one portion of the shared file generated by an application executing on the compute node and metadata for the at least one portion of the shared file on one or more object storage servers. The compute node is also configured to implement a partitioned data store for storing a partition of the metadata for the shared file, wherein the partitioned data store communicates with partitioned data stores on other compute nodes using a message passing interface. The partitioned data store can be implemented, for example, using Multidimensional Data Hashing Indexing Middleware (MDHIM).

  10. Ground-water data for the Hanna and Carbon basins, south-central Wyoming, through 1980

    USGS Publications Warehouse

    Daddow, P.B.

    1986-01-01

    Groundwater resources in the Hanna and Carbon Basins of Wyoming were assessed in a study from 1974 through 1980 because of the development of coal mining in the area. Data collected from 105 wells during that study, including well-completion records, lithologic logs, and water levels, are presented. The data are from stock wells, coal-test holes completed as observation wells by the U.S. Geological Survey. The data are mostly from mined coal-bearing formations: the Tertiary Hanna Formation and the Tertiary and Cretaceous Ferris Formation. Well-completion data and lithologic logs were collected on-site during drilling of the wells or from U.S. Geological Survey files, company records, Wyoming State Engineer well-permit files, and published reports. (USGS)

  11. VizieR Online Data Catalog: The Gemini Observation Log (CADC, 2001-)

    NASA Astrophysics Data System (ADS)

    Association of Universities For Research in Astronomy

    2018-01-01

    This database contains a log of the Gemini Telescope observations since 2001, managed by the Canadian Astronomical Data Center (CADC). The data are regularly updated (see the date of the last version at the end of this file). The Gemini Observatory consists of twin 8.1-meter diameter optical/infrared telescopes located on two of the best observing sites on the planet. From their locations on mountains in Hawai'i and Chile, Gemini Observatory's telescopes can collectively access the entire sky. Gemini is operated by a partnership of five countries including the United States, Canada, Brazil, Argentina and Chile. Any astronomer in these countries can apply for time on Gemini, which is allocated in proportion to each partner's financial stake. (1 data file).

  12. REPHLEX II: An information management system for the ARS Water Data Base

    NASA Astrophysics Data System (ADS)

    Thurman, Jane L.

    1993-08-01

    The REPHLEX II computer system is an on-line information management system which allows scientists, engineers, and other researchers to retrieve data from the ARS Water Data Base using asynchronous communications. The system features two phone lines handling baud rates from 300 to 2400, customized menus to facilitate browsing, help screens, direct access to information and data files, electronic mail processing, file transfers using the XMODEM protocol, and log-in procedures which capture information on new users, process passwords, and log activity for a permanent audit trail. The primary data base on the REPHLEX II system is the ARS Water Data Base which consists of rainfall and runoff data from experimental agricultural watersheds located in the United States.

  13. Ontology based log content extraction engine for a posteriori security control.

    PubMed

    Azkia, Hanieh; Cuppens-Boulahia, Nora; Cuppens, Frédéric; Coatrieux, Gouenou

    2012-01-01

    In a posteriori access control, users are accountable for actions they performed and must provide evidence, when required by some legal authorities for instance, to prove that these actions were legitimate. Generally, log files contain the data needed to achieve this goal. This logged data can be recorded in several formats; here we consider IHE-ATNA (Integrating the Healthcare Enterprise-Audit Trail and Node Authentication) as the log format. The difficulty lies in extracting useful information regardless of the log format. A posteriori access control frameworks often include a log filtering engine that provides this extraction function. In this paper we define and enforce this function by building an IHE-ATNA based ontology model, which we query using SPARQL, and show how the a posteriori security controls are made effective and easier based on this function.
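
    The paper's IHE-ATNA ontology and filtering queries are not included in the record; the sketch below uses rdflib with a tiny made-up vocabulary only to illustrate the pattern of extracting log content with a SPARQL query:

      import rdflib

      # Tiny made-up audit-log graph; the vocabulary is illustrative only,
      # not the IHE-ATNA ontology used in the paper.
      data = """
      @prefix ex: <http://example.org/audit#> .
      ex:event1 ex:user "alice" ; ex:action "read"   ; ex:object "patient/42" .
      ex:event2 ex:user "bob"   ; ex:action "update" ; ex:object "patient/42" .
      """

      g = rdflib.Graph()
      g.parse(data=data, format="turtle")

      query = """
      PREFIX ex: <http://example.org/audit#>
      SELECT ?user ?action WHERE {
          ?e ex:object "patient/42" ; ex:user ?user ; ex:action ?action .
      }
      """
      for user, action in g.query(query):
          print(user, action)   # who did what to record patient/42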

  14. A Prototype Implementation of a Time Interval File Protection System in Linux

    DTIC Science & Technology

    2006-09-01

    when a user logs in, the /etc/passwd file is read by the system to get the user’s home directory. The user’s login shell then changes the directory...and don. • Users can be added with the command: # useradd -m <username> • Set the password by: # passwd <username> • Make a copy of the

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Ayan, A; Woollard, J

    Purpose: To predict potential failures of hardware within the Varian TrueBeam linear accelerator in order to proactively replace parts and decrease machine downtime. Methods: Machine downtime is a problem for all radiation oncology departments and vendors. Most often it is the result of unexpected equipment failure, and it is made worse by a lack of in-house clinical engineering support. Preventative maintenance attempts to reduce downtime, but is often ineffective at preemptively catching many failure modes, such as MLC motor failures, the need to tighten a gantry chain, or the replacement of a jaw motor, among other things. To attempt to alleviate downtime, software was developed in house that determines the maximum value of each axis enumerated in the TrueBeam trajectory log files. After patient treatments, these data are stored in a SQL database. Microsoft Power BI is used to plot the daily average maximum error of each machine as a function of time. The results are then correlated with actual faults that occurred at the machine with the help of Varian service engineers. Results: Over the course of six months, 76,312 trajectory logs have been written into the database and plotted in Power BI. Throughout the course of the analysis, MLC motors have been replaced on three machines due to the early warning provided by the trajectory log analysis. The service engineers have also been alerted to possible gantry issues on one occasion due to the aforementioned analysis. Conclusion: Analyzing the trajectory log data is a viable and effective early warning system for potential failures of the TrueBeam linear accelerator. With further analysis and tightening of the tolerance values used to determine a possible imminent failure, it should be possible to pinpoint future issues more thoroughly and for more axes of motion.
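
    The database schema is not given in the record; assuming each decoded trajectory log contributes a row of machine, timestamp, axis, and maximum error (hypothetical column names), the daily averages plotted in Power BI could be reproduced with a pandas group-by:

      import pandas as pd

      # Hypothetical extract from the SQL database: one row per trajectory log and axis
      rows = pd.DataFrame({
          "machine":  ["TB1", "TB1", "TB1", "TB2"],
          "datetime": pd.to_datetime(["2016-01-04 08:01", "2016-01-04 09:30",
                                      "2016-01-05 08:15", "2016-01-04 10:00"]),
          "axis":     ["MLC", "MLC", "MLC", "Gantry"],
          "max_err":  [0.021, 0.025, 0.060, 0.09],
      })

      daily = (rows
               .assign(day=rows["datetime"].dt.date)
               .groupby(["machine", "axis", "day"])["max_err"]
               .mean()
               .reset_index())
      print(daily)   # average of the per-delivery maximum error, per machine/axis/day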

  16. Well 9-1 Logs and Data: Roosevelt Hot Spring Area, Utah (FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 9-1 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  17. Exploring Online Students' Self-Regulated Learning with Self-Reported Surveys and Log Files: A Data Mining Approach

    ERIC Educational Resources Information Center

    Cho, Moon-Heum; Yoo, Jin Soung

    2017-01-01

    Many researchers who are interested in studying students' online self-regulated learning (SRL) have heavily relied on self-reported surveys. Data mining is an alternative technique that can be used to discover students' SRL patterns from large data logs saved on a course management system. The purpose of this study was to identify students' online…

  18. Family Child Care Inventory-Keeper: The Complete Log for Depreciating and Insuring Your Property. Redleaf Business Series.

    ERIC Educational Resources Information Center

    Copeland, Tom

    Figuring depreciation can be the most difficult aspect of filing tax returns for a family child care program. This inventory log for family child care programs is designed to assist in keeping track of the furniture, appliances, and other property used in the child care business; once these items have been identified, they can be deducted as…

  19. Wister, CA Downhole and Seismic Data

    DOE Data Explorer

    Akerley, John

    2010-12-18

    This submission contains Downhole geophysical logs associated with Wister, CA Wells 12-27 and 85-20. The logs include Spontaneous Potential (SP), HILT Caliper (HCAL), Gamma Ray (GR), Array Induction (AIT), and Neutron Porosity (NPOR) data. Also included are a well log, Injection Test, Pressure Temperature Spinner log, shut in temperature survey, a final well schematic, and files about the well's location and drilling history. This submission also contains data from a three-dimensional (3D) multi-component (3C) seismic reflection survey on the Wister Geothermal prospect area in the northern portion of the Imperial Valley, California. The Wister seismic survey area was 13.2 square miles. (Resistivity image logs (Schlumberger FMI) in 85-20 indicate that maximum horizontal stress (Shmax) is oriented NNE but that open fractures are oriented suboptimally).

  20. No3CoGP: non-conserved and conserved coexpressed gene pairs.

    PubMed

    Mal, Chittabrata; Aftabuddin, Md; Kundu, Sudip

    2014-12-08

    By analyzing microarray data from different conditions, one can identify conserved and condition-specific genes and gene modules, and thus infer the underlying cellular activities. The available tools based on Bioconductor and R packages differ in how they extract differential coexpression and at what level they study it. There is a need for a user-friendly, flexible tool which can start analysis from raw or preprocessed microarray data and can report different levels of useful information. We present GUI software, No3CoGP: Non-Conserved and Conserved Coexpressed Gene Pairs, which takes Affymetrix microarray data (.CEL files or log2-normalized .txt files) along with an annotation file (.csv), a Chip Definition File (CDF), and a probe file as inputs, and utilizes the concept of a network density cut-off and Fisher's z-test to extract biologically relevant information. It can identify four possible types of gene pairs based on their coexpression relationships: (i) a gene pair showing coexpression in one condition but not in the other, (ii) a gene pair which is positively coexpressed in one condition but negatively coexpressed in the other, and gene pairs that are (iii) positively or (iv) negatively coexpressed in both conditions. Further, it can generate modules of coexpressed genes. The easy-to-use GUI enables researchers without knowledge of the R language to use No3CoGP. Utilization of one or more CPU cores, depending on availability, speeds up the program. The output files, stored in the respective directories under the user-defined project, allow researchers to unravel condition-specific functionalities of genes, gene sets, or modules.

  1. Well 14-2 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 14-2 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  2. Well 52-21 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 52-21 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  3. Well 82-33 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Data Explorer

    Joe Moore

    2016-03-03

    This is a compilation of logs and data from Well 82-33 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  4. Well Acord 1-26 Logs and Data: Roosevelt Hot Spring Area, Utah (Utah FORGE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe Moore

    This is a compilation of logs and data from Well Acord 1-26 in the Roosevelt Hot Springs area in Utah. This well is also in the Utah FORGE study area. The file is in a compressed .zip format and there is a data inventory table (Excel spreadsheet) in the root folder that is a guide to the data that is accessible in subfolders.

  5. Patterns of usage for a Web-based clinical information system.

    PubMed

    Chen, Elizabeth S; Cimino, James J

    2004-01-01

    Understanding how clinicians are using clinical information systems to assist with their everyday tasks is valuable to the system design and development process. Developers of such systems are interested in monitoring usage in order to make enhancements. System log files are rich resources for gaining knowledge about how the system is being used. We have analyzed the log files of our Web-based clinical information system (WebCIS) to obtain various usage statistics including which WebCIS features are frequently being used. We have also identified usage patterns, which convey how the user is traversing the system. We present our method and these results as well as describe how the results can be used to customize menus, shortcut lists, and patient reports in WebCIS and similar systems.

  6. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components, making sure that a software component will not spin up until all the appropriate dependencies have been configured properly. In addition, I was able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version, and I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  7. Geohydrologic and water-quality characterization of a fractured-bedrock test hole in an area of Marcellus shale gas development, Bradford County, Pennsylvania

    USGS Publications Warehouse

    Risser, Dennis W.; Williams, John H.; Hand, Kristen L.; Behr, Rose-Anna; Markowski, Antonette K.

    2013-01-01

    Open-File Miscellaneous Investigation 13–01.1 presents the results of geohydrologic investigations on a 1,664-foot-deep core hole drilled in the Bradford County part of the Gleason 7.5-minute quadrangle in north-central Pennsylvania. In the text, the authors discuss their methods of investigation, summarize physical and analytical results, and place those results in context. Four appendices include (1) a full description of the core in an Excel worksheet; (2) water-quality and core-isotope analytical results in Excel workbooks; (3) geophysical logs in LAS and PDF files, and an Excel workbook containing attitudes of bedding and fractures calculated from televiewer logs; and (4) MP4 clips from the downhole video at selected horizons.

  8. Zebra: A striped network file system

    NASA Technical Reports Server (NTRS)

    Hartman, John H.; Ousterhout, John K.

    1992-01-01

    The design of Zebra, a striped network file system, is presented. Zebra applies ideas from log-structured file system (LFS) and RAID research to network file systems, resulting in a network file system that has scalable performance, uses its servers efficiently even when its applications are using small files, and provides high availability. Zebra stripes file data across multiple servers, so that the file transfer rate is not limited by the performance of a single server. High availability is achieved by maintaining parity information for the file system. If a server fails its contents can be reconstructed using the contents of the remaining servers and the parity information. Zebra differs from existing striped file systems in the way it stripes file data: Zebra does not stripe on a per-file basis; instead it stripes the stream of bytes written by each client. Clients write to the servers in units called stripe fragments, which are analogous to segments in an LFS. Stripe fragments contain file blocks that were written recently, without regard to which file they belong. This method of striping has numerous advantages over per-file striping, including increased server efficiency, efficient parity computation, and elimination of parity update.
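
    Parity maintenance in Zebra follows the RAID idea; the sketch below shows only plain XOR parity over equal-size stripe fragments and reconstruction of one lost fragment, a simplification of the scheme the paper describes:

      def xor_parity(fragments):
          """XOR parity over equal-length stripe fragments (bytes objects)."""
          parity = bytearray(len(fragments[0]))
          for frag in fragments:
              for i, b in enumerate(frag):
                  parity[i] ^= b
          return bytes(parity)

      def reconstruct(surviving, parity):
          """Rebuild the single missing fragment from the survivors plus parity."""
          return xor_parity(list(surviving) + [parity])

      frags = [b"aaaa", b"bbbb", b"cccc"]          # one client's stripe fragments
      p = xor_parity(frags)
      assert reconstruct([frags[0], frags[2]], p) == frags[1]   # recover the lost fragment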

  9. Navigating Streams of Paper.

    ERIC Educational Resources Information Center

    Bennett-Abney, Cheryl

    2001-01-01

    Three organizational tools for counselors are described: three-ring binder for notes, forms, and schedules; daily log of time and activities; and a tickler file with tasks arranged by days of the week. (SK)

  10. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization, which can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
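
    The tool's own clustering algorithms and tightness measures are not specified in this summary; as an illustration, per-access records of file size and transfer time (made-up values) can be clustered on log-scaled features with k-means and then summarized per cluster:

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical (size in bytes, transfer time in seconds) per file access
      records = np.array([[2e3, 0.1], [3e3, 0.2], [5e8, 40.0],
                          [6e8, 55.0], [1e6, 1.1], [2e6, 1.4]])

      X = np.log10(records)                       # cluster on log-scaled features
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

      for k in range(3):
          sizes = records[labels == k, 0]
          times = records[labels == k, 1]
          print(f"cluster {k}: n={len(sizes)}, "
                f"mean size={sizes.mean():.3g} B, mean transfer={times.mean():.2f} s")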

  11. Cyber Fundamental Exercises

    DTIC Science & Technology

    2013-03-01

    the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ...2.6.2 /etc/passwd and /etc/shadow The /etc/shadow file didn’t exist on early Linux distributions. Originally only root could access the...etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under

  12. Analysis of the access patterns at GSFC distributed active archive center

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore; Bedet, Jean-Jacques

    1996-01-01

    The Goddard Space Flight Center (GSFC) Distributed Active Archive Center (DAAC) has been operational for more than two years. Its mission is to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test Earth Observing System Data and Information System (EOSDIS) concepts. Over 550,000 files and documents have been archived, and more than six terabytes have been distributed to the scientific community. Information about user requests and file access patterns, and their impact on system loading, is needed to optimize current operations and to plan for future archives. To facilitate the management of daily activities, the GSFC DAAC has developed a database system to track correspondence, requests, ingestion and distribution. In addition, several log files which record transactions on Unitree are maintained and periodically examined. This study identifies some of the user request and file access patterns at the GSFC DAAC during 1995. The analysis is limited to the subset of orders for which the data files are under the control of the Hierarchical Storage Management (HSM) system Unitree. The results show that most of the data volume ordered was for two data products. The volume was also mostly made up of level 3 and 4 data, and most of the volume was distributed on 8 mm and 4 mm tapes. In addition, most of the volume ordered was for deliveries in North America, although there was significant world-wide use. There was a wide range of request sizes in terms of volume and number of files ordered; on average, 78.6 files were ordered per request. Using the data managed by Unitree, several caching algorithms have been evaluated for both hit rate and the overhead ('cost') associated with the movement of data from near-line devices to disks. The algorithm called LRU/2 bin was found to be the best for this workload, but the STbin algorithm also worked well.
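
    The LRU/2-bin and STbin algorithms evaluated in the study are not detailed in the record; as a baseline illustration, a plain LRU cache can be replayed against an ordered file-request trace to obtain a hit rate:

      from collections import OrderedDict

      def lru_hit_rate(trace, capacity_files):
          """Replay a file-request trace through a plain LRU cache; return the hit rate."""
          cache = OrderedDict()          # file_id -> None, most recently used last
          hits = 0
          for file_id in trace:
              if file_id in cache:
                  hits += 1
                  cache.move_to_end(file_id)
              else:
                  cache[file_id] = None
                  if len(cache) > capacity_files:
                      cache.popitem(last=False)   # evict the least recently used file
          return hits / len(trace)

      trace = ["a", "b", "a", "c", "a", "b", "d", "a"]   # made-up request order
      print(lru_hit_rate(trace, capacity_files=2))        # 0.25 for this toy trace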

  13. 47 CFR 22.359 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... + 10 log (P) dB. (b) Measurement procedure. Compliance with these rules is based on the use of... contract in their station files and disclose it to prospective assignees or transferees and, upon request...
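
    The emission limit quoted in this truncated snippet is the 43 + 10 log(P) dB rule spelled out in full in the related parts below. As simple arithmetic, a transmitter power P in watts gives the required attenuation:

```python
# Worked example of the emission limit: out-of-band emissions must be
# attenuated below transmitter power P by at least 43 + 10*log10(P) dB.
import math

def required_attenuation_db(p_watts):
    return 43 + 10 * math.log10(p_watts)

print(required_attenuation_db(100))  # a 100 W transmitter needs >= 63 dB
```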

  14. 7 CFR 274.5 - Record retention and forms security.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... control logs, or similar controls from the point of initial receipt through the issuance and.... (2) For notices of change which initiate, update or terminate the master issuance file, the State...

  15. Network Basics.

    ERIC Educational Resources Information Center

    Tennant, Roy

    1992-01-01

    Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)

  16. Evaluation of electrical impedance ratio measurements in accuracy of electronic apex locators.

    PubMed

    Kim, Pil-Jong; Kim, Hong-Gee; Cho, Byeong-Hoon

    2015-05-01

    The aim of this paper was to evaluate the ratios of electrical impedance measurements reported in previous studies through a correlation analysis, in order to establish the ratio as the contributing factor to the accuracy of electronic apex locators (EALs). The literature regarding electrical property measurements of EALs was screened using Medline and Embase. All data acquired were plotted to identify correlations between impedance and log-scaled frequency. The accuracy of the impedance ratio method used to detect the apical constriction (APC) in most EALs was evaluated using linear ramp function fitting. Changes of impedance ratios at various frequencies were evaluated for a variety of file positions. Among the ten papers selected in the search process, the first-order equations between log-scaled frequency and impedance had negative slopes. When the model for the ratios was assumed to be a linear ramp function, the ratio values decreased as the file went deeper, and the average ratio values of the left and right horizontal zones were significantly different in 8 out of 9 studies. The APC was located within the interval of linear relation between the left and right horizontal zones of the linear ramp model. Using the ratio method, the APC was located within a linear interval. Therefore, using the impedance ratio between electrical impedance measurements at different frequencies was a robust method for detection of the APC.
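
    To make the linear ramp model concrete, the sketch below fits a two-plateau ramp to synthetic ratio-versus-depth data; the data values, parameter names, and starting guesses are invented for illustration and are not taken from the reviewed studies.

```python
# Impedance ratios modeled as two horizontal zones joined by a linear
# segment; the fitted interval [x0, x1] localizes the apical constriction.
import numpy as np
from scipy.optimize import curve_fit

def ramp(x, x0, x1, top, bottom):
    # horizontal zone, linear descent between x0 and x1, horizontal zone
    return np.where(x < x0, top,
           np.where(x > x1, bottom,
                    top + (bottom - top) * (x - x0) / (x1 - x0)))

depth = np.linspace(0, 10, 50)                  # file depth (arbitrary units)
ratio = ramp(depth, 4, 7, 1.0, 0.4) + np.random.default_rng(1).normal(0, 0.02, 50)
p, _ = curve_fit(ramp, depth, ratio, p0=[3, 8, 1.0, 0.4])
print(f"linear interval: {p[0]:.2f} to {p[1]:.2f}")
```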

  17. Modulation indices for volumetric modulated arc therapy.

    PubMed

    Park, Jong Min; Park, So-Yeon; Kim, Hyoungnyoun; Kim, Jin Ho; Carlson, Joel; Ye, Sung-Joon

    2014-12-07

    The aim of this study is to present a modulation index (MI) for volumetric modulated arc therapy (VMAT) based on a comprehensive speed and acceleration analysis of modulating parameters such as multi-leaf collimator (MLC) movements, gantry rotation and dose rate. The performance of the presented MI (MIt) was evaluated with correlation analyses against the pre-treatment quality assurance (QA) results, differences in modulating parameters between VMAT plans and dynamic log files, and differences in dose-volumetric parameters between VMAT plans and plans reconstructed from dynamic log files. For comparison, the same correlation analyses were performed for the previously suggested modulation complexity score (MCSv), leaf travel modulation complexity score (LTMCS) and the MI of Li and Xing (MILi&Xing). Two-tailed p values were calculated for each correlation. The Spearman's rho (rs) values of MIt, MCSv, LTMCS and MILi&Xing against the local gamma passing rate with 2%/2 mm criterion were -0.658 (p < 0.001), 0.186 (p = 0.251), 0.312 (p = 0.05) and -0.455 (p = 0.003), respectively. The values of rs against the modulating-parameter (MLC position) differences were 0.917, -0.635, -0.857 and 0.795, respectively (p < 0.001). For dose-volumetric parameters, MIt showed more statistically significant correlations than the conventional MIs. MIt showed good performance for evaluating the modulation degree of VMAT plans.
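
    The published MIt formula is not reproduced here, but the underlying idea, scoring a plan by how often modulating parameters change quickly, can be sketched as follows; the thresholds and weighting are illustrative assumptions only.

```python
# Score modulation from the speed and acceleration of MLC positions
# sampled at control points; higher score = more heavily modulated plan.
import numpy as np

def modulation_index(leaf_positions_mm, dt_s):
    v = np.diff(leaf_positions_mm, axis=0) / dt_s          # leaf speeds
    a = np.diff(v, axis=0) / dt_s                          # leaf accelerations
    # fraction of samples exceeding speed/acceleration thresholds
    return np.mean(np.abs(v) > 5.0) + np.mean(np.abs(a) > 10.0)

positions = np.cumsum(np.random.default_rng(0).normal(0, 2, (100, 60)), axis=0)
print(f"MI (illustrative) = {modulation_index(positions, 0.5):.3f}")
```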

  18. VizieR Online Data Catalog: NGC 2264, NGC 2547 and NGC 2516 stellar radii (Jackson+, 2016)

    NASA Astrophysics Data System (ADS)

    Jackson, R. J.; Jeffries, R. D.; Randich, S.; Bragaglia, A.; Carraro, G.; Costado, M. T.; Flaccomio, E.; Lanzafame; Lardo, C.; Monaco, L.; Morbidelli, L.; Smiljanic, R.; Zaggia, S.

    2015-11-01

    File Table1.dat contains photometric and spectroscopic data for GES Survey targets in the clusters NGC 2547, NGC 2516 and NGC 2264, downloaded from the Edinburgh GES archive (http://ges.roe.ac.uk/). Photometric data comprise the (Cousins) I magnitude and 2MASS J, H and K magnitudes. Spectroscopic data comprise the signal-to-noise ratio (S/N) of the target spectrum, the radial velocity RV (in km/s), the projected equatorial velocity vsini (in km/s), the number of separate observations co-added to produce the target spectrum, and the log of effective temperature (logTeff) of the template spectrum fitted to measure RV and vsini. The absolute precision in RV, pRV (in km/s), and the relative precision in vsini (pvsini) were estimated, as functions of logTeff, vsini and S/N, using the prescription described in Jackson et al. (2015A&A...580A..75J, Cat. J/A+A/580/A75). File Table3.dat contains measured and calculated properties of cluster targets with resolved vsini and a reported rotation period. The cluster name, right ascension RA (deg) and declination Dec (deg) are given for targets with measured periods reported in the literature. Dynamic properties comprise: the radial velocity RV (in km/s), the absolute precision in RV, pRV (km/s), the projected equatorial velocity vsini (in km/s), the relative precision in vsini (pvsini) and the rotational period (in days). Also given are the absolute K magnitude MK, the log of luminosity logL (in solar units) and the probability of cluster membership estimated using cluster data given in the text. Estimated values of the projected radius Rsini (in Rsolar) and the uncertainty in projected radius e_Rsini (in Rsolar) are given for targets where vsini>5km/s and pvsini>0.2. The final column is a flag which is set to 1 for targets in cluster NGC 2264 where a (H-K) versus (J-H) colour-colour plot indicates possible infra-red excess. Period gives the rotation period taken from the literature (2 data files).

  19. Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour

    NASA Astrophysics Data System (ADS)

    Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III

    2017-12-01

    Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on one single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to take account of users' multidimensional preferences for geospatial data, and hence may likely result in a less than optimal user experience in discovering the most applicable dataset out of a vast range of available datasets. With users interacting with search engines, sufficient information is already hidden in the log files. Compared with explicit feedback data, information that can be derived/extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process and create training data from web logs in a real-time manner; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests of geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
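
    Of the two evaluation metrics named above, NDCG is the less self-explanatory. The following is a sketch of the standard NDCG@K definition, not code from the dissertation:

```python
# NDCG@K: compare a ranking's discounted cumulative gain against the
# ideal ordering of the same relevance labels.
import math

def ndcg_at_k(relevances, k):
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

print(ndcg_at_k([3, 2, 3, 0, 1, 2], k=5))   # ranking produced by the engine
```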

  20. 43 CFR 2743.3 - Leased disposal sites.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... review of all records and inspection reports on file with the Bureau of Land Management, State, and local... landfill concerning site management and a review of all reports and logs pertaining to the type and amount...

  1. 25 CFR 214.13 - Diligence; annual expenditures; mining records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... within 90 days after an ore body of sufficient quantity is discovered, and shown by the logs or records.... Lessee shall, before commencing operations, file with the superintendent a plat and preliminary statement...

  2. 47 CFR 22.861 - Emission limitations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... below the transmitting power (P) by a factor of at least 43 + 10 log (P) dB. (b) Measurement procedure... maintain a copy of the contract in their station files and disclose it to prospective assignees or...

  3. Analysis of the factors influencing healthcare professionals' adoption of mobile electronic medical record (EMR) using the unified theory of acceptance and use of technology (UTAUT) in a tertiary hospital.

    PubMed

    Kim, Seok; Lee, Kee-Hyuck; Hwang, Hee; Yoo, Sooyoung

    2016-01-30

    Although the factors that affect an end-user's intention to use a new system and technology have been researched, previous studies have been theoretical and have not verified the factors that affected the adoption of a new system. Thus, this study aimed to confirm the factors that influence users' intentions to utilize a mobile electronic medical record (EMR) system using both a questionnaire survey and a log file analysis that represented real use of the system. After observing the operation of a mobile EMR system in a tertiary university hospital for seven months, we performed an offline survey regarding user acceptance of the system based on the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Technology Acceptance Model (TAM). We surveyed 942 healthcare professionals over two weeks and performed a structural equation modeling (SEM) analysis to identify the intention to use the system among the participants. Next, we compared the results of the SEM analysis with the results of analyses of the actual log files for two years to identify further insights into the factors that affected the intention to use. For these analyses, we used SAS 9.0 and AMOS 21. Of the 942 surveyed end-users, 48.3 % (23.2 % doctors and 68.3 % nurses) responded. After eliminating six subjects who completed the survey carelessly, we conducted the SEM analyses on the data from 449 subjects (65 doctors and 385 nurses). The newly suggested model satisfied the standards of model fitness, and the intention to use was especially influenced by Performance Expectancy and Attitude. Based on the actual usage log analyses, both the doctors and nurses used the menus to view the inpatient lists, alerts, and patients' clinical data with high frequency. Specifically, the doctors frequently retrieved laboratory results, and the nurses frequently retrieved nursing notes and used the menu supporting their nursing work responsibilities. In this study, the end-users' intentions to use the mobile EMR system were particularly influenced by Performance Expectancy and Attitude. In reality, the usage log revealed high-frequency use of the functions that improve continuity of care and work efficiency. These results indicate the influence of performance expectancy on the intention to use the mobile EMR system. Consequently, we suggest that when determining the implementation of mobile EMR systems, the functions that are related to workflow and have the ability to increase performance should be considered first.

  4. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system-of-systems simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes the following projects/directories: BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor and UmbraWrapper.

  5. Parallel log structured file system collective buffering to achieve a compact representation of scientific and/or dimensional data

    DOEpatents

    Grider, Gary A.; Poole, Stephen W.

    2015-09-01

    Collective buffering and data pattern solutions are provided for storage, retrieval, and/or analysis of data in a collective parallel processing environment. For example, a method can be provided for data storage in a collective parallel processing environment. The method comprises receiving data to be written for a plurality of collective processes within a collective parallel processing environment, extracting a data pattern for the data to be written for the plurality of collective processes, generating a representation describing the data pattern, and saving the data and the representation.
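
    A hedged sketch of the core idea in the claim above: if the offsets written by the collective processes form a regular stride, a single compact descriptor can replace a per-write index. The descriptor fields are illustrative, not the patented representation.

```python
# Detect a regular stride pattern in collective writes and emit one
# compact descriptor instead of one index entry per write.
def compact_pattern(writes):
    """writes: list of (offset, length) from all processes, sorted by offset."""
    offsets = [o for o, _ in writes]
    lengths = {l for _, l in writes}
    strides = {b - a for a, b in zip(offsets, offsets[1:])}
    if len(lengths) == 1 and len(strides) == 1:
        return {"start": offsets[0], "stride": strides.pop(),
                "count": len(writes), "length": lengths.pop()}
    return None  # irregular pattern: fall back to a per-write index

print(compact_pattern([(0, 4096), (65536, 4096), (131072, 4096)]))
```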

  6. AliEn—ALICE environment on the GRID

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Aphecetche, L.; Bunčić, P.; Piskač, R.; Revsbech, J.-E.; Šego, V.; Alice Collaboration

    2003-04-01

    AliEn ( http://alien.cern.ch) (ALICE Environment) is a Grid framework built on top of the latest Internet standards for information exchange and authentication (SOAP, PKI) and common Open Source components. AliEn provides a virtual file catalogue that allows transparent access to distributed datasets and a number of collaborating Web services which implement the authentication, job execution, file transport, performance monitor and event logging. In the paper we will present the architecture and components of the system.

  7. Parallel compression of data chunks of a shared data object using a log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    2016-10-25

    Techniques are provided for parallel compression of data chunks being written to a shared object. A client executing on a compute node or a burst buffer node in a parallel computing system stores a data chunk generated by the parallel computing system to a shared data object on a storage node by compressing the data chunk and providing the compressed data chunk to the storage node that stores the shared object. The client and storage node may employ Log-Structured File techniques. The compressed data chunk can be de-compressed by the client when the data chunk is read. A storage node stores a data chunk as part of a shared object by receiving a compressed version of the data chunk from a compute node and storing the compressed version of the data chunk to the shared data object on the storage node.
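
    A minimal sketch of the client-side step, under stated assumptions: each chunk is compressed before being appended to a log-structured shared object, with a small index recording where each compressed chunk landed. The in-memory bytearray stands in for the storage node.

```python
# Compress chunks on the compute/burst node, append to a shared log,
# and decompress on read via the (offset, length) index.
import zlib

log, index = bytearray(), {}

def write_chunk(chunk_id, data: bytes):
    blob = zlib.compress(data)              # compress before shipping
    index[chunk_id] = (len(log), len(blob)) # (offset, length) in the shared log
    log.extend(blob)

def read_chunk(chunk_id) -> bytes:
    off, n = index[chunk_id]
    return zlib.decompress(bytes(log[off:off + n]))  # client-side decompress

write_chunk(("rank0", 0), b"x" * 4096)
assert read_chunk(("rank0", 0)) == b"x" * 4096
```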

  8. Design and development of an automatic data acquisition system for a balance study using a smartcard system.

    PubMed

    Ambrozy, C; Kolar, N A; Rattay, F

    2010-01-01

    A measurement system is needed to log board-angle values during balance training. This study provides data for a balance study using a smartcard; data acquisition is automatic. An individual training plan for each proband is necessary. To store the proband identification, a smartcard with an I2C data bus protocol and an E2PROM memory system is used. For reading the smartcard data, a smartcard reader is connected via universal serial bus (USB) to a notebook. The data acquisition and smartcard read programme is designed with Microsoft® Visual C#. A training plan file contains the individual training plan for each proband. The data of the test persons are saved in a proband directory. Each event is automatically saved to a log file for exact documentation. This system makes study development easy and time-saving.

  9. VizieR Online Data Catalog: GOALS sample PACS and SPIRE fluxes (Chu+, 2017)

    NASA Astrophysics Data System (ADS)

    Chu, J. K.; Sanders, D. B.; Larson, K. L.; Mazzarella, J. M.; Howell, J. H.; Diaz-Santos, T.; Xu, K. C.; Paladini, R.; Schulz, B.; Shupe, D.; Appleton, P.; Armus, L.; Billot, N.; Chan, B. H. P.; Evans, A. S.; Fadda, D.; Frayer, D. T.; Haan, S.; Ishida, C. M.; Iwasawa, K.; Kim, D.-C.; Lord, S.; Murphy, E.; Petric, A.; Privon, G. C.; Surace, J. A.; Treister, E.

    2017-06-01

    The IRAS RBGS contains 179 luminous infrared galaxies (LIRGs: 11.00 <= log(LIR/L☉) < 12.00) and 22 ultra-luminous infrared galaxies (ULIRGs: log(LIR/L☉) >= 12.00); these 201 total objects comprise the GOALS sample (Armus et al. 2009), a statistically complete flux-limited sample of infrared-luminous galaxies in the local universe. This paper presents imaging and photometry for all 201 LIRGs and LIRG systems in the IRAS RBGS that were observed during our GOALS Herschel OT1 program. (4 data files).

  10. The key image and case log application: new radiology software for teaching file creation and case logging that incorporates elements of a social network.

    PubMed

    Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David

    2014-07-01

    To create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communication systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface were inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). One hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm, with efforts made to maximize its ease of use and to include characteristics inspired by social networking Web sites that give the system additional functionality, such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  11. 47 CFR 22.917 - Emission limitations for cellular equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... frequency ranges must be attenuated below the transmitting power (P) by a factor of at least 43 + 10 log(P... such contract shall maintain a copy of the contract in their station files and disclose it to...

  12. Detection of Anomalous Insiders in Collaborative Environments via Relational Analysis of Access Logs

    PubMed Central

    Chen, You; Malin, Bradley

    2014-01-01

    Collaborative information systems (CIS) are deployed within a diverse array of environments, ranging from the Internet to intelligence agencies to healthcare. It is increasingly the case that such systems are applied to manage sensitive information, making them targets for malicious insiders. While sophisticated security mechanisms have been developed to detect insider threats in various file systems, they are neither designed to model nor to monitor collaborative environments in which users function in dynamic teams with complex behavior. In this paper, we introduce a community-based anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on information recorded in the access logs of collaborative environments. CADS is based on the observation that typical users tend to form community structures, such that users with low affinity to such communities are indicative of anomalous and potentially illicit behavior. The model consists of two primary components: relational pattern extraction and anomaly detection. For relational pattern extraction, CADS infers community structures from CIS access logs, and subsequently derives communities, which serve as the CADS pattern core. CADS then uses a formal statistical model to measure the deviation of users from the inferred communities to predict which users are anomalies. To empirically evaluate the threat detection model, we perform an analysis with six months of access logs from a real electronic health record system in a large medical center, as well as a publicly-available dataset for replication purposes. The results illustrate that CADS can distinguish simulated anomalous users in the context of real user behavior with a high degree of certainty and with significant performance gains in comparison to several competing anomaly detection models. PMID:25485309
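
    A simplified sketch of the CADS intuition follows: users who share few record accesses with their nearest community of peers receive high anomaly scores. This nearest-neighbor form is a stand-in for the paper's formal statistical model.

```python
# Score each user by (one minus) their mean cosine similarity to their
# k most similar peers, derived from a binary user-by-record access matrix.
import numpy as np

def anomaly_scores(access, k=2):
    a = access / np.maximum(np.linalg.norm(access, axis=1, keepdims=True), 1e-9)
    sim = a @ a.T
    np.fill_diagonal(sim, -1.0)              # ignore self-similarity
    nearest = np.sort(sim, axis=1)[:, -k:]   # k most similar peers
    return 1.0 - nearest.mean(axis=1)        # low peer affinity -> high score

logs = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [1, 1, 0, 0], [0, 0, 0, 1]])
print(anomaly_scores(logs))                  # last user stands apart
```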

  13. The design and implementation of an automated system for logging clinical experiences using an anesthesia information management system.

    PubMed

    Simpao, Allan; Heitz, James W; McNulty, Stephen E; Chekemian, Beth; Brenn, B Randall; Epstein, Richard H

    2011-02-01

    Residents in anesthesia training programs throughout the world are required to document their clinical cases to help ensure that they receive adequate training. Current systems involve self-reporting, are subject to delayed updates and misreported data, and do not provide a practicable method of validation. Anesthesia information management systems (AIMS) are being used increasingly in training programs and are a logical source for verifiable documentation. We hypothesized that case logs generated automatically from an AIMS would be sufficiently accurate to replace the current manual process. We based our analysis on the data reporting requirements of the Accreditation Council for Graduate Medical Education (ACGME). We conducted a systematic review of ACGME requirements and our AIMS record, and made modifications after identifying data element and attribution issues. We studied 2 methods (parsing of free text procedure descriptions and CPT4 procedure code mapping) to automatically determine ACGME case categories and generated AIMS-based case logs and compared these to assignments made by manual inspection of the anesthesia records. We also assessed under- and overreporting of cases entered manually by our residents into the ACGME website. The parsing and mapping methods assigned cases to a majority of the ACGME categories with accuracies of 95% and 97%, respectively, as compared with determinations made by 2 residents and 1 attending who manually reviewed all procedure descriptions. Comparison of AIMS-based case logs with reports from the ACGME Resident Case Log System website showed that >50% of residents either underreported or overreported their total case counts by at least 5%. The AIMS database is a source of contemporaneous documentation of resident experience that can be queried to generate valid, verifiable case logs. The extent of AIMS adoption by academic anesthesia departments should encourage accreditation organizations to support uploading of AIMS-based case log files to improve accuracy and to decrease the clerical burden on anesthesia residents.
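
    The two automatic assignment methods compared above can be sketched as a lookup with a free-text fallback; the codes, keywords, and category names below are illustrative only, not the ACGME's actual tables.

```python
# Assign an ACGME case category via CPT4 code mapping, falling back to
# keyword parsing of the free-text procedure description.
CPT_TO_CATEGORY = {"00840": "intra-abdominal", "00562": "cardiac with bypass"}
KEYWORDS = {"appendectomy": "intra-abdominal", "craniotomy": "intracranial"}

def acgme_category(cpt_code, procedure_text):
    if cpt_code in CPT_TO_CATEGORY:               # method 2: code mapping
        return CPT_TO_CATEGORY[cpt_code]
    text = procedure_text.lower()
    for word, category in KEYWORDS.items():       # method 1: parse free text
        if word in text:
            return category
    return "unclassified"                          # flag for manual review

print(acgme_category("99999", "Laparoscopic appendectomy under GA"))
```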

  14. 75 FR 76426 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ..., access control lists, file system permissions, intrusion detection and prevention systems and log..., address, mailing address, country, organization, phone, fax, mobile, pager, Defense Switched Network (DSN)...

  15. TableViewer for Herschel Data Processing

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Schulz, B.

    2006-07-01

    The TableViewer utility is a GUI tool written in Java to support interactive data processing and analysis for the Herschel Space Observatory (Pilbratt et al. 2001). The idea was inherited from a prototype written in IDL (Schulz et al. 2005). It allows users to graphically view and analyze tabular data organized in columns with equal numbers of rows. It can be run either as a standalone application, where data access is restricted to FITS (FITS 1999) files only, or from the Quick Look Analysis (QLA) or Interactive Analysis (IA) command line, from where objects are also accessible. The graphic display is very versatile, allowing plots in either linear or log scales. Zooming, panning, and changing data columns are performed rapidly using a group of navigation buttons. Selecting and de-selecting fields of data points controls the input to simple analysis tasks like building a statistics table or generating power spectra. The binary data stored in a TableDataset, a Product, or in FITS files can also be displayed as tabular data, where values in individual cells can be modified. TableViewer provides several processing utilities which, besides calculation of statistics either for all channels or for selected channels and calculation of power spectra, allow users to convert/repair datasets by changing the unit name of data columns and by modifying data values in columns with a simple calculator tool. Interactively selected data can be separated out, and modified data sets can be saved to FITS files. The tool will be especially helpful in the early phases of Herschel data analysis, when quick access to the contents of data products is important. TableDataset and Product are Java classes defined in herschel.ia.dataset.

  16. Comparison of drilling reports and detailed geophysical analysis of ground-water production in bedrock wells

    USGS Publications Warehouse

    Paillet, Frederick; Duncanson, Russell

    1994-01-01

    The most extensive database for fractured bedrock aquifers consists of drilling reports maintained by various state agencies. We investigated the accuracy and reliability of such reports by comparing a representative set of reports for nine wells drilled by conventional air percussion methods in granite with a suite of geophysical logs for the same wells designed to identify the depths of fractures intersecting the well bore which may have produced water during aquifer tests. Production estimates reported by the driller ranged from less than 1 to almost 10 gallons per minute. The moderate drawdowns maintained during subsequent production tests were associated with approximately the same flows as those measured when boreholes were dewatered during air percussion drilling. We believe the estimates of production during drilling and drawdown tests were similar because partial fracture zone dewatering during drilling prevented larger inflows otherwise expected from the steeper drawdowns during drilling. The fractures and fracture zones indicated on the drilling report and the amounts of water produced by these fractures during drilling generally agree with those identified from the geophysical log analysis. Most water production occurred from two fractured and weathered zones which are separated by an interval of unweathered granite. The fractures identified in the drilling reports show various depth discrepancies in comparison to the geophysical logs, which are subject to much better depth control. However, the depths of the fractures associated with water production on the drilling report are comparable to the depths of the fractures shown to be the source of water inflow in the geophysical log analysis. Other differences in the relative contribution of flow from fracture zones may be attributed to the differences between the hydraulic conditions during drilling, which represent large, prolonged drawdowns, and pumping tests, which consisted of smaller drawdowns maintained over shorter periods. We conclude that drilling reports filed by experienced well drillers contain useful information about the depth, thickness, degree of weathering, and production capacity of fracture zones supplying typical domestic water wells. The accuracy of this information could be improved if relatively simple and inexpensive geophysical well logs such as gamma, caliper, and normal resistivity logs were routinely run in conjunction with bedrock drilling projects.

  17. TH-A-9A-10: Prostate SBRT Delivery with Flattening-Filter-Free Mode: Benefit and Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, T; Yuan, L; Sheng, Y

    Purpose: Flattening-filter-free (FFF) beam mode offered on the TrueBeam™ linac enables delivering IMRT at a 2400 MU/min dose rate. This study investigates the benefit and delivery accuracy of using a high dose rate in the context of prostate SBRT. Methods: 8 prostate SBRT patients were retrospectively studied. In 5 cases treated with a 600-MU/min dose rate, continuous prostate motion data acquired during radiation-beam-on was used to analyze motion range. In addition, the initial 1/3 of the prostate motion trajectories during each radiation-beam-on was separated out to simulate the motion range if 2400 MU/min were used. To analyze delivery accuracy in FFF mode, MLC trajectory log files from an additional 3 cases treated at 2400 MU/min were acquired. These log files record MLC expected and actual positions every 20 ms, and therefore can be used to reveal delivery accuracy. Results: (1) Benefit. On average, treatment at 600 MU/min takes 30 s per beam, whereas 2400 MU/min requires only 11 s. When delivery time was shortened to ~1/3, the prostate motion range was significantly smaller (p<0.001). The largest motion reduction occurred in the Sup-Inf direction, from [−3.3 mm, 2.1 mm] to [−1.7 mm, 1.7 mm], followed by a reduction from [−2.1 mm, 2.4 mm] to [−1.0 mm, 2.4 mm] in the Ant-Pos direction. No change was observed in the LR direction [−0.8 mm, 0.6 mm]. The combined motion amplitude (vector norm) confirms that average motion and ranges are significantly smaller when beam-on was limited to the first 1/3 of the actual delivery time. (2) Accuracy. Trajectory log file analysis showed excellent delivery accuracy at 2400 MU/min. Most leaf deviations during beam-on were within 0.07 mm (99th percentile). Maximum leaf-opening deviations during each beam-on were all under 0.1 mm for all leaves. The dose rate was maintained at 2400 MU/min during beam-on without dipping. Conclusion: Delivering prostate SBRT at 2400 MU/min is both beneficial and accurate. The high dose rate significantly reduced both treatment time and intra-beam prostate motion range. Excellent delivery accuracy was confirmed, with very small leaf motion deviation.
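
    The accuracy analysis described above reduces to simple array statistics once the expected and actual MLC positions have been parsed from the trajectory log. A hedged sketch, where the array names and sampling values are assumptions:

```python
# Given expected/actual MLC positions sampled every 20 ms, report the
# 99th-percentile and maximum absolute leaf deviations.
import numpy as np

def leaf_deviation_stats(expected_mm, actual_mm):
    dev = np.abs(actual_mm - expected_mm)        # per-sample, per-leaf error
    return np.percentile(dev, 99), dev.max()

rng = np.random.default_rng(2)
expected = rng.uniform(-50, 50, (5000, 120))     # 100 s of samples, 120 leaves
actual = expected + rng.normal(0, 0.02, expected.shape)
p99, worst = leaf_deviation_stats(expected, actual)
print(f"99th percentile {p99:.3f} mm, max {worst:.3f} mm")
```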

  18. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated end-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format inspired in part by W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as other scientific applications that log provenance-related information.

  19. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  20. Gamma-index method sensitivity for gauging plan delivery accuracy of volumetric modulated arc therapy.

    PubMed

    Park, Jong In; Park, Jong Min; Kim, Jung-In; Park, So-Yeon; Ye, Sung-Joon

    2015-12-01

    The aim of this study was to investigate the sensitivity of the gamma-index method according to various gamma criteria for volumetric modulated arc therapy (VMAT). Twenty head and neck (HN) and twenty prostate VMAT plans were retrospectively selected for this study. Both global and local 2D gamma evaluations were performed with criteria of 3%/3 mm, 2%/2 mm, 1%/2 mm and 2%/1 mm. In this study, the global and local gamma-index calculated the differences in doses relative to the maximum dose and the dose at the current measurement point, respectively. Using log files acquired during delivery, the differences in parameters at every control point between the VMAT plans and the log files were acquired. The differences in dose-volumetric parameters between reconstructed VMAT plans using the log files and the original VMAT plans were calculated. The Spearman's rank correlation coefficients (rs) were calculated between the passing rates and those differences. Considerable correlations with statistical significances were observed between global 1%/2 mm, local 1%/2 mm and local 2%/1 mm and the MLC position differences (rs = -0.712, -0.628 and -0.581). The numbers of rs values with statistical significance between the passing rates and the changes in dose-volumetric parameters were largest in global 2%/2 mm (n = 16), global 2%/1 mm (n = 15) and local 2%/1 mm (n = 13) criteria. Local gamma-index method with 2%/1 mm generally showed higher sensitivity to detect deviations between a VMAT plan and the delivery of the VMAT plan. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
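
    A minimal one-dimensional illustration of the global-versus-local normalization distinction studied above; this is a sketch of the standard gamma formalism, not the study's implementation:

```python
# 1D gamma index: dose difference normalized globally (to max dose) or
# locally (to the reference dose at each point), combined with distance.
import numpy as np

def gamma_1d(ref, evald, x, dose_tol, dist_tol_mm, local=False):
    gammas = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        norm = di if local else ref.max()               # local vs global
        dd = (evald - di) / (dose_tol * norm)           # dose difference term
        dx = (x - xi) / dist_tol_mm                     # distance term
        gammas[i] = np.sqrt(dd**2 + dx**2).min()
    return gammas

x = np.arange(0, 50, 1.0)                               # positions in mm
ref = np.exp(-((x - 25) / 10) ** 2)
meas = np.exp(-((x - 25.5) / 10) ** 2)
g = gamma_1d(ref, meas, x, dose_tol=0.02, dist_tol_mm=1.0, local=True)
print(f"passing rate {np.mean(g <= 1):.1%}")            # 2%/1 mm criterion
```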

  1. Neuropsychological constraints to human data production on a global scale

    NASA Astrophysics Data System (ADS)

    Gros, C.; Kaczor, G.; Marković, D.

    2012-01-01

    Which factors underlie human information production on a global level? In order to gain insight into this question we study a corpus of 252-633 million publicly available data files on the Internet, corresponding to an overall storage volume of 284-675 Terabytes. Analyzing the file size distribution for several distinct data types, we find indications that the neuropsychological capacity of the human brain to process and record information may constitute the dominant limiting factor for the overall growth of globally stored information, with real-world economic constraints having only a negligible influence. This supposition draws support from the observation that the file size distributions follow a power law for data without a time component, like images, and a log-normal distribution for multimedia files, for which time is a defining quale.
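
    The distributional comparison behind this conclusion can be sketched as fitting both candidate models to observed file sizes and comparing log-likelihoods. The sketch below uses textbook maximum-likelihood estimates on synthetic data; a real analysis would need careful tail and cutoff handling.

```python
# Fit a log-normal and a power law to file sizes and compare likelihoods.
import numpy as np

def compare_fits(sizes):
    logs = np.log(sizes)
    mu, sigma = logs.mean(), logs.std()                    # log-normal MLE
    ll_lognorm = (-len(sizes) * np.log(sigma * np.sqrt(2 * np.pi))
                  - np.sum((logs - mu) ** 2) / (2 * sigma**2) - logs.sum())
    xmin = sizes.min()
    alpha = 1 + len(sizes) / np.sum(np.log(sizes / xmin))  # power-law MLE
    ll_power = (len(sizes) * np.log((alpha - 1) / xmin)
                - alpha * np.sum(np.log(sizes / xmin)))
    return ll_lognorm, ll_power, alpha

sizes = np.random.default_rng(3).lognormal(10, 2, 10000)   # synthetic "files"
print(compare_fits(sizes))
```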

  2. SU-E-T-100: Designing a QA Tool for Enhance Dynamic Wedges Based On Dynalog Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yousuf, A; Hussain, A

    2014-06-01

    Purpose: A robust quality assurance (QA) program for computer-controlled enhanced dynamic wedges (EDW) has been designed and tested. The calculations performed in this QA test are based upon the EDW dynamic log files generated during dose delivery. Methods: The Varian record-and-verify system generates dynamic log (dynalog) files during dynamic dose delivery. The system-generated dynalog files contain information such as date and time of treatment, energy, monitor units, wedge orientation, and type of treatment. They also contain the expected calculated segmented treatment tables (STT) and the actual delivered STT for the treatment delivery as a verification record. These files can be used to assess the integrity and precision of the treatment plan delivery. The plans were delivered with a 6 MV beam from a Varian linear accelerator. For the available EDW angles (10°, 15°, 20°, 25°, 30°, 45°, and 60°), Varian STT values were used to manually calculate monitor units for each segment. The STT can also be used to calculate the EDW factors. Independent verification of fractional MUs per segment was performed against those generated from the dynalog files. The EDW factors used to calculate MUs in the TPS were dosimetrically verified in a solid water phantom with a semiflex chamber on the central axis. Results: EDW factors were generated from the STT provided by Varian and verified against practical measurements. The measurements agreed with the calculated EDW data to within 1%. Variation between the MUs per segment obtained from the dynalog files and those manually calculated was found to be less than 2%. Conclusion: An efficient and easy tool to perform routine QA of EDW is suggested. The method can be easily implemented in any institution without the need for expensive QA equipment. Errors of the order of ≥2% can be easily detected.
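
    The segment-wise MU check described above follows directly from the cumulative weights in the STT. A hedged sketch, with the STT weights and "delivered" values invented for illustration:

```python
# Fractional MU per segment from cumulative STT weights, compared against
# values parsed from the dynalog record.
import numpy as np

def mu_per_segment(stt_cumulative_weights, total_mu):
    w = np.asarray(stt_cumulative_weights)
    return np.diff(w) / w[-1] * total_mu        # MU delivered in each segment

planned = mu_per_segment([0.0, 0.12, 0.35, 0.71, 1.0], total_mu=200)
delivered = np.array([24.1, 45.8, 72.2, 57.9])  # stand-in for dynalog values
percent_diff = 100 * (delivered - planned) / planned
print(np.round(percent_diff, 2))                # flag any segment beyond 2%
```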

  3. 75 FR 27051 - Privacy Act of 1974: System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-13

    ... address and appears below: DOT/FMCSA 004 SYSTEM NAME: National Consumer Complaint Database (NCCDB.... A system, database, and procedures for filing and logging consumer complaints relating to household... are stored in an automated system operated and maintained at the Volpe National Transportation Systems...

  4. 20 CFR 655.201 - Temporary labor certification applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...

  5. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a... C. Bonine, M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05–075, 2013. [21] C. Bonine, “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS

  6. Development of a Methodology for Customizing Insider Threat Auditing on a Linux Operating System

    DTIC Science & Technology

    2010-03-01

    information /etc/group, passwd, gshadow, shadow, /security/opasswd 16 User A attempts to access User B directory 17 User A attempts to access User B file w/o...configuration Handled by audit rules for root actions Audit user write attempts to system files -w /etc/group -p wxa -w /etc/passwd -p wxa -w /etc/gshadow -p... information (/etc/group, /etc/passwd, /etc/gshadow, /etc/shadow, /etc/sudoers, /etc/security/opasswd) Procedure: 1. User2 logs into the system

  7. 75 FR 69644 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ..., organization, phone, fax, mobile, pager, Defense Switched Network (DSN) phone, other fax, other mobile, other.../Transport Layer Security (SSL/ TLS) connections, access control lists, file system permissions, intrusion detection and prevention systems and log monitoring. Complete access to all records is restricted to and...

  8. The RIACS Intelligent Auditing and Categorizing System

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1988-01-01

    The organization of the RIACS auditing package is described, along with installation instructions and guidance on interpreting the output. Instructions for setting up both local and remote file system auditing are given. Logging is done on a time-driven basis, and auditing runs in a passive mode.

  9. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  10. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  11. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  12. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  13. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  14. 46 CFR 380.24 - Schedule of retention periods and description of records.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... bonds, salvage data, and claim files; (4) Contracts, agreements, franchises, licenses, etc., such as subsidy, charter, ship construction, and pooling agreements; (5) Vessel operating records such as log... Administration: (1) Ship construction or reconversion records such as bids, plans, progress payments, and...

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Blankenship

    Natural fracture data from wells 33-7, 33A-7, 52A-7, 52B-7 and 83-11 at West Flank. Fracture orientations were determined from image logs of these wells (see accompanying submissions). Data files contain depth, apparent (in wellbore reference frame) and true (in geographic reference frame) azimuth and dip, respectively.

  16. TU-D-209-05: Automatic Calculation of Organ and Effective Dose for CBCT and Interventional Fluoroscopic Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Oines, A

    Purpose: To compare PCXMC and EGSnrc calculated organ and effective radiation doses from cone-beam computed tomography (CBCT) and interventional fluoroscopically-guided procedures using automatic exposure-event grouping. Methods: For CBCT, we used PCXMC20Rotation.exe to automatically calculate the doses and compared the results to those calculated using EGSnrc with the Zubal patient phantom. For interventional procedures, we use the dose tracking system (DTS) which we previously developed to produce a log file of all geometry and exposure parameters for every x-ray pulse during a procedure, and the data in the log file is input into PCXMC and EGSnrc for dose calculation. A MATLAB program reads data from the log files and groups similar exposures to reduce calculation time. The definition files are then automatically generated in the format used by PCXMC and EGSnrc. Processing is done at the end of the procedure after all exposures are completed. Results: For the Toshiba Infinix CBCT LCI-Middle-Abdominal protocol, most organ doses calculated with PCXMC20Rotation closely matched those calculated with EGSnrc. The effective doses were 33.77 mSv with PCXMC20Rotation and 32.46 mSv with EGSnrc. For a simulated interventional cardiac procedure, similar close agreement in organ dose was obtained between the two codes; the effective doses were 12.02 mSv with PCXMC and 11.35 mSv with EGSnrc. The calculations can be completed on a PC without manual intervention in less than 15 minutes with PCXMC and in about 10 hours with EGSnrc, depending on the level of data grouping and accuracy desired. Conclusion: Effective dose and most organ doses in CBCT and interventional radiology calculated by PCXMC closely match those calculated by EGSnrc. Data grouping, which can be done automatically, makes the calculation time with PCXMC on a standard PC acceptable. This capability expands the dose information that can be provided by the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
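
    The exposure grouping step can be pictured as binning log entries by rounded geometry and summing their mAs; the field names and bin widths below are illustrative assumptions, not the actual MATLAB program:

```python
# Merge consecutive exposure events whose geometry matches after rounding,
# summing mAs, to cut the number of Monte Carlo simulation inputs.
from itertools import groupby

def group_exposures(events):
    """events: dicts with kVp, angle_deg, table_cm and mAs from the log file."""
    def key(e):
        return (e["kVp"], round(e["angle_deg"] / 5) * 5, round(e["table_cm"]))
    grouped = []
    for k, run in groupby(sorted(events, key=key), key=key):
        run = list(run)
        grouped.append({"kVp": k[0], "angle_deg": k[1], "table_cm": k[2],
                        "mAs": sum(e["mAs"] for e in run)})
    return grouped

log = [{"kVp": 80, "angle_deg": 1.2, "table_cm": 10.1, "mAs": 2.0},
       {"kVp": 80, "angle_deg": 1.6, "table_cm": 10.0, "mAs": 1.5}]
print(group_exposures(log))   # both pulses fall in the same 5-degree bin
```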

  17. VizieR Online Data Catalog: New atmospheric parameters of MILES cool stars (Sharma+, 2016)

    NASA Astrophysics Data System (ADS)

    Sharma, K.; Prugniel, P.; Singh, H. P.

    2015-11-01

    MILES V2 spectral interpolator. The FITS file is an improved version of the MILES interpolator previously presented in PVK. It contains the coefficients of the interpolator, which allow one to compute an interpolated spectrum, given an effective temperature, log of surface gravity and metallicity (Teff, logg, and [Fe/H]). The file consists of three extensions containing the three temperature regimes described in the paper:
      Extension 0: warm, Teff 4000-9000 K
      Extension 1: hot,  Teff >7000 K
      Extension 2: cold, Teff <4550 K
    The three functions are linearly interpolated in the overlapping Teff regions. Each extension contains a 2D image-type array whose first axis is the wavelength, described by a WCS (air wavelength, starting at 3536 Å, step = 0.9 Å). This FITS file can be used by ULySS v1.3 or higher. (5 data files).

  18. VizieR Online Data Catalog: Distances to RRab stars from WISE and Gaia (Sesar+, 2017)

    NASA Astrophysics Data System (ADS)

    Sesar, B.; Fouesneau, M.; Price-Whelan, A. M.; Bailer-Jones, C. A. L.; Gould, A.; Rix, H.-W.

    2017-10-01

    To constrain the period-luminosity-metallicity (PLZ) relations for RR Lyrae stars in the WISE W1 and W2 bands, we use TGAS trigonometric parallaxes (ϖ), spectroscopic metallicities ([Fe/H]; Fernley+ 1998, J/A+A/330/515), log-periods (logP, base 10), and apparent magnitudes (m; Klein+ 2014, J/MNRAS/440/L96) for 102 RRab stars within ~2.5 kpc of the Sun. The E(B-V) reddening at a star's position is obtained from the Schlegel+ (1998ApJ...500..525S) dust map. (1 data file).

  19. Wave-Ice Interaction and the Marginal Ice Zone

    DTIC Science & Technology

    2013-09-30

    concept, using a high-quality attitude and heading reference system (AHRS) together with an accurate twin-antennae GPS compass. The instruments logged...the AHRS parameters at 50 Hz, together with GPS-derived fixes, heading (accurate to better than 1°) and velocities at 10 Hz. The 30 MB hourly files

  20. Log on to the Future: One School's Success Story.

    ERIC Educational Resources Information Center

    Hovenic, Ginger

    This paper describes Clear View Elementary School's (California) successful experience with integrating technology into the curriculum. Since its inception seven years ago, the school has acquired 250 computers, networked them all on two central file servers, and computerized the library and trained all staff members to be proficient facilitators…

  1. 40 CFR 146.14 - Information to be considered by the Director.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., logging procedures, deviation checks, and a drilling, testing, and coring program; and (16) A certificate... information listed below which are current and accurate in the file. For a newly drilled Class I well, the..., construction, date drilled, location, depth, record of plugging and/or completion, and any additional...

  2. All Aboard the Internet.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)

  3. The Internet and Technical Services: A Point Break Approach.

    ERIC Educational Resources Information Center

    McCombs, Gillian M.

    1994-01-01

    Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…

  4. 77 FR 35956 - Appalachian Power Company; Notice of Application Accepted for Filing, Soliciting Motions To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ...) screened intake structures; (3) a concrete powerhouse containing three turbine-generator units with a total... structures; (3) a concrete powerhouse containing three turbine-generator units with a total installed... by a log boom; (2) screened intake structures; (3) a concrete powerhouse containing three turbine...

  5. Library Web Proxy Use Survey Results.

    ERIC Educational Resources Information Center

    Murray, Peter E.

    2001-01-01

    Outlines the use of proxy Web servers by libraries and reports on a survey on their use in libraries. Highlights include proxy use for remote resource access, for filtering, for bandwidth conservation, and for gathering statistics; privacy policies regarding the use of proxy server log files; and a copy of the survey. (LRW)

  6. Analyzing Medical Image Search Behavior: Semantics and Prediction of Query Results.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Kahn, Charles E; Müller, Henning

    2015-10-01

    Log files of information retrieval systems that record user behavior have been used to improve the outcomes of retrieval systems, understand user behavior, and predict events. In this article, a log file of the ARRS GoldMiner search engine containing 222,005 consecutive queries is analyzed. Time stamps are available for each query, as well as masked IP addresses, which makes it possible to identify queries from the same person. This article describes the ways in which physicians (or Internet searchers interested in medical images) search and proposes potential improvements through suggested query modifications. For example, many queries contain only a few terms and therefore are not specific; others contain spelling mistakes or non-medical terms that likely lead to poor or empty results. One of the goals of this report is to predict the number of results a query will have, since such a model allows search engines to automatically propose query modifications in order to avoid result lists that are empty or too large. This prediction is made based on characteristics of the query terms themselves. Prediction of empty results has an accuracy above 88%, and thus can be used to automatically modify the query to avoid empty result sets for a user. The semantic analysis and data on reformulations made by users in the past can aid the development of better search systems, particularly to improve results for novice users. This paper therefore provides important ideas toward better understanding how people search and how to use this knowledge to improve the performance of specialized medical search engines.
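
    The result-count prediction idea can be sketched as a small classifier over features of the query terms; the features, vocabulary, and training data below are invented for illustration:

```python
# Predict empty result sets from simple query-term features using a tiny
# hand-rolled logistic regression.
import numpy as np

def features(query, vocabulary):
    terms = query.lower().split()
    in_vocab = sum(t in vocabulary for t in terms)
    return [len(terms), in_vocab / max(len(terms), 1), max(len(t) for t in terms)]

def train_logreg(X, y, lr=0.1, epochs=500):
    X = np.c_[np.ones(len(X)), X]                      # bias column
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))                   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)               # gradient step
    return w

vocab = {"chest", "ct", "fracture", "mri"}
queries = ["chest ct", "mri fracture", "zzq qqz", "chest zzq"]
X = np.array([features(q, vocab) for q in queries], dtype=float)
y = np.array([0, 0, 1, 1])                             # 1 = empty result set
print(train_logreg(X, y))
```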

  7. Comparing image search behaviour in the ARRS GoldMiner search engine and a clinical PACS/RIS.

    PubMed

    De-Arteaga, Maria; Eggel, Ivan; Do, Bao; Rubin, Daniel; Kahn, Charles E; Müller, Henning

    2015-08-01

    Information search has changed the way we manage knowledge, and the ubiquity of information access has made search a frequent activity, whether via Internet search engines or, increasingly, via mobile devices. Medical information search is in this respect no different, and much research has been devoted to analyzing the way in which physicians aim to access information. Medical image search is a much smaller domain but has gained much attention as it has different characteristics than search for text documents. While web search log files have been analysed many times to better understand user behaviour, the log files of hospital-internal systems for search in a PACS/RIS (Picture Archival and Communication System, Radiology Information System) have rarely been analysed. Such a comparison between a hospital PACS/RIS search and a web system for searching images of the biomedical literature is the goal of this paper. The objectives are to identify similarities and differences in the search behaviour of the two systems, which could then be used to optimize existing systems and build new search engines. Log files of the ARRS GoldMiner medical image search engine (freely accessible on the Internet) containing 222,005 queries, and log files of Stanford's internal PACS/RIS search, called radTF, containing 18,068 queries were analysed. Each query was preprocessed and all query terms were mapped to the RadLex (Radiology Lexicon) terminology, a comprehensive lexicon of radiology terms created and maintained by the Radiological Society of North America, so that the semantic content in the queries and the links between terms could be analysed and synonyms for the same concept could be detected. RadLex was mainly created for use in radiology reports, to aid structured reporting and the preparation of educational material (Langlotz, 2006) [1]. In standard medical vocabularies such as MeSH (Medical Subject Headings) and UMLS (Unified Medical Language System), specific radiology terms are often underrepresented; therefore, RadLex was considered the best option for this task. The results show a surprising similarity between the usage behaviour in the two systems, but several subtle differences can also be noted. The average number of terms per query is 2.21 for GoldMiner and 2.07 for radTF; the RadLex axes used (anatomy, pathology, findings, …) have almost the same distribution, with clinical findings being the most frequent and anatomical entities second; also, combinations of RadLex axes are extremely similar between the two systems. Differences include longer sessions in radTF than in GoldMiner (3.4 and 1.9 queries per session on average). Several frequent search terms overlap, but some strong differences exist in the details. In radTF the term "normal" is frequent, whereas in GoldMiner it is not. This makes intuitive sense, as normal cases are rarely described in the literature, whereas in clinical work the comparison with normal cases is often a first step. The general similarity in many points is likely due to the fact that users of the two systems are influenced by their daily behaviour in using standard web search engines and follow this behaviour in their professional search. This means that many results and insights gained from standard web search can likely be transferred to more specialized search systems. Still, specialized log files can be used to learn more about the reformulations and detailed strategies users employ to find the right content. Copyright © 2015 Elsevier Inc. All rights reserved.
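
    As a rough illustration of the kind of query-log preprocessing described above, the following Python sketch tokenizes free-text queries, maps tokens to preferred concepts via a tiny synonym table standing in for RadLex, and reports the average number of terms per query together with the most frequent concepts. The file name, the one-query-per-line log format, and the synonym table are assumptions made for illustration only, not the preprocessing pipeline used in the paper.

    from collections import Counter

    # Toy stand-in for a RadLex-style synonym map; the real lexicon is far larger.
    RADLEX_SYNONYMS = {
        "pulmonary": "lung",
        "lung": "lung",
        "broken": "fracture",
        "fracture": "fracture",
        "normal": "normal",
    }

    def load_queries(path):
        """Read one free-text query per line from a plain-text log (assumed format)."""
        with open(path, encoding="utf-8") as fh:
            return [line.strip().lower() for line in fh if line.strip()]

    def map_terms(query):
        """Map each query token to its preferred concept, keeping unmapped tokens as-is."""
        return [RADLEX_SYNONYMS.get(tok, tok) for tok in query.split()]

    def summarize(queries):
        """Return the average number of terms per query and the ten most frequent concepts."""
        concept_counts = Counter()
        total_terms = 0
        for q in queries:
            terms = map_terms(q)
            total_terms += len(terms)
            concept_counts.update(terms)
        avg_terms = total_terms / len(queries) if queries else 0.0
        return avg_terms, concept_counts.most_common(10)

    if __name__ == "__main__":
        queries = load_queries("goldminer_queries.txt")  # hypothetical file name
        avg, top = summarize(queries)
        print(f"average terms per query: {avg:.2f}")
        print("most frequent concepts:", top)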

  8. SU-F-T-308: Mobius FX Evaluation and Comparison Against a Commercial 4D Detector Array for VMAT Plan QA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vazquez Quino, L; Huerta Hernandez, C; Morrow, A

    2016-06-15

    Purpose: To evaluate the use of MobiusFX as a pre-treatment verification IMRT QA tool and compare it with a commercial 4D detector array for VMAT plan QA. Methods: 15 VMAT QA plans for different treatment sites were delivered and measured by traditional means with the 4D detector array ArcCheck (Sun Nuclear Corporation); at the same time, the linac treatment logs (Varian Dynalog files) from the same deliveries were analyzed with the MobiusFX software (Mobius Medical Systems). The VMAT QA plans, created in the Eclipse treatment planning system (Varian) for a TrueBeam linac (Varian), were delivered and analyzed with the gamma analysis routine of the SNPA software (Sun Nuclear Corporation). Results: Comparable results in terms of the gamma analysis, with an average gamma passing rate of 99.06% at the 3%/3 mm criterion, were observed in the comparison among MobiusFX, the ArcCheck measurements, and the dose calculated by the treatment planning system. With a stricter criterion (1%/1 mm), larger discrepancies are observed in different regions of the measurements, with an average gamma passing rate of 66.24% between MobiusFX and ArcCheck. Conclusion: This work indicates the potential for using MobiusFX as a routine pre-treatment patient-specific IMRT quality assurance method and highlights its advantages as a phantom-less method that reduces the time required for IMRT QA measurements. MobiusFX is capable of producing results similar to those of the traditional methods used for patient-specific pre-treatment VMAT QA verification. Even though the gamma results compared with the TPS are similar, the analysis of both methods shows that the errors identified by each method are found in different regions. Traditional methods such as ArcCheck are sensitive to setup errors and to dose-difference errors arising from the linac output. Linac log file analysis, on the other hand, records errors in the VMAT QA associated with MLC and gantry motion that cannot be detected by traditional methods.
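
    Gamma analysis of the kind referenced above compares two dose distributions using a combined dose-difference and distance-to-agreement criterion. The Python sketch below is a simplified, brute-force global 2D gamma pass-rate calculation for illustration only; it is not the algorithm implemented in the SNPA or MobiusFX software, and the grid spacing, criteria, and synthetic dose planes are assumptions.

    import numpy as np

    def gamma_pass_rate(ref, eval_, spacing_mm, dose_tol=0.03, dta_mm=3.0, threshold=0.10):
        """Brute-force global gamma analysis on two 2D dose grids of equal shape.
        dose_tol is a fraction of the reference maximum; dta_mm is the distance-to-agreement.
        Points below `threshold` of the maximum reference dose are excluded."""
        ref = np.asarray(ref, dtype=float)
        eval_ = np.asarray(eval_, dtype=float)
        dmax = ref.max()
        dose_crit = dose_tol * dmax
        ny, nx = ref.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        gammas = []
        win = int(np.ceil(3 * dta_mm / spacing_mm))  # limit the search window for speed
        for iy in range(ny):
            for ix in range(nx):
                if ref[iy, ix] < threshold * dmax:
                    continue
                y0, y1 = max(0, iy - win), min(ny, iy + win + 1)
                x0, x1 = max(0, ix - win), min(nx, ix + win + 1)
                dist2 = ((yy[y0:y1, x0:x1] - iy) ** 2 + (xx[y0:y1, x0:x1] - ix) ** 2) * spacing_mm ** 2
                ddiff2 = (eval_[y0:y1, x0:x1] - ref[iy, ix]) ** 2
                gamma2 = dist2 / dta_mm ** 2 + ddiff2 / dose_crit ** 2
                gammas.append(np.sqrt(gamma2.min()))
        gammas = np.array(gammas)
        return 100.0 * np.mean(gammas <= 1.0)

    # Example with synthetic dose planes on a 1 mm grid:
    ref = np.random.rand(50, 50) * 2.0
    eval_ = ref + np.random.normal(0, 0.02, ref.shape)
    print(f"gamma pass rate (3%/3mm): {gamma_pass_rate(ref, eval_, spacing_mm=1.0):.1f}%")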

  9. PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas

    NASA Astrophysics Data System (ADS)

    Hauger, J. S.; Borton, D. N.

    1982-07-01

    A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.

  10. PKI solar thermal plant evaluation at Capitol Concrete Products, Topeka, Kansas

    NASA Technical Reports Server (NTRS)

    Hauger, J. S.; Borton, D. N.

    1982-01-01

    A system feasibility test to determine the technical and operational feasibility of using a solar collector to provide industrial process heat is discussed. The test is of a solar collector system in an industrial test bed plant at Capitol Concrete Products in Topeka, Kansas, with an experiment control at Sandia National Laboratories, Albuquerque. Plant evaluation will occur during a year-long period of industrial utilization. It will include performance testing, operability testing, and system failure analysis. Performance data will be recorded by a data acquisition system. User, community, and environmental inputs will be recorded in logs, journals, and files. Plant installation, start-up, and evaluation, are anticipated for late November, 1981.

  11. The Added Value of Log File Analyses of the Use of a Personal Health Record for Patients With Type 2 Diabetes Mellitus

    PubMed Central

    Kelders, Saskia M.; Braakman-Jansen, Louise M. A.; van Gemert-Pijnen, Julia E. W. C.

    2014-01-01

    The electronic personal health record (PHR) is a promising technology for improving the quality of chronic disease management. Until now, evaluations of such systems have provided only little insight into why a particular outcome occurred. The aim of this study is to gain insight into the navigation process (which functionalities are used, and in what sequence) of e-Vita, a PHR for patients with type 2 diabetes mellitus (T2DM), in order to increase the efficiency of the system and improve long-term adherence. Log data of the first visits in the first 6 weeks after the release of a renewed version of e-Vita were analyzed to identify the usage patterns that emerge when users explore a new application. After receiving the invitation, 28% of all registered users visited e-Vita. In total, 70 unique usage patterns could be identified. When users visited the education service first, 93% of all users ended their session. Most users visited either 1 or 5 or more services during their first session, but the distribution of the routes was diffuse. In conclusion, log file analyses can provide valuable prompts for improving the system design of a PHR. In this way, the match between the system and its users, and thus long-term adherence, has the potential to increase. PMID:24876574
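
    A minimal sketch of how first-session navigation paths might be extracted from page-view logs is given below. It assumes a CSV log with user_id, timestamp, and service columns and a 30-minute idle gap as the session boundary; these details are illustrative assumptions, not the e-Vita logging format.

    import csv
    from collections import defaultdict, Counter
    from datetime import datetime

    def load_events(path):
        """Read page-view events; assumed columns: user_id, timestamp (ISO 8601), service."""
        events = defaultdict(list)
        with open(path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                ts = datetime.fromisoformat(row["timestamp"])
                events[row["user_id"]].append((ts, row["service"]))
        return events

    def first_session_paths(events, gap_minutes=30):
        """Return each user's first-session navigation path, splitting sessions on idle gaps."""
        paths = {}
        for user, rows in events.items():
            rows.sort()
            path, prev = [], None
            for ts, service in rows:
                if prev is not None and (ts - prev).total_seconds() > gap_minutes * 60:
                    break  # first session ended
                path.append(service)
                prev = ts
            paths[user] = tuple(path)
        return paths

    if __name__ == "__main__":
        paths = first_session_paths(load_events("phr_pageviews.csv"))  # hypothetical file
        print("distinct usage patterns:", len(set(paths.values())))
        first_services = Counter(p[0] for p in paths.values() if p)
        print("most common first service:", first_services.most_common(3))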

  12. ILRS Station Reporting

    NASA Technical Reports Server (NTRS)

    Noll, Carey E.; Pearlman, Michael Reisman; Torrence, Mark H.

    2013-01-01

    Network stations provided system configuration documentation upon joining the ILRS. This information, found in the various site and system log files available on the ILRS website, is essential to the ILRS analysis centers, combination centers, and the general user community. Therefore, it is imperative that station personnel inform the ILRS community in a timely fashion when changes to the system occur. This poster provides information about the various documentation that must be maintained. The ILRS network consists of over fifty global sites actively ranging to over sixty satellites as well as five lunar reflectors. Information about these stations is available on the ILRS website (http://ilrs.gsfc.nasa.gov/network/stations/index.html). The ILRS Analysis Centers must have current information about the stations and their system configuration in order to use their data in the generation of derived products. However, not all information available on the ILRS website is as up to date as necessary for correct analysis of the data.

  13. PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.

    PubMed

    Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza

    2014-12-01

    The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions such as nucleic acids, water, ions, and drug molecules, which can therefore be described in the PDB format and have been deposited in the PDB database. A PDB file is machine generated and is not in a human-readable format; a computational tool is needed to read and understand it. The objective of our present study is to develop free online software for the retrieval, visualization, and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., the information in the PDB file is converted into readable sentences. The tool displays all possible information from a PDB file, including the 3D structure of that file. Programming and scripting languages such as Perl, CSS, Javascript, Ajax, and HTML have been used for the development of PDB Explorer. PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home with no requirement of log-in.
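
    As an illustration of the kind of parsing such a tool performs, the following Python sketch reads ATOM/HETATM records from a PDB file using the format's fixed column layout. It is a minimal standalone example, not the PDB Explorer code, and the input file name is hypothetical.

    def parse_pdb_atoms(path):
        """Parse ATOM/HETATM records from a PDB file using the fixed-column format."""
        atoms = []
        with open(path) as fh:
            for line in fh:
                if line.startswith(("ATOM", "HETATM")):
                    atoms.append({
                        "serial": int(line[6:11]),
                        "name": line[12:16].strip(),
                        "res_name": line[17:20].strip(),
                        "chain": line[21].strip(),
                        "res_seq": int(line[22:26]),
                        "x": float(line[30:38]),
                        "y": float(line[38:46]),
                        "z": float(line[46:54]),
                        "element": line[76:78].strip(),
                    })
        return atoms

    atoms = parse_pdb_atoms("1abc.pdb")  # hypothetical local PDB file
    print(f"{len(atoms)} atoms; first atom:", atoms[0] if atoms else None)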

  14. Fort Bliss Geothermal Area Data: Temperature profile, logs, schematic model and cross section

    DOE Data Explorer

    Adam Brandt

    2015-11-15

    This dataset contains a variety of data about the Fort Bliss geothermal area, part of the southern portion of the Tularosa Basin, New Mexico. The dataset contains schematic models for the McGregor Geothermal System and a shallow temperature survey of the Fort Bliss geothermal area. The dataset also contains Century OH logs, a full temperature profile, and complete logs from well RMI 56-5, including resistivity and porosity data, drill logs with drill rate, depth, lithology, mineralogy, fractures, temperature, pit total, gases, and descriptions among other measurements, as well as CDL, CNL, DIL, GR Caliper and Temperature files. A shallow (2 meter depth) temperature survey of the Fort Bliss geothermal area with 63 data points is also included. Two cross sections through the Fort Bliss area, also included, show well position and depth. The surface map included shows faults and well spatial distribution, along with inferred and observed fault distributions from gravity surveys around the Fort Bliss geothermal area.

  15. Methods for peak-flow frequency analysis and reporting for streamgages in or near Montana based on data through water year 2015

    USGS Publications Warehouse

    Sando, Steven K.; McCarthy, Peter M.

    2018-05-10

    This report documents the methods for peak-flow frequency (hereinafter “frequency”) analysis and reporting for streamgages in and near Montana following implementation of the Bulletin 17C guidelines. The methods are used to provide estimates of peak-flow quantiles for 50-, 42.9-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for selected streamgages operated by the U.S. Geological Survey Wyoming-Montana Water Science Center (WY–MT WSC). These annual exceedance probabilities correspond to 2-, 2.33-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-year recurrence intervals, respectively. Standard procedures specific to the WY–MT WSC for implementing the Bulletin 17C guidelines include (1) the use of the Expected Moments Algorithm analysis for fitting the log-Pearson Type III distribution, incorporating historical information where applicable; (2) the use of weighted skew coefficients (based on weighting at-site station skew coefficients with generalized skew coefficients from the Bulletin 17B national skew map); and (3) the use of the Multiple Grubbs-Beck Test for identifying potentially influential low flows. For some streamgages, the peak-flow records are not well represented by the standard procedures and require user-specified adjustments informed by hydrologic judgement. The specific characteristics of peak-flow records addressed by the informed-user adjustments include (1) regulated peak-flow records, (2) atypical upper-tail peak-flow records, and (3) atypical lower-tail peak-flow records. In all cases, the informed-user adjustments use the Expected Moments Algorithm fit of the log-Pearson Type III distribution using the at-site station skew coefficient, a manual potentially influential low flow threshold, or both. Appropriate methods can be applied to at-site frequency estimates to provide improved representation of long-term hydroclimatic conditions. The methods for improving at-site frequency estimates by weighting with regional regression equations and by Maintenance of Variance Extension Type III record extension are described. Frequency analyses were conducted for 99 example streamgages to indicate various aspects of the frequency-analysis methods described in this report. The frequency analyses and results for the example streamgages are presented in a separate data release associated with this report consisting of tables and graphical plots that are structured to include information concerning the interpretive decisions involved in the frequency analyses. Further, the separate data release includes the input files to the PeakFQ program, version 7.1, including the peak-flow data file and the analysis specification file that were used in the peak-flow frequency analyses. Peak-flow frequencies are also reported in separate data releases for selected streamgages in the Beaverhead River and Clark Fork Basins and also for selected streamgages in the Ruby, Jefferson, and Madison River Basins.
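
    As a rough illustration of the log-Pearson Type III fit that underlies such frequency estimates, the Python sketch below fits the distribution to annual peak flows by plain method of moments on the log-transformed peaks and uses the Wilson-Hilferty approximation of the frequency factor. It is not the Expected Moments Algorithm or the weighted-skew procedure described in the report, and the peak-flow values shown are made-up illustrative numbers.

    import numpy as np
    from scipy.stats import norm, skew

    def lp3_quantiles(annual_peaks_cfs, aeps=(0.5, 0.1, 0.04, 0.02, 0.01)):
        """Plain method-of-moments fit of the log-Pearson Type III distribution
        (NOT the Expected Moments Algorithm used in Bulletin 17C).
        Returns estimated peak flows for the given annual exceedance probabilities."""
        logq = np.log10(np.asarray(annual_peaks_cfs, dtype=float))
        m, s = logq.mean(), logq.std(ddof=1)
        g = skew(logq, bias=False)          # at-site station skew only
        out = {}
        for aep in aeps:
            z = norm.ppf(1.0 - aep)         # standard normal deviate
            if abs(g) < 1e-6:
                k = z
            else:
                # Wilson-Hilferty approximation of the Pearson Type III frequency factor
                k = (2.0 / g) * ((1.0 + g * z / 6.0 - g ** 2 / 36.0) ** 3 - 1.0)
            out[aep] = 10 ** (m + k * s)
        return out

    peaks = [12400, 8300, 15600, 9900, 21000, 7600, 11200, 18400, 6800, 13500]  # illustrative
    for aep, q in lp3_quantiles(peaks).items():
        print(f"AEP {aep:.1%}: {q:,.0f} cfs")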

  16. Disaster Radio for Communication of Vital Messages and Health-Related Information: Experiences From the Haiyan Typhoon, the Philippines.

    PubMed

    Hugelius, Karin; Gifford, Mervyn; Örtenwall, Per; Adolfsson, Annsofie

    2016-08-01

    Crisis communication is seen as an integrated and essential part of disaster management measures. After Typhoon Haiyan (Yolanda) in the Philippines in 2013, radio was used to broadcast information to the affected community. The aim of this study was to describe how disaster radio was used to communicate vital messages and health-related information to the public in one affected region after Typhoon Haiyan. A mixed-methods analysis using qualitative content analysis and descriptive statistics was used to analyze 2587 entries from the radio log files. Radio was used to give general information and to demonstrate the capability of officials to manage the situation, to encourage, to promote recovery and foster a sense of hope, and to give practical advice and encourage self-activity. The content and focus of the messages changed over time. Encouraging messages were the most frequently broadcast. Health-related messages were a minor part of all information broadcast, and gaps in the broadcasts over time were found. Disaster radio can serve as a transmitter of vital messages, including health-related information and psychological support, in disaster areas. The present study indicated the potential for increased use. The perception, impact, and use of disaster radio need to be further evaluated. (Disaster Med Public Health Preparedness. 2016;10:591-597).

  17. A compendium of P- and S-wave velocities from surface-to-borehole logging; summary and reanalysis of previously published data and analysis of unpublished data

    USGS Publications Warehouse

    Boore, David M.

    2003-01-01

    For over 28 years, the U.S. Geological Survey (USGS) has been acquiring seismic velocity and geologic data at a number of locations in California, many of which were chosen because strong ground motions from earthquakes were recorded at the sites. The method for all measurements involves picking first arrivals of P- and S-waves from a surface source recorded at various depths in a borehole (as opposed to noninvasive methods, such as the SASW method [e.g., Brown et al., 2002]). The results from most of the sites are contained in a series of U.S. Geological Survey Open-File Reports (see References). Until now, none of the results have been available as computer files, and before 1992 the interpretation of the arrival times was in terms of piecemeal interval velocities, with no attempt to derive a layered model that would fit the travel times in an overall sense (the one exception is Porcella, 1984). In this report I reanalyze all of the arrival times in terms of layered models for P- and for S-wave velocities at each site, and I provide the results as computer files. In addition to the measurements reported in the open-file reports, I also include some borehole results from other reports, as well as some results never before published. I include data for 277 boreholes (at the time of this writing; more will be added to the web site as they are obtained), all in California (I have data from boreholes in Washington and Utah, but these will be published separately). I am also in the process of interpreting travel time data obtained using a seismic cone penetrometer at hundreds of sites; these data can be interpreted in the same way as those obtained from surface-to-borehole logging. When available, the data will be added to the web site (see below for information on obtaining data from the World Wide Web (WWW)). In addition to the basic borehole data and results, I provide information concerning strong-motion stations that I judge to be close enough to the boreholes that the borehole velocity models can be used as the velocity models beneath the stations.

  18. Stratigraphic framework of Cambrian and Ordovician rocks in the central Appalachian basin from Medina County, Ohio, through southwestern and south-central Pennsylvania to Hampshire County, West Virginia: Chapter E.2.2 in Coal and petroleum resources in the Appalachian basin: distribution, geologic framework, and geochemical character

    USGS Publications Warehouse

    Ryder, Robert T.; Harris, Anita G.; Repetski, John E.; Crangle, Robert D.; Ruppert, Leslie F.; Ryder, Robert T.

    2014-01-01

    This chapter is a re-release of U.S. Geological Survey Bulletin 1839-K, of the same title, by Ryder and others (1992; online version 2.0 revised and digitized by Robert D. Crangle, Jr., 2003). It consists of one file of the report text as it appeared in USGS Bulletin 1839-K and a second file containing the cross section, figures 1 and 2, and tables 1 and 2 on one oversized sheet; the second file was digitized in 2003 as version 2.0 and also includes the gamma-ray well log traces.

  19. On the Automation of the MarkIII Data Analysis System.

    NASA Astrophysics Data System (ADS)

    Schwegmann, W.; Schuh, H.

    1999-03-01

    A faster and semiautomatic data analysis is an important contribution to the acceleration of the VLBI procedure. A concept for the automation of one of the most widely used VLBI software packages, the MarkIII Data Analysis System, was developed. Then, the program PWXCB, which extracts weather and cable calibration data from the station log files, was automated by supplementing the existing Fortran77 program code. The new program XLOG and its results will be presented. Most of the tasks in VLBI data analysis are very complex, and their automation requires typical knowledge-based techniques. Thus, a knowledge-based system (KBS) for the support and guidance of the analyst is being developed using the AI workbench BABYLON, which is based on methods of artificial intelligence (AI). The advantages of a KBS for the MarkIII Data Analysis System and the steps required to build a KBS will be demonstrated. Examples of the current status of the project will be given, too.

  20. General Chemistry Division. Quarterly report, July--September 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrar, J.E.

    1978-11-17

    Status of the following studies is given: nonaqueous titrimetry; molar absorbance of 1,3,5,-triamine-2,4,6,-trinitrobenzene in dimethylsulfoxide, potentiometric microdetermination of pentaerythritol tetranitrate (PETN) in PETN-containing composites; potentiometric semimicrodetermination of some tetrazoles with silver nitrate; applications of a mode-locked krypton ion laser; time-resolved spectroscopy; photoelectrochemistry; evaluation of a prototype atomic emission source system; laser spectroscopy of neptunium; high-performance liquid chromatography of polyphenyl ether; acquisition of a portable, computerized mass spectrometer; improved inlet for quantitative mass spectrometry; a computer data system for the UTI gas analyzers; analysis of perfluorobutene-2; examination of iridium coatings; source of high-intensity, polarized x rays for fluorescence analysis; mass spectrometer for the coal gasification field test; materials protection measurement guides; the LOG system of sample file control; and methylation of platinum compounds by methylcobalamin. (LK)

  1. Improved method estimating bioconcentration/bioaccumulation factor from octanol/water partition coefficient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meylan, W.M.; Howard, P.H.; Aronson, D.

    1999-04-01

    A compound's bioconcentration factor (BCF) is the most commonly used indicator of its tendency to accumulate in aquatic organisms from the surrounding medium. Because it is expensive to measure, the BCF is generally estimated from the octanol/water partition coefficient (Kow), but currently used regression equations were developed from small data sets that do not adequately represent the wide range of chemical substances now subject to review. To develop an improved method, the authors collected BCF data in a file that contained information on measured BCFs and other key experimental details for 694 chemicals. Log BCF was then regressed against log Kow, and chemicals with significant deviations from the line of best fit were analyzed by chemical structure. The resulting algorithm classifies a substance as either nonionic or ionic, the latter group including carboxylic acids, sulfonic acids and their salts, and quaternary N compounds. Log BCF for nonionics is estimated from log Kow and a series of correction factors if applicable; different equations apply for log Kow 1.0 to 7.0 and >7.0. For ionics, chemicals are categorized by log Kow and a log BCF in the range 0.5 to 1.75 is assigned. Organometallics, nonionics with long alkyl chains, and aromatic azo compounds receive special treatment. The correlation coefficient and mean error for log BCF indicate that the new method fits the existing data significantly better than other methods.
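
    The general shape of such a classification-plus-regression estimator is sketched below in Python. The slope, intercept, correction term, and ionic thresholds are placeholders labeled as illustrative; they are not the published Meylan et al. (1999) coefficients, which should be taken from the original paper.

    def estimate_log_bcf(log_kow, is_ionic=False, correction=0.0,
                         slope=0.66, intercept=-0.20):
        """Illustrative estimator of log BCF from log Kow.
        The slope/intercept are placeholders, NOT the published coefficients;
        `correction` stands in for structure-based correction factors applied
        to nonionic compounds."""
        if is_ionic:
            # Ionic compounds are binned by log Kow and assigned a fixed log BCF
            # in a narrow range (illustrative thresholds).
            if log_kow < 5.0:
                return 0.5
            return 1.75
        # Nonionic compounds: linear regression on log Kow plus corrections.
        return slope * log_kow + intercept + correction

    print(estimate_log_bcf(4.2))                 # nonionic example
    print(estimate_log_bcf(6.1, is_ionic=True))  # ionic example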

  2. Use of treatment log files in spot scanning proton therapy as part of patient-specific quality assurance

    PubMed Central

    Li, Heng; Sahoo, Narayan; Poenisch, Falk; Suzuki, Kazumichi; Li, Yupeng; Li, Xiaoqiang; Zhang, Xiaodong; Lee, Andrew K.; Gillin, Michael T.; Zhu, X. Ronald

    2013-01-01

    Purpose: The purpose of this work was to assess the monitor unit (MU) values and position accuracy of spot scanning proton beams as recorded by the daily treatment logs of the treatment control system, and furthermore establish the feasibility of using the delivered spot positions and MU values to calculate and evaluate delivered doses to patients. Methods: To validate the accuracy of the recorded spot positions, the authors generated and executed a test treatment plan containing nine spot positions, to which the authors delivered ten MU each. The spot positions were measured with radiographic films and Matrixx 2D ion-chambers array placed at the isocenter plane and compared for displacements from the planned and recorded positions. Treatment logs for 14 patients were then used to determine the spot MU values and position accuracy of the scanning proton beam delivery system. Univariate analysis was used to detect any systematic error or large variation between patients, treatment dates, proton energies, gantry angles, and planned spot positions. The recorded patient spot positions and MU values were then used to replace the spot positions and MU values in the plan, and the treatment planning system was used to calculate the delivered doses to patients. The results were compared with the treatment plan. Results: Within a treatment session, spot positions were reproducible within ±0.2 mm. The spot positions measured by film agreed with the planned positions within ±1 mm and with the recorded positions within ±0.5 mm. The maximum day-to-day variation for any given spot position was within ±1 mm. For all 14 patients, with ∼1 500 000 spots recorded, the total MU accuracy was within 0.1% of the planned MU values, the mean (x, y) spot displacement from the planned value was (−0.03 mm, −0.01 mm), the maximum (x, y) displacement was (1.68 mm, 2.27 mm), and the (x, y) standard deviation was (0.26 mm, 0.42 mm). The maximum dose difference between calculated dose to the patient based on the plan and recorded data was within 2%. Conclusions: The authors have shown that the treatment log file in a spot scanning proton beam delivery system is precise enough to serve as a quality assurance tool to monitor variation in spot position and MU value, as well as the delivered dose uncertainty from the treatment delivery system. The analysis tool developed here could be useful for assessing spot position uncertainty and thus dose uncertainty for any patient receiving spot scanning proton beam therapy. PMID:23387726
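
    The sketch below illustrates, in Python, the sort of planned-versus-delivered spot comparison the treatment logs make possible: per-spot position offsets and the total MU difference. The array layout (x, y, MU per spot, matched row-for-row by spot index) and the example numbers are illustrative assumptions, not the format of the actual treatment control system logs.

    import numpy as np

    def compare_spots(planned, delivered):
        """Compare planned vs. log-recorded spot positions (mm) and MU values.
        `planned` and `delivered` are arrays of shape (n_spots, 3): x, y, MU,
        assumed to be matched row-for-row by spot index."""
        planned = np.asarray(planned, dtype=float)
        delivered = np.asarray(delivered, dtype=float)
        dxy = delivered[:, :2] - planned[:, :2]
        mu_diff_pct = 100.0 * (delivered[:, 2].sum() - planned[:, 2].sum()) / planned[:, 2].sum()
        return {
            "mean_dx_dy_mm": dxy.mean(axis=0),
            "max_abs_dx_dy_mm": np.abs(dxy).max(axis=0),
            "std_dx_dy_mm": dxy.std(axis=0, ddof=1),
            "total_mu_diff_pct": mu_diff_pct,
        }

    # Illustrative numbers only:
    planned = np.array([[10.0, -5.0, 0.012], [12.0, -5.0, 0.010], [14.0, -5.0, 0.011]])
    delivered = planned + np.array([[0.1, -0.05, 0.0], [-0.2, 0.0, 0.00001], [0.05, 0.1, 0.0]])
    print(compare_spots(planned, delivered))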

  3. Production, prices, employment, and trade in Northwest forest industries, third quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries: international trade in logs, lumber, and plywood: volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  4. Production, prices, employment, and trade in Northwest forest industries, all quarters 2000.

    Treesearch

    Debra D. Warren

    2002-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  5. Production, prices, employment, and trade in Northwest forest industries, all quarters 2002.

    Treesearch

    Debra D. Warren

    2004-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  6. Production, prices, employment, and trade in Northwest forest industries, all quarters 2005.

    Treesearch

    Debra D. Warren

    2007-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  7. Production, prices, employment, and trade in Northwest forest industries, all quarters 2006.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  8. Production, prices, employment, and trade in Northwest forest industries, all quarters 2004.

    Treesearch

    Debra D. Warren

    2006-01-01

    Provides current information on lumber and plywood production and prices; employment in forest industries; international trade in logs, lumber, and plywood; volumes and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  9. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  10. Using Learning Styles and Viewing Styles in Streaming Video

    ERIC Educational Resources Information Center

    de Boer, Jelle; Kommers, Piet A. M.; de Brock, Bert

    2011-01-01

    Improving the effectiveness of learning when students observe video lectures becomes urgent with the rising advent of (web-based) video materials. Vital questions are how students differ in their learning preferences and what patterns in viewing video can be detected in log files. Our experiments inventory students' viewing patterns while watching…

  11. Recommendations for Benchmarking Web Site Usage among Academic Libraries.

    ERIC Educational Resources Information Center

    Hightower, Christy; Sih, Julie; Tilghman, Adam

    1998-01-01

    To help library directors and Web developers create a benchmarking program to compare statistics of academic Web sites, the authors analyzed the Web server log files of 14 university science and engineering libraries. Recommends a centralized voluntary reporting structure coordinated by the Association of Research Libraries (ARL) and a method for…

  12. Motivational Aspects of Learning Genetics with Interactive Multimedia

    ERIC Educational Resources Information Center

    Tsui, Chi-Yan; Treagust, David F.

    2004-01-01

    A BioLogica trial in six U.S. schools, conducted by the Concord Consortium using an interpretive approach, examined student motivation in learning genetics. Multiple data sources, such as online tests, computer data log files, and classroom observations, were used; the results are reported in terms of interviewees' perceptions, class-wide online…

  13. 16. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-5/8 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, NORTHEAST CORNER, INTERPRETIVE LOG TO LEFT. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  14. 20 CFR 658.410 - Establishment of State agency JS complaint system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... system. At the local office level, the local office manager shall be responsible for the management of... related), the local office manager shall transmit a copy of that portion of the log containing the... established for the handling of complaints and files relating to the handling of complaints. The Manager or...

  15. 76 FR 4463 - Privacy Act of 1974; Report of Modified or Altered System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-25

    ... occupationally related mortality or morbidity is occurring. In the event of litigation where the defendant is: (a... diseases and which provides for the confidentiality of the information. In the event of litigation..., limited log-ins, virus protection, and user rights/file attribute restrictions. Password protection...

  16. Production, prices, employment, and trade in Northwest forest industries, all quarters 1998.

    Treesearch

    Debra D. Warren

    2000-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  17. Production, prices, employment, and trade in Northwest forest industries, fourth quarter 1996.

    Treesearch

    Debra D. Warren

    1997-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  18. Production, prices, employment, and trade in Northwest forest industries, all quarters of 2007.

    Treesearch

    Debra D. Warren

    2008-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  19. Production, prices, employment, and trade in Northwest forest industries, all quarters 2003.

    Treesearch

    Debra D. Warren

    2005-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  20. Production, prices, employment, and trade in Northwest forest industries, all quarters 2008

    Treesearch

    Debra Warren

    2009-01-01

    Provides current information on lumber and plywood production and prices; employment in the forest industries; international trade in logs, lumber, and plywood; volume and average prices of stumpage sold by public agencies; and other related items. View an updated online version of this publication with downloadable excel files at

  1. Data Retention Policy | High-Performance Computing | NREL

    Science.gov Websites

    HPC Data Retention Policy. File storage areas on Peregrine and Gyrfalcon are either user-centric … to reclaim storage. We can make special arrangements for permanent storage, if needed. The user-centric retention period is 3 months after the last project ends. During this retention period, the user may log in to…

  2. Elementary School Students' Strategic Learning: Does Task-Type Matter?

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvelä, Sanna; Kirschner, Paul A.

    2014-01-01

    This study investigated what types of learning patterns and strategies elementary school students use to carry out ill- and well-structured tasks. Specifically, it was investigated which and when learning patterns actually emerge with respect to students' task solutions. The present study uses computer log file traces to investigate how…

  3. Patterns in Elementary School Students' Strategic Actions in Varying Learning Situations

    ERIC Educational Resources Information Center

    Malmberg, Jonna; Järvenoja, Hanna; Järvelä, Sanna

    2013-01-01

    This study uses log file traces to examine differences between high-and low-achieving students' strategic actions in varying learning situations. In addition, this study illustrates, in detail, what strategic and self-regulated learning constitutes in practice. The study investigates the learning patterns that emerge in learning situations…

  4. Online Persistence in Higher Education Web-Supported Courses

    ERIC Educational Resources Information Center

    Hershkovitz, Arnon; Nachmias, Rafi

    2011-01-01

    This research consists of an empirical study of online persistence in Web-supported courses in higher education, using Data Mining techniques. Log files of 58 Moodle websites accompanying Tel Aviv University courses were drawn, recording the activity of 1189 students in 1897 course enrollments during the academic year 2008/9, and were analyzed…

  5. 78 FR 56873 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... on the respondents, including the use of automated collection techniques or other forms of....: 3060-0360. Title: Section 80.409, Station Logs (Maritime Services). Form No.: N/A. Type of Review... the claim or complaint has been satisfied or barred by statute limiting the time for filing suits upon...

  6. Rapid Diagnostics of Onboard Sequences

    NASA Technical Reports Server (NTRS)

    Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.

    2012-01-01

    Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the RESTful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a RESTful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
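
    The hash-keyed lookup described above can be illustrated with a few lines of Python. The server URL, the JSON response fields, and the EVR field name used for the hash are all hypothetical; the actual MSL sequencing-server interface is not described in enough detail in the abstract to reproduce it.

    import json
    import re
    import urllib.request

    # Hypothetical endpoint layout, assumed here for illustration only.
    SERVER = "https://sequencing-server.example/api/sequences"

    def fetch_sequence(scmf_hash):
        """Fetch RML source and SCMF metadata for a sequence hash (assumed JSON API)."""
        with urllib.request.urlopen(f"{SERVER}/{scmf_hash}") as resp:
            return json.load(resp)

    def annotate_evrs(evr_lines):
        """Annotate EVR lines with the originating sequence, keyed on an embedded hash."""
        hash_pattern = re.compile(r"SEQ_HASH=([0-9a-f]{8,})")  # assumed EVR field name
        for line in evr_lines:
            match = hash_pattern.search(line)
            if match:
                seq = fetch_sequence(match.group(1))
                yield f"{line}    # from {seq.get('rml_name', '?')}"
            else:
                yield line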

  7. The effect of MLC speed and acceleration on the plan delivery accuracy of VMAT

    PubMed Central

    Park, J M; Wu, H-G; Kim, J H; Carlson, J N K

    2015-01-01

    Objective: To determine a new metric utilizing multileaf collimator (MLC) speeds and accelerations to predict the plan delivery accuracy of volumetric modulated arc therapy (VMAT). Methods: To verify VMAT delivery accuracy, gamma evaluations, analysis of mechanical parameter differences between plans and log files, and analysis of changes in dose–volumetric parameters between plans and plans reconstructed from log files were performed for 40 VMAT plans. The average proportion of leaf speeds ranging from l to h cm/s (S_l–h, with l–h = 0–0.4, 0.4–0.8, 0.8–1.2, 1.2–1.6 and 1.6–2.0) and the mean and standard deviation of MLC speeds were calculated for each VMAT plan. The same was carried out for accelerations in cm/s^2 (A_l–h, with l–h = 0–4, 4–8, 8–12, 12–16 and 16–20). The correlations of these indicators with plan delivery accuracy were analysed with Spearman's correlation coefficient (r_s). Results: S_1.2–1.6 and the mean MLC acceleration showed generally higher correlations with plan delivery accuracy than the other indicators. The highest r_s values were observed between S_1.2–1.6 and global 1%/2 mm (r_s = −0.698, p < 0.001) and between mean acceleration and global 1%/2 mm (r_s = −0.650, p < 0.001). As the proportion of MLC speeds >0.4 cm/s and accelerations >4 cm/s^2 increased, the plan delivery accuracy of VMAT decreased. Conclusion: The variations in MLC speeds and accelerations showed considerable correlations with VMAT delivery accuracy. Advances in knowledge: As MLC speeds and accelerations increased, VMAT delivery accuracy decreased. PMID:25734490
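
    The speed-bin metric and its correlation with delivery accuracy can be sketched as follows in Python. Reading leaf position traces out of the vendor log format is not shown, and the per-plan values used in the correlation example are made-up illustrative numbers, not the study's data.

    import numpy as np
    from scipy.stats import spearmanr

    SPEED_BINS = [(0.0, 0.4), (0.4, 0.8), (0.8, 1.2), (1.2, 1.6), (1.6, 2.0)]  # cm/s

    def speed_proportions(positions_cm, timestamps_s):
        """Proportion of MLC leaf speeds in each bin, from one leaf's position trace (cm)
        sampled at the given times (s), plus the mean speed."""
        speeds = np.abs(np.diff(positions_cm) / np.diff(timestamps_s))
        props = [float(np.mean((speeds >= lo) & (speeds < hi))) for lo, hi in SPEED_BINS]
        return dict(zip(SPEED_BINS, props)), speeds.mean()

    # Correlate a per-plan speed metric with a per-plan gamma passing rate (illustrative numbers):
    s_12_16 = [0.05, 0.11, 0.02, 0.08, 0.15, 0.04]
    gamma_1pct_2mm = [92.1, 85.4, 95.0, 88.7, 80.2, 93.6]
    rho, p = spearmanr(s_12_16, gamma_1pct_2mm)
    print(f"Spearman rho = {rho:.3f}, p = {p:.3f}")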

  8. VizieR Online Data Catalog: Bessel (1825) calculation for geodesic measurements (Karney+, 2010)

    NASA Astrophysics Data System (ADS)

    Karney, C. F. F.; Deakin, R. E.

    2010-06-01

    The solution of the geodesic problem for an oblate ellipsoid is developed in terms of series. Tables are provided to simplify the computation. Included here are the tables that accompanied Bessel's paper (with corrections). The tables were crafted by Bessel to minimize the labor of hand calculations. To this end, he adjusted the intervals in the tables, the number of terms included in the series, and the number of significant digits given so that the final results are accurate to about 8 places. For that reason, the most useful form of the tables is as the PDF file which provides the tables in a layout close to the original. Also provided is the LaTeX source file for the PDF file. Finally, the data has been put into a format so that it can be read easily by computer programs. All the logarithms are in base 10 (common logarithms). The characteristic and the mantissa should be read separately (indicated as x.c and x.m in the file description). Thus the first entry in the table, -4.4, should be parsed as "-4" (the characteristic) and ".4" (the mantissa); the anti-log for this entry is 10^(-4+0.4) ≈ 2.5e-4. The "Delta" columns give the first difference of the preceding column, i.e., the difference of the preceding column in the next row and the preceding column in the current row. In the printed tables these are expressed as "units in the last place" and the differences are of the rounded representations in the preceding columns (to minimize interpolation errors). In table1.dat these are given scaled to match the format used for the preceding column, as indicated by the units given for these columns. The unit log(") (in the description within square brackets [arcsec]) means the logarithm of a quantity expressed in arcseconds. (3 data files).
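
    The characteristic/mantissa convention above can be checked with a couple of lines of Python: recombine the two parts and exponentiate, which reproduces the anti-log of the first table entry.

    def antilog(characteristic, mantissa):
        """Recombine a common-logarithm characteristic and mantissa and return the anti-log."""
        return 10 ** (characteristic + mantissa)

    # The first table entry, -4.4, read as characteristic -4 and mantissa .4:
    value = antilog(-4, 0.4)
    print(f"{value:.6g}")  # ~0.000251, i.e., about 2.5e-4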

  9. Grid-wide neuroimaging data federation in the context of the NeuroLOG project

    PubMed Central

    Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem

    2010-01-01

    Grid technologies are appealing to deal with the challenges raised by computational neurosciences and support multi-centric brain studies. However, core grid middleware hardly copes with the complex neuroimaging data representation and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot be simply superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431

  10. Archive of Side Scan Sonar and Swath Bathymetry Data collected during USGS Cruise 10CCT02 Offshore of Petit Bois Island Including Petit Bois Pass, Gulf Islands National Seashore, Mississippi, March 2010

    USGS Publications Warehouse

    Pfeiffer, William R.; Flocks, James G.; DeWitt, Nancy T.; Forde, Arnell S.; Kelso, Kyle; Thompson, Phillip R.; Wiese, Dana S.

    2011-01-01

    In March of 2010, the U.S. Geological Survey (USGS) conducted geophysical surveys offshore of Petit Bois Island, Mississippi, and Dauphin Island, Alabama (fig. 1). These efforts were part of the USGS Gulf of Mexico Science Coordination partnership with the U.S. Army Corps of Engineers (USACE) to assist the Mississippi Coastal Improvements Program (MsCIP) and the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazards Susceptibility Project by mapping the shallow geologic stratigraphic framework of the Mississippi Barrier Island Complex. These geophysical surveys will provide the data necessary for scientists to define, interpret, and provide baseline bathymetry and seafloor habitat for this area and to aid scientists in predicting future geomorphological changes of the islands with respect to climate change, storm impact, and sea-level rise. Furthermore, these data will provide information for barrier island restoration, particularly in Camille Cut, and protection for the historical Fort Massachusetts on Ship Island, Mississippi. For more information please refer to http://ngom.usgs.gov/gomsc/mscip/index.html. This report serves as an archive of the processed swath bathymetry and side scan sonar data (SSS). Data products herein include gridded and interpolated surfaces, seabed backscatter images, and ASCII x,y,z data products for both swath bathymetry and side scan sonar imagery. Additional files include trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, and formal FGDC metadata. Scanned images of the handwritten and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  11. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-04-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  12. An alternative to sneakernet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, S.; Ralstin, S.

    1992-01-01

    Many computer security plans specify that only a small percentage of the data processed will be classified. Thus, the bulk of the data on secure systems must be unclassified. Secure limited access sites operating approved classified computing systems sometimes also have a system ostensibly containing only unclassified files but operating within the secure environment. That system could be networked or otherwise connected to a classified system(s) in order that both be able to use common resources for file storage or computing power. Such a system must operate under the same rules as the secure classified systems. It is in the nature of unclassified files that they either came from, or will eventually migrate to, a non-secure system. Today, unclassified files are exported from systems within the secure environment typically by loading transport media and carrying them to an open system. Import of unclassified files is handled similarly. This media transport process, sometimes referred to as sneaker net, often is manually logged and controlled only by administrative procedures. A comprehensive system for secure bi-directional transfer of unclassified files between secure and open environments has yet to be developed. Any such secure file transport system should be required to meet several stringent criteria. It is the purpose of this document to begin a definition of these criteria.

  13. The medium is NOT the message or Indefinitely long-term file storage at Leeds University

    NASA Technical Reports Server (NTRS)

    Holdsworth, David

    1996-01-01

    Approximately 3 years ago we implemented an archive file storage system which embodies experiences gained over more than 25 years of using and writing file storage systems. It is the third in-house system that we have written, and all three systems have been adopted by other institutions. This paper discusses the requirements for long-term data storage in a university environment, and describes how our present system is designed to meet these requirements indefinitely. Particular emphasis is laid on experiences from past systems, and their influence on current system design. We also look at the influence of the IEEE-MSS standard. We currently have the system operating in five UK universities. The system operates in a multi-server environment, and is currently operational with UNIX (SunOS4, Solaris2, SGI-IRIX, HP-UX), NetWare3 and NetWare4. PCs logged on to NetWare can also archive and recover files that live on their hard disks.

  14. The design and implementation of the HY-1B Product Archive System

    NASA Astrophysics Data System (ADS)

    Liu, Shibin; Liu, Wei; Peng, Hailong

    2010-11-01

    Product Archive System (PAS), as a background system, is the core part of the Product Archive and Distribution System (PADS), which is the center for data management of the Ground Application System of the HY-1B satellite hosted by the National Satellite Ocean Application Service of China. PAS integrates a series of updating methods and technologies, such as a suitable data transmittal mode, flexible configuration files, and log information, in order to give the system several desirable characteristics, such as ease of maintenance, stability, and minimal complexity. This paper describes the seven major components of the PAS (Network Communicator module, File Collector module, File Copy module, Task Collector module, Metadata Extractor module, Product data Archive module, Metadata catalogue import module) and some of the unique features of the system, as well as the technical problems encountered and resolved.

  15. Using Videogame Apps to Assess Gains in Adolescents' Substance Use Knowledge: New Opportunities for Evaluating Intervention Exposure and Content Mastery.

    PubMed

    Montanaro, Erika; Fiellin, Lynn E; Fakhouri, Tamer; Kyriakides, Tassos C; Duncan, Lindsay R

    2015-10-28

    Videogame interventions are becoming increasingly popular as a means to engage people in behavioral interventions; however, strategies for examining data from such interventions have not been developed. The objective of this study was to describe how a technology-based intervention can yield meaningful, objective evidence of intervention exposure within a behavioral intervention. This study demonstrates the analysis of automatic log files, created by software from a videogame intervention, that catalog game play and, as proof of concept, the association of these data with changes in substance use knowledge as documented with standardized assessments. We analyzed 3- and 6-month follow-up data from 166 participants enrolled in a randomized controlled trial evaluating a videogame intervention, PlayForward: Elm City Stories (PlayForward). PlayForward is a videogame developed as a risk reduction and prevention program targeting HIV risk behaviors (substance use and sex) in young minority adolescents. Log files were analyzed to extract the total amount of time spent playing the videogame intervention and the total number of game levels completed and beaten by each player. Completing and beating more of the game levels, and not total game play time, was related to higher substance use knowledge scores at the 3- (P=.001) and 6-month (P=.001) follow-ups. Our findings highlight the potential contributions a videogame intervention can make to the study of health behavior change. Specifically, the use of objective data collected during game play can address challenges in traditional human-delivered behavioral interventions. Clinicaltrials.gov NCT01666496; https://clinicaltrials.gov/ct2/show/NCT01666496 (Archived by WebCite at http://www.webcitation.org/6cV9fxsOg).
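
    A minimal sketch of the kind of log aggregation described (total play time and number of levels beaten per player) is shown below in Python. The JSON record fields and file name are assumptions for illustration; the actual PlayForward log schema is not described in the abstract.

    import json
    from collections import defaultdict

    def summarize_play_logs(paths):
        """Aggregate per-player play time and levels beaten from JSON log files.
        Assumed record fields: player_id, event ('level_start'/'level_beaten'),
        level, duration_s -- the real log schema is not public."""
        totals = defaultdict(lambda: {"play_time_s": 0.0, "levels_beaten": set()})
        for path in paths:
            with open(path, encoding="utf-8") as fh:
                for record in json.load(fh):
                    player = totals[record["player_id"]]
                    player["play_time_s"] += record.get("duration_s", 0.0)
                    if record.get("event") == "level_beaten":
                        player["levels_beaten"].add(record["level"])
        return {pid: {"play_time_s": v["play_time_s"],
                      "levels_beaten": len(v["levels_beaten"])}
                for pid, v in totals.items()}

    print(summarize_play_logs(["playforward_log_001.json"]))  # hypothetical file name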

  16. Using Videogame Apps to Assess Gains in Adolescents’ Substance Use Knowledge: New Opportunities for Evaluating Intervention Exposure and Content Mastery

    PubMed Central

    2015-01-01

    Background Videogame interventions are becoming increasingly popular as a means to engage people in behavioral interventions; however, strategies for examining data from such interventions have not been developed. Objective The objective of this study was to describe how a technology-based intervention can yield meaningful, objective evidence of intervention exposure within a behavioral intervention. This study demonstrates the analysis of automatic log files, created by software from a videogame intervention, that catalog game play and, as proof of concept, the association of these data with changes in substance use knowledge as documented with standardized assessments. Methods We analyzed 3- and 6-month follow-up data from 166 participants enrolled in a randomized controlled trial evaluating a videogame intervention, PlayForward: Elm City Stories (PlayForward). PlayForward is a videogame developed as a risk reduction and prevention program targeting HIV risk behaviors (substance use and sex) in young minority adolescents. Log files were analyzed to extract the total amount of time spent playing the videogame intervention and the total number of game levels completed and beaten by each player. Results Completing and beating more of the game levels, and not total game play time, was related to higher substance use knowledge scores at the 3- (P=.001) and 6-month (P=.001) follow-ups. Conclusions Our findings highlight the potential contributions a videogame intervention can make to the study of health behavior change. Specifically, the use of objective data collected during game play can address challenges in traditional human-delivered behavioral interventions. Trial Registration Clinicaltrials.gov NCT01666496; https://clinicaltrials.gov/ct2/show/NCT01666496 (Archived by WebCite at http://www.webcitation.org/6cV9fxsOg) PMID:26510775

  17. A dosimetric evaluation of the Eclipse AAA algorithm and Millennium 120 MLC for cranial intensity-modulated radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calvo Ortega, Juan Francisco, E-mail: jfcdrr@yahoo.es; Moragues, Sandra; Pozo, Miquel

    2014-07-01

    The aim of this study is to assess the accuracy of a convolution-based algorithm (anisotropic analytical algorithm [AAA]) implemented in the Eclipse planning system for intensity-modulated radiosurgery (IMRS) planning of small cranial targets by using a 5-mm leaf-width multileaf collimator (MLC). Overall, 24 patient-based IMRS plans for cranial lesions of variable size (0.3 to 15.1 cc) were planned (Eclipse, AAA, version 10.0.28) using fixed field-based IMRS produced by a Varian linear accelerator equipped with a 120 MLC (5-mm width on central leaves). Plan accuracy was evaluated according to phantom-based measurements performed with radiochromic film (EBT2, ISP, Wayne, NJ). Film 2D dose distributions were performed with the FilmQA Pro software (version 2011, Ashland, OH) by using the triple-channel dosimetry method. Comparison between computed and measured 2D dose distributions was performed using the gamma method (3%/1 mm). Performance of the MLC was checked by inspection of the DynaLog files created by the linear accelerator during the delivery of each dynamic field. The absolute difference between the calculated and measured isocenter doses for all the IMRS plans was 2.5% ± 2.1%. The gamma evaluation method resulted in high average passing rates of 98.9% ± 1.4% (red channel) and 98.9% ± 1.5% (blue and green channels). DynaLog file analysis revealed a maximum root mean square error of 0.46 mm. According to our results, we conclude that the Eclipse/AAA algorithm provides accurate cranial IMRS dose distributions that may be accurately delivered by a Varian linac equipped with a Millennium 120 MLC.
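
    The root mean square (RMS) leaf-position error quoted above is straightforward to compute once planned and actual leaf positions have been read out of the log files. The Python sketch below shows only that final step; reading the raw DynaLog format, and the synthetic positions used in the example, are assumptions for illustration.

    import numpy as np

    def max_rms_leaf_error(planned_mm, actual_mm):
        """Given planned and actual MLC leaf positions (mm) with shape
        (n_samples, n_leaves), return the worst per-leaf RMS error.
        Reading the raw Varian DynaLog format into these arrays is not shown."""
        err = np.asarray(actual_mm, float) - np.asarray(planned_mm, float)
        rms_per_leaf = np.sqrt((err ** 2).mean(axis=0))
        return rms_per_leaf.max()

    # Synthetic example: 200 samples of 120 leaves with ~0.2 mm random deviations.
    planned = np.zeros((200, 120))
    actual = planned + np.random.normal(0.0, 0.2, planned.shape)
    print(f"max per-leaf RMS error: {max_rms_leaf_error(planned, actual):.2f} mm")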

  18. Case Master

    DTIC Science & Technology

    2009-04-01

    information on user's interests. In that case, the polarity takes the value of zero. Positive polarity examples: Query, Question/Assertion, cut/paste, chat ... Polarity weights: Query (Keywords/Question/Assertion) 1.0 (+1); cut/paste 0.9 (+1); Selection from list 0.8 (+1); Saving/printing 0.7 (+1); Chat 0.6 (+1); Reading doc/Web ... 3. logging all VIG estimates (from UMS and IMS separately) and user snapshots as XML files for post-process analysis. As new InfoPacks come into the

  19. Self-Regulation during E-Learning: Using Behavioural Evidence from Navigation Log Files

    ERIC Educational Resources Information Center

    Jeske, D.; Backhaus, J.; Stamov Roßnagel, C.

    2014-01-01

    The current paper examined the relationship between perceived characteristics of the learning environment in an e-module in relation to test performance among a group of e-learners. Using structural equation modelling, the relationship between these variables is further explored in terms of the proposed double mediation as outlined by Ning and…

  20. Microanalytic Case studies of Individual Participation Patterns in an Asynchronous Online Discussion in an Undergraduate Blended Course

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Perera, Nishan; Hsiao, Ying-Ting; Speer, Jennifer; Marbouti, Farshid

    2012-01-01

    This study presents three case studies of students' participation patterns in an online discussion to address the gap in our current understanding of how "individuals" experience asynchronous learning environments. Cases were constructed via microanalysis of log-file data, post contents, and the evolving discussion structure. The first student was…

  1. Query Classification and Study of University Students' Search Trends

    ERIC Educational Resources Information Center

    Maabreh, Majdi A.; Al-Kabi, Mohammed N.; Alsmadi, Izzat M.

    2012-01-01

    Purpose: This study is an attempt to develop an automatic identification method for Arabic web queries and divide them into several query types using data mining. In addition, it seeks to evaluate the impact of the academic environment on using the internet. Design/methodology/approach: The web log files were collected from one of the higher…

  2. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  3. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...

  4. 76 FR 54835 - Child Labor Regulations, Orders and Statements of Interpretation; Child Labor Violations-Civil...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ....m. in your local time zone, or log onto the Wage and Hour Division's Web site for a nationwide... INFORMATION: I. Electronic Access and Filing Comments Public Participation: This notice of proposed rulemaking is available through the Federal Register and the http://www.regulations.gov Web site. You may also...

  5. Capabilities Report 2012, West Desert Test Center

    DTIC Science & Technology

    2012-03-12

    FT-IR Spectrometer ... electronic system files, paper logs, production batch records, QA/QC data, and PCR data generated during a test. Data analysts also track and QC raw data ... Advantage +SL bench-top freeze dryers achieve shelf temperatures as low as -57°C and condenser temperatures to -67°C. The bulk milling facility produces

  6. 15. Photocopy of photograph (4 x 5 inch reduction of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. Photocopy of photograph (4 x 5 inch reduction of 1939 3-1/4 x 5-1/2 inch print, photographer unknown; in Recreation files, Supervisor's Office, Mt. Baker-Snoqualmie National Forest) GENERAL VIEW, LOOKING SOUTHWEST, SHOWING INTERPRETIVE LOG AND PROTECTION ASSISTANT'S HOUSE IN BACKGROUND. - Glacier Ranger Station, Washington State Route 542, Glacier, Whatcom County, WA

  7. Negotiating the Context of Online In-Service Training: "Expert" and "Non-Expert" Footings

    ERIC Educational Resources Information Center

    Nilsen, Mona

    2010-01-01

    This paper focuses on how people working in the Swedish food production industry engage in in-service training by means of computer-mediated communication. The empirical material consists of archived chat log files from a course concerning quality assurance and food safety hazards control in the preparation and handling of foodstuff. Drawing on…

  8. Digging Deeper into Learners' Experiences in MOOCs: Participation in Social Networks outside of MOOCs, Notetaking and Contexts Surrounding Content Consumption

    ERIC Educational Resources Information Center

    Veletsianos, George; Collier, Amy; Schneider, Emily

    2015-01-01

    Researchers describe with increasing confidence "what" they observe participants doing in massive open online courses (MOOCs). However, our understanding of learner activities in open courses is limited by researchers' extensive dependence on log file analyses and clickstream data to make inferences about learner behaviors. Further, the…

  9. Web-Based Learning Programs: Use by Learners with Various Cognitive Styles

    ERIC Educational Resources Information Center

    Chen, Ling-Hsiu

    2010-01-01

    To consider how Web-based learning program is utilized by learners with different cognitive styles, this study presents a Web-based learning system (WBLS) and analyzes learners' browsing data recorded in the log file to identify how learners' cognitive styles and learning behavior are related. In order to develop an adapted WBLS, this study also…

  10. Developing an Efficient Computational Method that Estimates the Ability of Students in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2012-01-01

    This paper presents a computational method that can efficiently estimate the ability of students from the log files of a Web-based learning environment capturing their problem solving processes. The computational method developed in this study approximates the posterior distribution of the student's ability obtained from the conventional Bayes…

  11. Making Sense of Students' Actions in an Open-Ended Virtual Laboratory Environment

    ERIC Educational Resources Information Center

    Gal, Ya'akov; Uzan, Oriel; Belford, Robert; Karabinos, Michael; Yaron, David

    2015-01-01

    A process for analyzing log files collected from open-ended learning environments is developed and tested on a virtual lab problem involving reaction stoichiometry. The process utilizes a set of visualization tools that, by grouping student actions in a hierarchical manner, helps experts make sense of the linear list of student actions recorded in…

  12. Introduction to the Space Physics Analysis Network (SPAN)

    NASA Technical Reports Server (NTRS)

    Green, J. L. (Editor); Peters, D. J. (Editor)

    1985-01-01

    The Space Physics Analysis Network or SPAN is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication capability with co-investigators and colleagues, and access to space science data bases and computational facilities. SPAN utilizes up-to-date hardware and software for computer-to-computer communications allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is not discipline- or mission-dependent, with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use is provided. It is anticipated that SPAN will grow rapidly over the next few years, not only from the standpoint of more network nodes, but also because, as scientists become more proficient in the use of telescience, more capability will be needed to satisfy the demands.

  13. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  14. Mission Operations Center (MOC) - Precipitation Processing System (PPS) Interface Software System (MPISS)

    NASA Technical Reports Server (NTRS)

    Ferrara, Jeffrey; Calk, William; Atwell, William; Tsui, Tina

    2013-01-01

    MPISS is an automatic file transfer system that implements a combination of standard and mission-unique transfer protocols required by the Global Precipitation Measurement Mission (GPM) Precipitation Processing System (PPS) to control the flow of data between the MOC and the PPS. The primary features of MPISS are file transfers (both with and without PPS specific protocols), logging of file transfer and system events to local files and a standard messaging bus, short term storage of data files to facilitate retransmissions, and generation of file transfer accounting reports. The system includes a graphical user interface (GUI) to control the system, allow manual operations, and to display events in real time. The PPS specific protocols are an enhanced version of those that were developed for the Tropical Rainfall Measuring Mission (TRMM). All file transfers between the MOC and the PPS use the SSH File Transfer Protocol (SFTP). For reports and data files generated within the MOC, no additional protocols are used when transferring files to the PPS. For observatory data files, an additional handshaking protocol of data notices and data receipts is used. MPISS generates and sends to the PPS data notices containing data start and stop times along with a checksum for the file for each observatory data file transmitted. MPISS retrieves the PPS generated data receipts that indicate the success or failure of the PPS to ingest the data file and/or notice. MPISS retransmits the appropriate files as indicated in the receipt when required. MPISS also automatically retrieves files from the PPS. The unique feature of this software is the use of both standard and PPS specific protocols in parallel. The advantage of this capability is that it supports users that require the PPS protocol as well as those that do not require it. The system is highly configurable to accommodate the needs of future users.
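
    The handshaking protocol described above pairs each observatory data file with a data notice carrying data start and stop times and a checksum. As a sketch only (the actual MPISS notice layout and checksum algorithm are not specified in this abstract, so the MD5 digest and key=value text format below are illustrative assumptions), generating such a notice could look like this:

        # Sketch: write a data notice (start/stop times plus a file checksum) for
        # a transferred observatory data file. Format and checksum choice are
        # placeholders, not the MPISS/PPS interface definition.
        import hashlib
        from pathlib import Path

        def write_data_notice(data_file, start_utc, stop_utc, notice_dir):
            digest = hashlib.md5(Path(data_file).read_bytes()).hexdigest()
            notice = (f"FILE={Path(data_file).name}\n"
                      f"DATA_START={start_utc}\n"
                      f"DATA_STOP={stop_utc}\n"
                      f"CHECKSUM={digest}\n")
            notice_path = Path(notice_dir) / (Path(data_file).name + ".notice")
            notice_path.write_text(notice)
            return notice_path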

  15. Assessing spatial uncertainty in reservoir characterization for carbon sequestration planning using public well-log data: A case study

    USGS Publications Warehouse

    Venteris, E.R.; Carter, K.M.

    2009-01-01

    Mapping and characterization of potential geologic reservoirs are key components in planning carbon dioxide (CO2) injection projects. The geometry of target and confining layers is vital to ensure that the injected CO2 remains in a supercritical state and is confined to the target layer. Also, maps of injection volume (porosity) are necessary to estimate sequestration capacity at undrilled locations. Our study uses publicly filed geophysical logs and geostatistical modeling methods to investigate the reliability of spatial prediction for oil and gas plays in the Medina Group (sandstone and shale facies) in northwestern Pennsylvania. Specifically, the modeling focused on two targets: the Grimsby Formation and Whirlpool Sandstone. For each layer, thousands of data points were available to model structure and thickness but only hundreds were available to support volumetric modeling because of the rarity of density-porosity logs in the public records. Geostatistical analysis based on this data resulted in accurate structure models, less accurate isopach models, and inconsistent models of pore volume. Of the two layers studied, only the Whirlpool Sandstone data provided for a useful spatial model of pore volume. Where reliable models for spatial prediction are absent, the best predictor available for unsampled locations is the mean value of the data, and potential sequestration sites should be planned as close as possible to existing wells with volumetric data. ?? 2009. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.

  16. [Errors in laboratory daily practice].

    PubMed

    Larrose, C; Le Carrer, D

    2007-01-01

    Legislation set by the GBEA (Guide de bonne exécution des analyses) requires that, before performing analysis, laboratory directors check both the nature of the samples and the patients' identity. The data processing of requisition forms, which identifies key errors, was established in 2000 and 2002 by the specialized biochemistry laboratory, with the contribution of the reception centre for biological samples. The laboratories follow strict acceptability criteria as a starting point at reception, and then check requisition forms and biological samples. All errors are logged into the laboratory database, and analysis reports are sent to the care unit specifying the problems and the consequences they have on the analysis. The data are then assessed by the laboratory directors to produce monthly or annual statistical reports. These indicate the number of errors, which are then indexed to patient files to reveal the specific problem areas, allowing the laboratory directors to teach the nurses and enable corrective action.

  17. Toward a Real-Time (Day) Dreamcatcher: Sensor-Free Detection of Mind Wandering during Online Reading

    ERIC Educational Resources Information Center

    Mills, Caitlin; D'Mello, Sidney

    2015-01-01

    This paper reports the results from a sensor-free detector of mind wandering during an online reading task. Features consisted of reading behaviors (e.g., reading time) and textual features (e.g., level of difficulty) extracted from self-paced reading log files. Supervised machine learning was applied to two datasets in order to predict if…

  18. Real-Time Population Health Detector

    DTIC Science & Technology

    2004-11-01

    military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA ... via the sequence of one-step-ahead forecast errors from the Kalman recursions: e_t = y_t − Hμ_{t|t−1}. The log-likelihood then follows by treating the ... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files. GDAIS received historic CHCS data from all
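
    The garbled expression above is recognisable as the standard one-step-ahead forecast error of the Kalman filter. In conventional state-space notation (the symbols below follow the textbook convention, not necessarily the report's, so treat them as an assumption), the error and the Gaussian log-likelihood built from it are

        e_t = y_t - H\,\mu_{t\mid t-1},
        \qquad
        \log L = -\tfrac{1}{2}\sum_{t=1}^{T}\Bigl[\log(2\pi F_t) + e_t^{2}/F_t\Bigr],

    where \mu_{t|t-1} is the predicted state mean and F_t is the forecast-error variance produced by the Kalman recursions.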

  19. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    ERIC Educational Resources Information Center

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  20. Some Features of "Alt" Texts Associated with Images in Web Pages

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2006-01-01

    Introduction: This paper extends a series on summaries of Web objects, in this case, the alt attribute of image files. Method: Data were logged from 1894 pages from Yahoo!'s random page service and 4703 pages from the Google directory; an img tag was extracted randomly from each where present; its alt attribute, if any, was recorded; and the…

  1. Sediment data collected in 2013 from the northern Chandeleur Islands, Louisiana

    USGS Publications Warehouse

    Buster, Noreen A.; Kelso, Kyle W.; Bernier, Julie C.; Flocks, James G.; Miselis, Jennifer L.; DeWitt, Nancy T.

    2014-01-01

    This data series serves as an archive of sediment data collected in July 2013 from the Chandeleur Islands sand berm and adjacent barrier-island environments. Data products include descriptive core logs, core photographs and x-radiographs, results of sediment grain-size analyses, sample location maps, and Geographic Information System data files with accompanying formal Federal Geographic Data Committee metadata.

  2. Well construction information, lithologic logs, water level data, and overview of research in Handcart Gulch, Colorado: an alpine watershed affected by metalliferous hydrothermal alteration

    USGS Publications Warehouse

    Caine, Jonathan S.; Manning, Andrew H.; Verplanck, Philip L.; Bove, Dana J.; Kahn, Katherine Gurley; Ge, Shemin

    2006-01-01

    Integrated, multidisciplinary studies of the Handcart Gulch alpine watershed provide a unique opportunity to study and characterize the geology and hydrology of an alpine watershed along the Continental Divide. The study area arose out of the donation of four abandoned, deep mineral exploration boreholes to the U.S. Geological Survey for research purposes by Mineral Systems Inc. These holes were supplemented with nine additional shallow holes drilled by the U.S. Geological Survey along the Handcart Gulch trunk stream. All of the holes were converted into observation wells, and a variety of data and samples were measured and collected from each. This open-file report contains: (1) An overview of the research conducted to date in Handcart Gulch; (2) well location, construction, lithologic log, and water level data from the research boreholes; and (3) a brief synopsis of preliminary results. The primary purpose of this report is to provide a research overview as well as raw data from the boreholes. Interpretation of the data will be reported in future publications. The drill hole data were tabulated into a spreadsheet included with this digital open-file report.

  3. LAS - LAND ANALYSIS SYSTEM, VERSION 5.0

    NASA Technical Reports Server (NTRS)

    Pease, P. B.

    1994-01-01

    The Land Analysis System (LAS) is an image analysis system designed to manipulate and analyze digital data in raster format and provide the user with a wide spectrum of functions and statistical tools for analysis. LAS offers these features under VMS with optional image display capabilities for IVAS and other display devices as well as the X-Windows environment. LAS provides a flexible framework for algorithm development as well as for the processing and analysis of image data. Users may choose between mouse-driven commands or the traditional command line input mode. LAS functions include supervised and unsupervised image classification, film product generation, geometric registration, image repair, radiometric correction and image statistical analysis. Data files accepted by LAS include formats such as Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Advanced Very High Resolution Radiometer (AVHRR). The enhanced geometric registration package now includes both image to image and map to map transformations. The over 200 LAS functions fall into image processing scenario categories which include: arithmetic and logical functions, data transformations, fourier transforms, geometric registration, hard copy output, image restoration, intensity transformation, multispectral and statistical analysis, file transfer, tape profiling and file management among others. Internal improvements to the LAS code have eliminated the VAX VMS dependencies and improved overall system performance. The maximum LAS image size has been increased to 20,000 lines by 20,000 samples with a maximum of 256 bands per image. The catalog management system used in earlier versions of LAS has been replaced by a more streamlined and maintenance-free method of file management. This system is not dependent on VAX/VMS and relies on file naming conventions alone to allow the use of identical LAS file names on different operating systems. While the LAS code has been improved, the original capabilities of the system have been preserved. These include maintaining associated image history, session logging, and batch, asynchronous and interactive mode of operation. The LAS application programs are integrated under version 4.1 of an interface called the Transportable Applications Executive (TAE). TAE 4.1 has four modes of user interaction: menu, direct command, tutor (or help), and dynamic tutor. In addition TAE 4.1 allows the operation of LAS functions using mouse-driven commands under the TAE-Facelift environment provided with TAE 4.1. These modes of operation allow users, from the beginner to the expert, to exercise specific application options. LAS is written in C-language and FORTRAN 77 for use with DEC VAX computers running VMS with approximately 16Mb of physical memory. This program runs under TAE 4.1. Since TAE 4.1 is not a current version of TAE, TAE 4.1 is included within the LAS distribution. Approximately 130,000 blocks (65Mb) of disk storage space are necessary to store the source code and files generated by the installation procedure for LAS and 44,000 blocks (22Mb) of disk storage space are necessary for TAE 4.1 installation. The only other dependencies for LAS are the subroutine libraries for the specific display device(s) that will be used with LAS/DMS (e.g. X-Windows and/or IVAS). The standard distribution medium for LAS is a set of two 9track 6250 BPI magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. 
    This program was developed in 1986 and last updated in 1992.

  4. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). Version 3.5, Quick Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  5. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, B.G.; Richards, R.E.; Reece, W.J.

    1992-10-01

    This Reference Guide contains instructions on how to install and use Version 3.5 of the NRC-sponsored Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR). The NUCLARR data management system is contained in compressed files on the floppy diskettes that accompany this Reference Guide. NUCLARR is comprised of hardware component failure data (HCFD) and human error probability (HEP) data, both of which are available via a user-friendly, menu driven retrieval system. The data may be saved to a file in a format compatible with IRRAS 3.0 and commercially available statistical packages, or used to formulate log-plots and reports of data retrieval and aggregation findings.

  6. Cloud Based Drive Forensic and DDoS Analysis on Seafile as Case Study

    NASA Astrophysics Data System (ADS)

    Bahaweres, R. B.; Santo, N. B.; Ningsih, A. S.

    2017-01-01

    The rapid growth of the Internet, driven by increasing data rates over both broadband cable networks and 4G wireless mobile, makes it easy for everyone to stay connected. Storage as a Service (StaaS) has become popular because many users want to keep their data in one place in the cloud, where it can be accessed anywhere and at any time. That same convenience makes cloud storage attractive for criminal use, including storing, uploading, and downloading illegal files or documents, and exposes the service to Denial of Service (DoS) attacks. In this study, we implement a private cloud storage service using Seafile on a Raspberry Pi and run simulations in Local Area Network and Wi-Fi environments to demonstrate that a criminal act can be traced and proved forensically. We also identify, collect, and analyze server- and client-side artifacts, such as the desktop client's registry entries, the file system, the Seafile logs, the browser cache, and the database.

  7. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.

    Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.
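
    A minimal way to reproduce the pipeline sketched above is to run a first encoding pass and parse the per-frame key:value statistics that the encoder writes to its pass-one log. In the sketch below, the avconv flags and the field names (in, out, type, q, itex, ptex, mv, misc, and so on) reflect common libav/ffmpeg builds and should be treated as assumptions about the installed version rather than a fixed interface.

        # Sketch: run a first encoding pass with avconv and parse the per-frame
        # statistics from the pass-one log. Flags and field names are assumptions
        # about the installed libav/ffmpeg build, not a guaranteed interface.
        import os
        import re
        import subprocess

        def first_pass_stats(frame_pattern, log_prefix="stats"):
            subprocess.run(
                ["avconv", "-y", "-i", frame_pattern, "-c:v", "mpeg2video",
                 "-pass", "1", "-passlogfile", log_prefix, "-f", "null", os.devnull],
                check=True)
            frames = []
            with open(log_prefix + "-0.log") as fh:   # pass log for the first stream
                for line in fh:
                    stats = {}
                    for key, val in re.findall(r"([\w-]+):([^ ;]+)", line):
                        try:
                            stats[key] = float(val)
                        except ValueError:
                            stats[key] = val          # keep non-numeric fields as text
                    if stats:
                        frames.append(stats)
            return frames

        # Example idea: a sudden jump in intra-texture bits ('itex') between
        # consecutive frames can flag a candidate event for closer inspection.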

  8. Perceived Task-Difficulty Recognition from Log-File Information for the Use in Adaptive Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Janning, Ruth; Schatten, Carlotta; Schmidt-Thieme, Lars

    2016-01-01

    Recognising students' emotion, affect or cognition is a relatively young field and still a challenging task in the area of intelligent tutoring systems. There are several ways to use the output of these recognition tasks within the system. The approach most often mentioned in the literature is using it for giving feedback to the students. The…

  9. Scalable Trust of Next-Generation Management (STRONGMAN)

    DTIC Science & Technology

    2004-10-01

    remote logins might be policy controlled to allow only strongly encrypted IPSec tunnels to log in remotely, to access selected files, etc. The ... and Angelos D. Keromytis. Drop-in Security for Distributed and Portable Computing Elements. Emerald Journal of Internet Research. Electronic ... Security and Privacy, pp. 17-31, May 1999. [2] S. M. Bellovin. Distributed Firewalls. ;login: magazine, special issue on security, November 1999. [3] M

  10. 77 FR 66608 - New England Hydropower Company, LLC; Notice of Preliminary Permit Application Accepted for Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-06

    ... Spillway Dike with an 8-foot-long stop-log slot; (2) an existing 31-foot-long, 42-inch-diameter low level penstock; (3) an existing 0.13 acre impoundment with a normal maximum water surface elevation of 66.3 feet... transmission line connected to the NSTAR regional grid. The project would have an estimated average annual...

  11. 76 FR 7838 - Claverack Creek, LLC; Notice of Preliminary Permit Application Accepted for Filing and Soliciting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ...-deep intake canal; (5) new trash racks, head gates, and stop log structure; (6) an existing 6-foot... Internet. See 18 CFR 385.2001(a)(1)(iii) and the instructions on the Commission's Web site http://www.ferc... copy of the application, can be viewed or printed on the ``eLibrary'' link of the Commission's Web site...

  12. Evaluation of an interactive case simulation system in dermatology and venereology for medical students

    PubMed Central

    Wahlgren, Carl-Fredrik; Edelbring, Samuel; Fors, Uno; Hindbeck, Hans; Ståhle, Mona

    2006-01-01

    Background Most of the many computer resources used in clinical teaching of dermatology and venereology for medical undergraduates are information-oriented and focus mostly on finding a "correct" multiple-choice alternative or free-text answer. We wanted to create an interactive computer program, which facilitates not only factual recall but also clinical reasoning. Methods Through continuous interaction with students, a new computerised interactive case simulation system, NUDOV, was developed. It is based on authentic cases and contains images of real patients, actors and healthcare providers. The student selects a patient and proposes questions for medical history, examines the skin, and suggests investigations, diagnosis, differential diagnoses and further management. Feedback is given by comparing the user's own suggestions with those of a specialist. In addition, a log file of the student's actions is recorded. The program includes a large number of images, video clips and Internet links. It was evaluated with a student questionnaire and by randomising medical students to conventional teaching (n = 85) or conventional teaching plus NUDOV (n = 31) and comparing the results of the two groups in a final written examination. Results The questionnaire showed that 90% of the NUDOV students stated that the program facilitated their learning to a large/very large extent, and 71% reported that extensive working with authentic computerised cases made it easier to understand and learn about diseases and their management. The layout, user-friendliness and feedback concept were judged as good/very good by 87%, 97%, and 100%, respectively. Log files revealed that the students, in general, worked with each case for 60–90 min. However, the intervention group did not score significantly better than the control group in the written examination. Conclusion We created a computerised case simulation program allowing students to manage patients in a non-linear format supporting the clinical reasoning process. The student gets feedback through comparison with a specialist, eliminating the need for external scoring or correction. The model also permits discussion of case processing, since all transactions are stored in a log file. The program was highly appreciated by the students, but did not significantly improve their performance in the written final examination. PMID:16907972

  13. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.
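
    The reconstruction step described above sums the contribution of each logged spot. As a simplified sketch (a 2D Gaussian spot model and a single dose-per-particle factor are assumptions made here; the clinical program additionally folds in measured depth dose data), the lateral dose plane could be rebuilt like this:

        # Rough sketch of rebuilding a 2D delivered-dose map from scanning-spot
        # log entries (position, focus size, particle number). The Gaussian spot
        # model and scalar dose-per-particle factor are simplifying assumptions.
        import numpy as np

        def reconstruct_plane(spots, grid_x_mm, grid_y_mm, dose_per_particle=1.0):
            """spots: iterable of dicts with keys x_mm, y_mm, sigma_mm, particles."""
            X, Y = np.meshgrid(grid_x_mm, grid_y_mm)
            dose = np.zeros_like(X, dtype=float)
            for s in spots:
                sig2 = s["sigma_mm"] ** 2
                g = np.exp(-((X - s["x_mm"]) ** 2 + (Y - s["y_mm"]) ** 2) / (2 * sig2))
                dose += dose_per_particle * s["particles"] * g / (2 * np.pi * sig2)
            return dose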

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blackwell, David D.; Chickering Pace, Cathy; Richards, Maria C.

    The National Geothermal Data System (NGDS) is a Department of Energy funded effort to create a single cataloged source for a variety of geothermal information through a distributed network of databases made available via web services. The NGDS will help identify regions suitable for potential development and further scientific data collection and analysis of geothermal resources as a source for clean, renewable energy. A key NGDS repository or ‘node’ is located at Southern Methodist University developed by a consortium made up of: • SMU Geothermal Laboratory • Siemens Corporate Technology, a division of Siemens Corporation • Bureau of Economic Geology at the University of Texas at Austin • Cornell Energy Institute, Cornell University • Geothermal Resources Council • MLKay Technologies • Texas Tech University • University of North Dakota. The focus of resources and research encompass the United States with particular emphasis on the Gulf Coast (on and off shore), the Great Plains, and the Eastern U.S. The data collection includes the thermal, geological and geophysical characteristics of these area resources. Types of data include, but are not limited to, temperature, heat flow, thermal conductivity, radiogenic heat production, porosity, permeability, geological structure, core geophysical logs, well tests, estimated reservoir volume, in situ stress, oil and gas well fluid chemistry, oil and gas well information, and conventional and enhanced geothermal system related resources. Libraries of publications and reports are combined into a unified, accessible, catalog with links for downloading non-copyrighted items. Field notes, individual temperature logs, site maps and related resources are included to increase data collection knowledge. Additional research based on legacy data to improve quality increases our understanding of the local and regional geology and geothermal characteristics. The software to enable the integration, analysis, and dissemination of this team’s NGDS contributions was developed by Siemens Corporate Technology. The SMU Node interactive application is accessible at http://geothermal.smu.edu. Additionally, files may be downloaded from either http://geothermal.smu.edu:9000/geoserver/web/ or through http://geothermal.smu.edu/static/DownloadFilesButtonPage.htm. The Geothermal Resources Council Library is available at https://www.geothermal-library.org/.

  15. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    PubMed

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
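
    Because the archive is just a ZIP container with a manifest.xml at its root, a minimal COMBINE Archive can be assembled with standard tooling. The sketch below uses the commonly published identifiers.org format URIs; for production work a dedicated library such as libCombine is preferable.

        # Minimal sketch of packing one model file into a COMBINE Archive (OMEX):
        # a ZIP whose root manifest.xml lists each entry and its format. Format
        # URIs shown are the commonly used identifiers.org ones; consult the OMEX
        # specification for the authoritative details.
        import zipfile

        MANIFEST_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
          <content location="." format="http://identifiers.org/combine.specifications/omex"/>
          <content location="./manifest.xml"
                   format="http://identifiers.org/combine.specifications/omex-manifest"/>
          <content location="./{model}" format="{model_format}" master="true"/>
        </omexManifest>
        """

        def write_omex(archive_path, model_path,
                       model_format="http://identifiers.org/combine.specifications/sbml"):
            manifest = MANIFEST_TEMPLATE.format(model=model_path, model_format=model_format)
            with zipfile.ZipFile(archive_path, "w") as zf:
                zf.writestr("manifest.xml", manifest)
                zf.write(model_path, arcname=model_path)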

  16. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and other readers were at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  17. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06FSH01 offshore of Siesta Key, Florida, May 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.

    2007-01-01

    In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  18. Archive of digital CHIRP seismic reflection data collected during USGS cruise 06SCC01 offshore of Isles Dernieres, Louisiana, June 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.

    2007-01-01

    In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  19. Aero/fluids database system

    NASA Technical Reports Server (NTRS)

    Reardon, John E.; Violett, Duane L., Jr.

    1991-01-01

    The AFAS Database System was developed to provide the basic structure of a comprehensive database system for the Marshall Space Flight Center (MSFC) Structures and Dynamics Laboratory Aerophysics Division. The system is intended to handle all of the Aerophysics Division Test Facilities as well as data from other sources. The system was written for the DEC VAX family of computers in FORTRAN-77 and utilizes the VMS indexed file system and screen management routines. Various aspects of the system are covered, including a description of the user interface, lists of all code structure elements, descriptions of the file structures, a description of the security system operation, a detailed description of the data retrieval tasks, a description of the session log, and a description of the archival system.

  20. Self-optimizing Monte Carlo method for nuclear well logging simulation

    NASA Astrophysics Data System (ADS)

    Liu, Lianyan

    1997-09-01

    In order to increase the efficiency of Monte Carlo simulation for nuclear well logging problems, a new method has been developed for variance reduction. With this method, an importance map is generated in the regular Monte Carlo calculation as a by-product, and the importance map is later used to conduct the splitting and Russian roulette for particle population control. By adopting a spatial mesh system, which is independent of the physical geometrical configuration, the method allows superior user-friendliness. This new method is incorporated into the general purpose Monte Carlo code MCNP4A through a patch file. Two nuclear well logging problems, a neutron porosity tool and a gamma-ray lithology density tool, are used to test the performance of this new method. The calculations are sped up over analog simulation by 120 and 2600 times, for the neutron porosity tool and for the gamma-ray lithology density log, respectively. The new method performs better than MCNP's cell-based weight window by a factor of 4 to 6, based on the converged figures of merit. An indirect comparison indicates that the new method also outperforms the AVATAR process for gamma-ray density tool problems. Even though it takes quite some time to generate a reasonable importance map from an analog run, a good initial map can create significant CPU time savings. This makes the method especially suitable for nuclear well logging problems, since one or several reference importance maps are usually available for a given tool. The study shows that the spatial mesh sizes should be chosen according to the mean free path. The overhead of the importance map generator is 6% and 14% for the neutron and gamma-ray cases. The learning ability towards a correct importance map is also demonstrated. Although false learning may happen, physical judgement with contributon maps can help diagnose it. Calibration and analysis are performed for the neutron tool and the gamma-ray tool. Because a very good initial importance map is always available after the first point has been calculated, high computing efficiency is maintained. The availability of contributon maps provides an easy way of understanding the logging measurement and analyzing the depth of investigation.
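
    The population-control step driven by the importance map works by splitting particles that enter more important regions and playing Russian roulette on those that enter less important ones, while conserving total statistical weight on average. A generic sketch of that step follows (the particle representation and importance lookup are placeholders, not the MCNP4A patch itself):

        # Illustrative splitting / Russian-roulette step driven by an importance
        # ratio, as used for particle population control. Placeholder data
        # structures; not the MCNP4A implementation.
        import random

        def population_control(particle, importance_here, importance_prev):
            """Return the list of particles surviving a mesh-boundary crossing."""
            ratio = importance_here / importance_prev
            if ratio >= 1.0:
                # More important region: split into ~ratio copies, sharing the weight.
                n = int(ratio)
                if random.random() < ratio - n:
                    n += 1
                w = particle["weight"] / max(n, 1)
                return [dict(particle, weight=w) for _ in range(n)]
            # Less important region: Russian roulette with survival probability = ratio.
            if random.random() < ratio:
                return [dict(particle, weight=particle["weight"] / ratio)]
            return []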

  1. Automated lithology prediction from PGNAA and other geophysical logs.

    PubMed

    Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T

    2006-02-01

    Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural-gamma) and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology. A success rate of 73% for lithology prediction was achieved from PGNAA logging data alone. PGNAA data can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
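
    The kind of statistical discrimination underlying such lithology prediction can be illustrated with a toy nearest-class-mean classifier over standardized log responses. This is only an illustration of the idea, not the procedure implemented in LogTrans.

        # Toy illustration of statistical lithology prediction from geophysical
        # log responses: standardize the features, then assign each sample to the
        # lithology class whose mean response is nearest.
        import numpy as np

        def fit_class_means(features, labels):
            features = np.asarray(features, float)
            labels = np.asarray(labels)
            mu, sd = features.mean(axis=0), features.std(axis=0) + 1e-9
            z = (features - mu) / sd
            means = {c: z[labels == c].mean(axis=0) for c in set(labels.tolist())}
            return {"mu": mu, "sd": sd, "means": means}

        def predict_lithology(model, feature_row):
            z = (np.asarray(feature_row, float) - model["mu"]) / model["sd"]
            return min(model["means"], key=lambda c: np.linalg.norm(z - model["means"][c]))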

  2. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  3. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  4. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  5. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  6. 40 CFR Appendix A to Subpart Bb of... - State Requirements Incorporated by Reference in Subpart BB of Part 147 of the Code of Federal...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Construction-no conflict with board of land commissioners' authority. Section 82-11-105 through 82-11-110... Cuttings. Rule 36.22.1013. Filing of Completion Reports, Well Logs, Analyses, Reports, and Surveys. Rule 36.... Gas Oil Ratio Tests. Rule 36.22.1217. Water Production Report. Rule 36.22.1218. Gas to be Metered...

  7. 105-KE Isolation Barrier Leak Rate Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCracken, K.J.

    1995-06-14

    This Acceptance Test Report (ATR) contains the completed and signed Acceptance Test Procedure (ATP) for the 105-KE Isolation Barrier Leak Rate Test. The Test Engineer's log, the completed sections of the ATP in the Appendix for Repeat Testing (Appendix K), the approved WHC J-7s (Appendix H), the data logger files (Appendices T and U), and the post-test calibration checks (Appendix V) are included.

  8. Gigabit Network Communications Research

    DTIC Science & Technology

    1992-12-31

    additional BPF channels, raw bytesync support for video codecs, and others. All source file modifications were logged with RCS. Source and object trees were ... 34 (RFCs). 20 RFCs were published this quarter: RFC 1366: Gerich, E., "Guidelines for Management of IP Address Space", Merit, October 1992. RFC 1367 ... Topolcic, C., "Schedule for IP Address Space Management Guidelines", CNRI, October 1992. RFC 1368: McMaster, D. (Synoptics Communications, Inc.), K

  9. VizieR Online Data Catalog: Reference Catalogue of Bright Galaxies (RC1; de Vaucouleurs+ 1964)

    NASA Astrophysics Data System (ADS)

    de Vaucouleurs, G.; de Vaucouleurs, A.

    1995-11-01

    The Reference Catalogue of Bright Galaxies lists for each entry the following information: NGC number, IC number, or A number; A, B, or C designation; B1950.0 positions, position at 100 year precession; galactic and supergalactic positions; revised morphological type and source; type and color class in Yerkes list 1 and 2; Hubble-Sandage type; revised Hubble type according to Holmberg; logarithm of mean major diameter (log D) and ratio of major to minor diameter (log R) and their weights; logarithm of major diameter; sources of the diameters; David Dunlap Observatory type and luminosity class; Harvard photographic apparent magnitude; weight of V, B-V(0), U-B(0); integrated magnitude B(0) and its weight in the B system; mean surface brightness in magnitude per square minute of arc and sources for the B magnitude; mean B surface brightness derived from corrected Harvard magnitude; the integrated color index in the standard B-V system; "intrinsic" color index; sources of B-V and/or U-B; integrated color in the standard U-B system; observed radial velocity in km/sec; radial velocity corrected for solar motion in km/sec; sources of radial velocities; solar motion correction; and direct photographic source. The catalog was created by concatenating four files side by side. (1 data file).

  10. A CCD search of short period variable stars in six selected fields. (Italian Title: Ricerca CCD di variabili a breve periodo in sei campi selezionati)

    NASA Astrophysics Data System (ADS)

    Valentini, S.

    2013-12-01

    A search for variable stars was carried out, using new software specifically created by the author, on a series of images acquired at the Astronomical Observatory of Santa Lucia di Stroncone (Terni, Italy) between October 2010 and March 2012. This research, named the Fast Variable Stars Survey (FVSS), arose from the idea of verifying whether the log files produced by the software Astrometrica (H. Raab) could be used as a basis for rapid detection of short-period variable stars. The results showed that the idea is sound: the new software allowed the identification and the correct determination of the period of thirty-two new variable stars in the six stellar fields subjected to analysis.

  11. VizieR Online Data Catalog: Abundance analysis of 9 very metal-poor stars (O'Malley+, 2017)

    NASA Astrophysics Data System (ADS)

    O'Malley, E. M.; McWilliam, A.; Chaboyer, B.; Thompson, I.

    2017-10-01

    We were awarded time on HST to obtain fine guidance sensor (FGS) parallaxes of nine very metal-poor stars with the goal of extending the range of metallicities below at least [Fe/H]=-2.3 dex for stars with well-determined parallaxes. High-resolution spectra of the nine target stars were obtained between 2008 and 2012 using the Magellan Inamori Kyocera Echelle (MIKE) double spectrograph on the 6.5m Magellan II Clay Telescope at Las Campanas Observatory, Chile (R=48000 for the red side and R=55000 for the blue side), and the High-Resolution Echelle Spectrometer (HiRES) on the twin telescopes at the W. M. Keck Observatory (R~70500). A log of the spectroscopic observations along with the HST F606W magnitudes and parallaxes appears in Table 1. (5 data files).

  12. Historical files from Federal government mineral exploration-assistance programs, 1950 to 1974

    USGS Publications Warehouse

    Frank, David G.

    2010-01-01

    Congress enacted the Defense Production Act in 1950 to provide funding and support for the exploration and development of critical mineral resources. From 1950 to 1974, three Department of the Interior agencies carried out this mission. Contracts with mine owners provided financial assistance for mineral exploration on a joint-participation basis. These contracts are documented in more than 5,000 'dockets' now archived online by the U.S. Geological Survey. This archive provides access to unique and difficult to recreate information, such as drill logs, assay results, and underground geologic maps, that is invaluable to land and resource management organizations and the minerals industry. An effort to preserve the data began in 2009, and the entire collection of dockets was electronically scanned. The scanning process used optical character recognition (OCR) when possible, and files were converted into Portable Document Format (.pdf) files, which require Adobe Reader or similar software for viewing. In 2010, the scans were placed online (http://minerals.usgs.gov/dockets/) and are available to download free of charge.

  13. Archive of Digitized Analog Boomer Seismic Reflection Data Collected from the Mississippi-Alabama-Florida Shelf During Cruises Onboard the R/V Kit Jones, June 1990 and July 1991

    USGS Publications Warehouse

    Sanford, Jordan M.; Harrison, Arnell S.; Wiese, Dana S.; Flocks, James G.

    2009-01-01

    In June of 1990 and July of 1991, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the shallow geologic framework of the Mississippi-Alabama-Florida shelf in the northern Gulf of Mexico, from Mississippi Sound to the Florida Panhandle. Work was done onboard the Mississippi Mineral Resources Institute R/V Kit Jones as part of a project to study coastal erosion and offshore sand resources. This report is part of a series to digitally archive the legacy analog data collected from the Mississippi-Alabama SHelf (MASH). The MASH data rescue project is a cooperative effort by the USGS and the Minerals Management Service (MMS). This report serves as an archive of high-resolution scanned Tagged Image File Format (TIFF) and Graphics Interchange Format (GIF) images of the original boomer paper records, navigation files, trackline maps, Geographic Information System (GIS) files, cruise logs, and formal Federal Geographic Data Committee (FGDC) metadata.

  14. Multipurpose Controller with EPICS integration and data logging: BPM application for ESS Bilbao

    NASA Astrophysics Data System (ADS)

    Arredondo, I.; del Campo, M.; Echevarria, P.; Jugo, J.; Etxebarria, V.

    2013-10-01

    This work presents a multipurpose configurable control system that can be integrated into an EPICS control network, with this functionality configured through an XML configuration file. The core of the system is the so-called Hardware Controller, which is in charge of the control hardware management, the set-up and communication with the EPICS network, and the data storage. The reconfigurable nature of the controller is based on a single XML file, allowing any final user to easily modify and adjust the control system to any specific requirement. The selected Java development environment ensures multiplatform operation and great versatility, even with regard to the hardware to be controlled. Specifically, this paper, focused on fast control based on a high-performance FPGA, also describes an application approach for the ESS Bilbao's Beam Position Monitoring system. The implementation of the XML configuration file and the satisfactory performance achieved are presented, as well as a general description of the Multipurpose Controller itself.
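
    The abstract does not reproduce the controller's XML schema, so the following is a minimal, purely hypothetical Python sketch of the idea of driving such a set-up from one XML file: the element and attribute names below are invented for illustration and are not the project's actual format.

```python
import xml.etree.ElementTree as ET

# Hypothetical configuration layout; the real Multipurpose Controller schema is
# not published in this abstract, so all element/attribute names are invented.
EXAMPLE_XML = """
<controller name="bpm-demo">
  <epics prefix="ESSB:BPM1:" update_rate_hz="10"/>
  <hardware type="fpga" host="192.168.0.10" port="5000"/>
  <signal name="position_x" record="ai" unit="mm"/>
  <signal name="position_y" record="ai" unit="mm"/>
</controller>
"""

def load_config(xml_text: str) -> dict:
    """Parse the XML configuration into a plain dictionary."""
    root = ET.fromstring(xml_text)
    epics = root.find("epics")
    return {
        "name": root.get("name"),
        "epics_prefix": epics.get("prefix"),
        "update_rate_hz": float(epics.get("update_rate_hz")),
        "hardware": root.find("hardware").attrib,
        "signals": [s.attrib for s in root.findall("signal")],
    }

if __name__ == "__main__":
    cfg = load_config(EXAMPLE_XML)
    print(cfg["epics_prefix"], [s["name"] for s in cfg["signals"]])
```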

  15. Evaluation of a web-based lifestyle coach designed to maintain a healthy bodyweight.

    PubMed

    Kelders, Saskia M; van Gemert-Pijnen, Julia E W C; Werkman, Andrea; Seydel, Erwin R

    2010-01-01

    We evaluated a web-based intervention, the Healthy Weight Assistant (HWA), which was designed to help people with a healthy bodyweight, or those who are slightly overweight, to achieve and maintain a healthy weight. Four evaluation methods were used: (1) pre- and post-test questionnaires; (2) real time usability-tests; (3) log-file analysis; (4) qualitative analysis of forum posts, email messages and free-text responses in the questionnaires. A total of 703 respondents received access to the HWA. Six weeks after receiving access, 431 respondents completed a second questionnaire. The enthusiastic responses showed that many people were interested in using an interactive online application to support achieving and maintaining a healthy weight. The preliminary results suggest that improvements with respect to motivation may lead to large effects, yet require only small changes in the design of the HWA. Sending automatic tailored reminders may enhance motivation to keep using the application. Motivation to change behaviour may be enhanced by emphasizing goal setting and visualizing progress.

  16. VizieR Online Data Catalog: X-ray sources in Hickson Compact Groups (Tzanavaris+, 2014)

    NASA Astrophysics Data System (ADS)

    Tzanavaris, P.; Gallagher, S. C.; Hornschemeier, A. E.; Fedotov, K.; Eracleous, M.; Brandt, W. N.; Desjardins, T. D.; Charlton, J. C.; Gronwall, C.

    2014-06-01

    By virtue of their selection criteria, Hickson Compact Groups (HCGs) constitute a distinct class among small galaxy agglomerations. The Hickson catalog (Hickson et al. 1992, Cat. VII/213) comprises 92 spectroscopically confirmed nearby compact groups with three or more members with accordant redshifts (i.e., within 1000km/s of the group mean). In this paper we present nine of these groups, for which both archival Chandra X-ray and Swift UVOT ultraviolet data are available. An observation log for the Chandra data is presented in Table 1. An observation log for the Swift UVOT data is presented in Tzanavaris et al. (2010ApJ...716..556T). In addition, note that in the present work we have included UVOT data for HCGs 90 and 92. (3 data files).

  17. SatelliteDL: a Toolkit for Analysis of Heterogeneous Satellite Datasets

    NASA Astrophysics Data System (ADS)

    Galloy, M. D.; Fillmore, D.

    2014-12-01

    SatelliteDL is an IDL toolkit for the analysis of satellite Earth observations from a diverse set of platforms and sensors. The core function of the toolkit is the spatial and temporal alignment of satellite swath and geostationary data. The design features an abstraction layer that allows for easy inclusion of new datasets in a modular way. Our overarching objective is to create utilities that automate the mundane aspects of satellite data analysis, are extensible and maintainable, and do not place limitations on the analysis itself. IDL has a powerful suite of statistical and visualization tools that can be used in conjunction with SatelliteDL. Toward this end we have constructed SatelliteDL to include (1) HTML and LaTeX API document generation, (2) a unit test framework, (3) automatic message and error logs, (4) HTML and LaTeX plot and table generation, and (5) several real world examples with bundled datasets available for download. For ease of use, datasets, variables and optional workflows may be specified in a flexible format configuration file. Configuration statements may specify, for example, a region and date range, and the creation of images, plots and statistical summary tables for a long list of variables. SatelliteDL enforces data provenance; all data should be traceable and reproducible. The output NetCDF file metadata holds a complete history of the original datasets and their transformations, and a method exists to reconstruct a configuration file from this information. Release 0.1.0 distributes with ingest methods for GOES, MODIS, VIIRS and CERES radiance data (L1) as well as select 2D atmosphere products (L2) such as aerosol and cloud (MODIS and VIIRS) and radiant flux (CERES). Future releases will provide ingest methods for ocean and land surface products, gridded and time averaged datasets (L3 Daily, Monthly and Yearly), and support for 3D products such as temperature and water vapor profiles. Emphasis will be on NPP Sensor, Environmental and Climate Data Records as they become available. To obtain SatelliteDL, please visit the project website at http://www.txcorp.com/SatelliteDL

  18. Poster — Thur Eve — 30: 4D VMAT dose calculation methodology to investigate the interplay effect: experimental validation using TrueBeam Developer Mode and Gafchromic film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teke, T; Milette, MP; Huang, V

    2014-08-15

    The interplay effect between tumor motion and radiation beam modulation during a VMAT treatment delivery alters the delivered dose distribution from the planned one. This work presents and validates a method to accurately calculate the dose distribution in 4D, taking into account the tumor motion, the field modulation, and the treatment starting phase. A QUASAR™ respiratory motion phantom was 4D scanned with a motion amplitude of 3 cm and a 3-second period. A static scan was also acquired with the lung insert, and the tumor contained in it, centered. A VMAT plan with a 6XFFF beam was created on the averaged CT and delivered on a Varian TrueBeam, and the trajectory log file was saved. From the trajectory log file, 10 VMAT plans (one for each breathing phase) and a Developer Mode XML file were created. For the 10 VMAT plans, the tumor motion was modeled by moving the isocentre on the static scan; the plans were recalculated and summed in the treatment planning system. In Developer Mode, the tumor motion was simulated by moving the couch dynamically during the treatment. Gafchromic films were placed in the static QUASAR phantom and irradiated using Developer Mode. Different treatment starting phases were investigated (no phase shift, maximum inhalation, and maximum exhalation). Calculated and measured isodose lines and profiles are in very good agreement. For each starting phase, the dose distributions exhibit significant differences but are accurately calculated with the methodology presented in this work.

  19. Archive of single-beam bathymetry data collected during USGS cruise 07CCT01 nearshore of Fort Massachusetts and within Camille Cut, West and East Ship Islands, Gulf Islands National Seashore, Mississippi, July 2007

    USGS Publications Warehouse

    DeWitt, Nancy T.; Flocks, James G.; Reynolds, B.J.; Hansen, Mark

    2012-01-01

    The Gulf Islands National Seashore (GUIS) is composed of a series of barrier islands along the Mississippi - Alabama coastline. Historically these islands have undergone long-term shoreline change. The devastation of Hurricane Katrina in 2005 prompted questions about the stability of the barrier islands and their potential response to future storm impacts. Additionally, there was concern from the National Park Service (NPS) about the preservation of the historical Fort Massachusetts, located on West Ship Island. During the early 1900s, Ship Island was an individual island. In 1969 Hurricane Camille breached Ship Island, widening the cut and splitting it into what is now known as West Ship Island and East Ship Island. In July of 2007, the U.S. Geological Survey (USGS) was able to provide the NPS with a small bathymetric survey of Camille Cut using high-resolution single-beam bathymetry. This provided GUIS with a post-Katrina assessment of the bathymetry in Camille Cut and along the northern shoreline directly in front of Fort Massachusetts. Ultimately, this survey became an initial bathymetry dataset toward a larger USGS effort included in the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility Project (http://ngom.usgs.gov/gomsc/mscip/). This report serves as an archive of the processed single-beam bathymetry. Data products herein include gridded and interpolated digital depth surfaces and x,y,z data products. Additional files include trackline maps, navigation files, geographic information system (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Scanned images of the handwritten FACS logs and digital FACS logs are also provided as PDF files. Refer to the Acronyms page for description of acronyms and abbreviations used in this report or hold the cursor over an acronym for a pop-up explanation. The USGS St. Petersburg Coastal and Marine Science Center assigns a unique identifier to each cruise or field activity. For example, 07CCT01 tells us the data were collected in 2007 for the Coastal Change and Transport (CCT) study and the data were collected during the first (01) field activity for that project in that calendar year. Refer to http://walrus.wr.usgs.gov/infobank/programs/html/definition/activity.html for a detailed description of the method used to assign the field activity ID. Data were collected using a 26-foot (ft) Glacier Bay catamaran. The single-beam transducers were sled mounted on a rail attached between the catamaran hulls. Navigation was acquired using HYPACK, Inc., Hypack version 4.3a.7.1 and differentially corrected using land-based GPS stations. See the digital FACS equipment log for details about the acquisition equipment used. Raw datasets were stored digitally and processed systematically using NovAtel's Waypoint GrafNav version 7.6, SANDS version 3.7, and ESRI ArcGIS version 9.3.1. For more information on processing refer to the Equipment and Processing page.

  20. SU-E-T-67: A Quality Assurance Procedure for VMAT Delivery Technique with Multiple Verification Metric Using TG-119 Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsuta, Y; Kadoya, N; Shimizu, E

    2015-06-15

    Purpose: A successful VMAT plan delivery includes precise modulation of dose rate, gantry rotation, and multi-leaf collimator shapes. The purpose of this research is to construct a routine QA protocol that focuses on the VMAT delivery technique and to obtain a baseline including dose error, fluence distribution, and mechanical accuracy during VMAT. Methods: The mock prostate and head and neck (HN) cases supplied by AAPM were used in this study. VMAT plans were generated in the Monaco TPS according to the TG-119 protocol. Plans were created using 6 MV and 10 MV photon beams for each case. Phantom-based measurement, fluence measurement, and log file analysis were performed. The dose measurement was performed using a 0.6 cc ion chamber located at isocenter. The fluence distributions were acquired using the MapCHECK2 mounted in the MapPHAN. The trajectory log files, which record the inner 20 leaf pairs and gantry angle positions every 0.25 sec, were exported to in-house software developed in MATLAB, and their RMS values were determined. Results: The dose difference is expressed as a ratio of the difference between measured and planned doses. The dose difference for 6 MV was 0.91% and for 10 MV was 0.67%. The fluence gamma passing rate, using criteria of 2%/2 mm with a 50% minimum dose threshold, was 98.8% for 6 MV and 97.5% for 10 MV. The RMS values of the MLC positions for 6 MV and 10 MV were 0.32 mm and 0.37 mm, and of the gantry angle were 0.33 degrees and 0.31 degrees. Conclusion: In this study, a QA protocol to assess VMAT delivery accuracy is constructed, and the results acquired are used as a baseline for VMAT delivery performance verification.
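
    The RMS figures above come down to a root-mean-square of the planned-minus-actual deviations sampled from the trajectory log. A minimal sketch of that calculation in Python, assuming the planned and actual positions have already been extracted from the log into arrays (the proprietary TrueBeam trajectory log format is not parsed here, and the sample values are synthetic):

```python
import numpy as np

def rms_deviation(planned: np.ndarray, actual: np.ndarray) -> float:
    """Root-mean-square deviation between planned and actual positions,
    sampled at a fixed interval (0.25 s in the abstract)."""
    diff = np.asarray(actual, dtype=float) - np.asarray(planned, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

# Toy example: 20 leaf pairs (40 leaves) x 1000 samples of positions in mm,
# plus planned/actual gantry angles in degrees. All values are synthetic.
rng = np.random.default_rng(0)
planned_mlc = rng.uniform(-50, 50, size=(40, 1000))
actual_mlc = planned_mlc + rng.normal(0, 0.3, size=planned_mlc.shape)
planned_gantry = np.linspace(180, 540, 1000) % 360
actual_gantry = planned_gantry + rng.normal(0, 0.3, size=1000)

print("MLC RMS (mm):    ", rms_deviation(planned_mlc, actual_mlc))
print("Gantry RMS (deg):", rms_deviation(planned_gantry, actual_gantry))
```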

  1. Dose calculation and verification of the Vero gimbal tracking treatment delivery

    NASA Astrophysics Data System (ADS)

    Prasetio, H.; Wölfelschneider, J.; Ziegler, M.; Serpa, M.; Witulla, B.; Bert, C.

    2018-02-01

    The Vero linear accelerator delivers dynamic tumor tracking (DTT) treatment using a gimbal motion. However, the availability of treatment planning systems (TPS) to simulate DTT is limited. This study aims to implement and verify the gimbal tracking beam geometry in the dose calculation. Gimbal tracking was implemented by rotating the reference CT outside the TPS according to the ring, gantry, and gimbal tracking position obtained from the tracking log file. The dose was calculated using these rotated CTs. The geometric accuracy was verified by comparing calculated and measured film response using a ball bearing phantom. The dose was verified by comparing calculated 2D dose distributions and film measurements in a ball bearing and a homogeneous phantom using a gamma criterion of 2%/2 mm. The effect of implementing the gimbal tracking beam geometry in a 3D patient data dose calculation was evaluated using dose volume histograms (DVH). Geometrically, the gimbal tracking implementation accuracy was  <0.94 mm. The isodose lines agreed with the film measurement. The largest dose difference of 9.4% was observed at maximum tilt positions with an isocenter and target separation of 17.51 mm. Dosimetrically, gamma passing rates were  >98.4%. The introduction of the gimbal tracking beam geometry in the dose calculation shifted the DVH curves by 0.05%-1.26% for the phantom geometry and by 5.59% for the patient CT dataset. This study successfully demonstrates a method to incorporate the gimbal tracking beam geometry into dose calculations. By combining CT rotation and MU distribution according to the log file, the TPS was able to simulate the Vero tracking treatment dose delivery. The DVH analysis from the gimbal tracking dose calculation revealed changes in the dose distribution during gimbal DTT that are not visible with static dose calculations.

  2. Co-PylotDB - A Python-Based Single-Window User Interface for Transmitting Information to a Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel W.

    2012-01-05

    Co-PylotDB, written completely in Python, provides a user interface (UI) with which to select user and data file(s), directories, and file content, and provide or capture various other information for sending data collected from running any computer program to a pre-formatted database table for persistent storage. The interface allows the user to select input, output, make, source, executable, and qsub files. It also provides fields for specifying the machine name on which the software was run, capturing compile and execution lines, and listing relevant user comments. Data automatically captured by Co-PylotDB and sent to the database are user, current directory, local hostname, current date, and time of send. The UI provides fields for logging into a local or remote database server, specifying a database and a table, and sending the information to the selected database table. If a server is not available, the UI provides for saving the command that would have saved the information to a database table for either later submission or for sending via email to a collaborator who has access to the desired database.
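
    As a schematic sketch of the automatic capture the abstract describes (user, working directory, hostname, date and time recorded alongside a comment), the Python snippet below uses SQLite and an invented table layout as stand-ins; the actual Co-PylotDB tool targets a local or remote database server and its schema is not reproduced here.

```python
import getpass
import os
import socket
import sqlite3
from datetime import datetime, timezone

def capture_record(comment: str) -> dict:
    """Collect the automatically captured fields described in the abstract."""
    return {
        "user": getpass.getuser(),
        "cwd": os.getcwd(),
        "hostname": socket.gethostname(),
        "sent_at": datetime.now(timezone.utc).isoformat(),
        "comment": comment,
    }

def send_to_table(db_path: str, record: dict) -> None:
    """Insert one record into a hypothetical run_data table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS run_data "
            "(user TEXT, cwd TEXT, hostname TEXT, sent_at TEXT, comment TEXT)"
        )
        conn.execute(
            "INSERT INTO run_data VALUES (:user, :cwd, :hostname, :sent_at, :comment)",
            record,
        )

if __name__ == "__main__":
    send_to_table("runs.db", capture_record("demo run of solver v1.2"))
```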

  3. Designing a data portal for synthesis modeling

    NASA Astrophysics Data System (ADS)

    Holmes, M. A.

    2006-12-01

    Processing of field and model data in multi-disciplinary integrated science studies is a vital part of synthesis modeling. Collection and storage techniques for field data vary greatly between the participating scientific disciplines due to the nature of the data being collected, whether it be in situ, remotely sensed, or recorded by automated data logging equipment. Spreadsheets, personal databases, text files and binary files are used in the initial storage and processing of the raw data. In order to be useful to scientists, engineers and modelers the data need to be stored in a format that is easily identifiable, accessible and transparent to a variety of computing environments. The Model Operations and Synthesis (MOAS) database and associated web portal were created to provide such capabilities. The industry standard relational database is comprised of spatial and temporal data tables, shape files and supporting metadata accessible over the network, through a menu driven web-based portal or spatially accessible through ArcSDE connections from the user's local GIS desktop software. A separate server provides public access to spatial data and model output in the form of attributed shape files through an ArcIMS web-based graphical user interface.

  4. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
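
    The original generator is a PERL script; the sketch below re-expresses the same idea in Python under the assumptions stated in the comments: walk a directory tree to a given number of subtree levels and write one row per file (ownership, size, creation/modification/access times) to a delimited text file that a spreadsheet program can ingest.

```python
import csv
import os
from datetime import datetime, timezone

def _iso(timestamp: float) -> str:
    return datetime.fromtimestamp(timestamp, timezone.utc).isoformat()

def collect_file_info(root: str, max_depth: int, out_path: str) -> None:
    """Walk `root` down to `max_depth` subdirectory levels and write one CSV
    row per file; directories themselves are skipped, as in the abstract."""
    root = os.path.abspath(root)
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "owner_uid", "size_bytes",
                         "created", "modified", "accessed"])
        for dirpath, dirnames, filenames in os.walk(root):
            rel = os.path.relpath(dirpath, root)
            depth = 0 if rel == "." else rel.count(os.sep) + 1
            if depth >= max_depth:
                dirnames[:] = []          # stop descending below this level
            for name in filenames:
                full = os.path.join(dirpath, name)
                st = os.stat(full)
                writer.writerow([full, st.st_uid, st.st_size,
                                 _iso(st.st_ctime), _iso(st.st_mtime),
                                 _iso(st.st_atime)])

if __name__ == "__main__":
    collect_file_info("/var/log", max_depth=2, out_path="tree_report.csv")
```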

  5. Separate-channel analysis of two-channel microarrays: recovering inter-spot information.

    PubMed

    Smyth, Gordon K; Altman, Naomi S

    2013-05-26

    Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intraspot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
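
    For readers unfamiliar with the notation, the M-values and A-values referred to above are the standard log-ratio and log-average transformation of the two channel intensities, written here for the red (R) and green (G) intensities at one spot:

```latex
M = \log_2 R - \log_2 G = \log_2\!\left(\frac{R}{G}\right),
\qquad
A = \tfrac{1}{2}\left(\log_2 R + \log_2 G\right)
```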

  6. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to analysis of, for example, log files, or to monitoring executing systems online.
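
    TRACECONTRACT itself is a Scala DSL and its API is not shown in the abstract; as a language-neutral illustration of the kind of trace analysis it performs, the hedged Python sketch below checks one invented flight-rule-style property over an event log: every command event must eventually be followed by a matching acknowledgement.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str      # e.g. "CMD" or "ACK"
    name: str      # command name

def check_command_acknowledged(trace):
    """Return the set of commands that were issued but never acknowledged.
    This is a toy property, not an actual NASA flight rule."""
    pending = set()
    for ev in trace:
        if ev.kind == "CMD":
            pending.add(ev.name)
        elif ev.kind == "ACK":
            pending.discard(ev.name)
    return pending

if __name__ == "__main__":
    log = [Event("CMD", "open_valve"), Event("ACK", "open_valve"),
           Event("CMD", "start_heater")]
    violations = check_command_acknowledged(log)
    print("unacknowledged commands:", violations or "none")
```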

  7. Reservoir characterization using core, well log, and seismic data and intelligent software

    NASA Astrophysics Data System (ADS)

    Soto Becerra, Rodolfo

    We have developed intelligent software, Oilfield Intelligence (OI), as an engineering tool to improve the characterization of oil and gas reservoirs. OI integrates neural networks and multivariate statistical analysis. It is composed of five main subsystems: data input, preprocessing, architecture design, graphics design, and inference engine modules. More than 1,200 lines of programming code have been written as M-files in MATLAB. The degree of success of many oil and gas drilling, completion, and production activities depends upon the accuracy of the models used in a reservoir description. Neural networks have been applied for identification of nonlinear systems in almost all scientific fields. Solving reservoir characterization problems is no exception. Neural networks have a number of attractive features that can help to extract and recognize underlying patterns, structures, and relationships among data. However, before developing a neural network model, we must solve the problem of dimensionality, such as determining dominant and irrelevant variables. We can apply principal components and factor analysis to reduce the dimensionality and help the neural networks formulate more realistic models. We validated OI by obtaining confident models in three different oil field problems: (1) A neural network in-situ stress model using lithology and gamma ray logs for the Travis Peak formation of east Texas, (2) A neural network permeability model using porosity and gamma ray and a neural network pseudo-gamma ray log model using 3D seismic attributes for the reservoir VLE 196 Lamar field located in Block V of south-central Lake Maracaibo (Venezuela), and (3) Neural network primary ultimate oil recovery (PRUR), initial waterflooding ultimate oil recovery (IWUR), and infill drilling ultimate oil recovery (IDUR) models using reservoir parameters for San Andres and Clearfork carbonate formations in west Texas. In all cases, we compared the results from the neural network models with the results from statistical regression and non-parametric models. The results show that it is possible to obtain the highest cross-correlation coefficient between predicted and actual target variables, and the lowest average absolute errors, using the integrated techniques of multivariate statistical analysis and neural networks in our intelligent software.

  8. Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse

    NASA Technical Reports Server (NTRS)

    Gannod, Gerald C.

    2002-01-01

    This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.

  9. Coastal single-beam bathymetry data collected in 2015 from Raccoon Point to Point Au Fer Island, Louisiana

    USGS Publications Warehouse

    Stalk, Chelsea A.; DeWitt, Nancy T.; Kindinger, Jack L.; Flocks, James G.; Reynolds, Billy J.; Kelso, Kyle W.; Fredericks, Joseph J.; Tuten, Thomas M.

    2017-03-10

    As part of the Barrier Island Comprehensive Monitoring Program (BICM), scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center conducted a nearshore single-beam bathymetry survey along the south-central coast of Louisiana, from Raccoon Point to Point Au Fer Island, in July 2015. The goal of the BICM program is to provide long-term data on Louisiana’s coastline and use this data to plan, design, evaluate, and maintain current and future barrier island restoration projects. The data described in this report will provide baseline bathymetric information for future research investigating island evolution, sediment transport, and recent and long-term geomorphic change, and will support modeling of future changes in response to restoration and storm impacts. The survey area encompasses more than 300 square kilometers of nearshore environment from Raccoon Point to Point Au Fer Island. This data series serves as an archive of processed single-beam bathymetry data, collected from July 22–29, 2015, under USGS Field Activity Number 2015-320-FA. Geographic information system data products include a 200-meter-cell-size interpolated bathymetry grid, trackline maps, and point data files. Additional files include error analysis maps, Field Activity Collection System logs, and formal Federal Geographic Data Committee metadata.

  10. Transaction aware tape-infrastructure monitoring

    NASA Astrophysics Data System (ADS)

    Nikolaidis, Fotios; Kruse, Daniele Francesco

    2014-06-01

    Administering a large-scale, multi-protocol, hierarchical tape infrastructure like the CERN Advanced STORage manager (CASTOR)[2], which now stores 100 PB (growing by roughly 25 PB per year), requires an adequate monitoring system for quick spotting of malfunctions, easier debugging and on demand report generation. The main challenges for such a system are: coping with CASTOR's log format diversity and its information scattered among several log files, the need for long-term information archival, the strict reliability requirements, and the group-based GUI visualization. For this purpose, we have designed, developed and deployed a centralized system consisting of four independent layers: the Log Transfer layer for collecting log lines from all tape servers to a single aggregation server, the Data Mining layer for combining log data into transaction context, the Storage layer for archiving the resulting transactions and finally the Web UI layer for accessing the information. Having flexibility, extensibility and maintainability in mind, each layer is designed to work as a message broker for the next layer, providing a clean and generic interface while ensuring consistency, redundancy and ultimately fault tolerance. This system unifies information previously dispersed over several monitoring tools into a single user interface, using Splunk, which also allows us to provide information visualization based on access control lists (ACL). Since its deployment, it has been successfully used by CASTOR tape operators for quick overview of transactions, performance evaluation, malfunction detection, and by managers for report generation.
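
    The Data Mining layer described above combines log lines scattered across hosts and files into per-transaction context. The CASTOR log formats are not reproduced in the abstract, so the sketch below uses an invented line layout (timestamp, host, transaction id, message) purely to illustrate the grouping step.

```python
import re
from collections import defaultdict

# Invented log line format: "<iso timestamp> <host> txn=<id> <message>"
LINE_RE = re.compile(r"^(?P<ts>\S+)\s+(?P<host>\S+)\s+txn=(?P<txn>\S+)\s+(?P<msg>.*)$")

def group_by_transaction(lines):
    """Collect all log lines that share a transaction id, across hosts/files."""
    transactions = defaultdict(list)
    for line in lines:
        m = LINE_RE.match(line.strip())
        if m:
            transactions[m.group("txn")].append(
                (m.group("ts"), m.group("host"), m.group("msg")))
    return transactions

if __name__ == "__main__":
    sample = [
        "2014-06-01T10:00:00 tape01 txn=42 mount requested",
        "2014-06-01T10:00:07 tape01 txn=42 mount completed",
        "2014-06-01T10:00:09 disk03 txn=42 transfer started",
    ]
    for txn, events in group_by_transaction(sample).items():
        print(txn, "->", len(events), "events")
```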

  11. WE-D-204-06: An Open Source ImageJ CatPhan Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, G

    2015-06-15

    Purpose: The CatPhan is a popular QA device for assessing CT image quality. There are a number of software options which perform analysis of the CatPhan. However, there is often little ability for the user to adjust the analysis if it isn’t running properly, and these are all expensive options. An open source tool is an effective solution. Methods: To use the software, the user imports the CT as an image sequence in ImageJ. The user then scrolls to the slice with the lateral dots. The user then runs the plugin. If tolerance constraints are not already created, the user is prompted to enter them or to use generic tolerances. Upon completion of the analysis, the plugin calls pdfLaTex to compile the pdf report. There is a csv version of the report as well. A log of the results from all CatPhan scans is kept as a csv file. The user can use this to baseline the machine. Results: The tool is capable of detecting the orientation of the phantom. If the CatPhan was scanned backwards, one can simply flip the stack of images horizontally and proceed with the analysis. The analysis includes Sensitometry (estimating the effective beam energy), HU values and linearity, Low Contrast Visibility (using LDPE & Polystyrene), Contrast Scale, Geometric Accuracy, Slice Thickness Accuracy, Spatial resolution (giving the MTF using the line pairs as well as the point spread function), CNR, Low Contrast Detectability (including the raw data), Uniformity (including the Cupping Effect). Conclusion: This is a robust tool that analyzes more components of the CatPhan than other software options (with the exception of ImageOwl). It produces an elegant pdf and keeps a log of analyses for long-term tracking of the system. Because it is open source, users are able to customize any component of it.

  12. New method for calculating a mathematical expression for streamflow recession

    USGS Publications Warehouse

    Rutledge, Albert T.

    1991-01-01

    An empirical method has been devised to calculate the master recession curve, which is a mathematical expression for streamflow recession during times of negligible direct runoff. The method is based on the assumption that the storage-delay factor, which is the time per log cycle of streamflow recession, varies linearly with the logarithm of streamflow. The resulting master recession curve can be nonlinear. The method can be executed by a computer program that reads a data file of daily mean streamflow, then allows the user to select several near-linear segments of streamflow recession. The storage-delay factor for each segment is one of the coefficients of the equation that results from linear least-squares regression. Using results for each recession segment, a mathematical expression of the storage-delay factor as a function of the log of streamflow is determined by linear least-squares regression. The master recession curve, which is a second-order polynomial expression for time as a function of log of streamflow, is then derived using the coefficients of this function.
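
    The procedure lends itself to a compact numerical sketch. Assuming the user has already selected several near-linear recession segments (each an array of daily mean streamflows, one value per day), the steps are: fit each segment to obtain its storage-delay factor K (days per log cycle of decline), regress K against log Q, and integrate to obtain the second-order polynomial for time as a function of log Q. The Python/NumPy sketch below follows that outline on synthetic data; it is illustrative only, not the program described in the report.

```python
import numpy as np

def storage_delay(segment_q):
    """Fit one near-linear recession segment.
    segment_q: daily mean streamflow values (one per day), declining.
    Returns (K, mean_logq): days per log cycle of decline, and the mean
    log10 streamflow of the segment."""
    t = np.arange(len(segment_q), dtype=float)        # time in days
    logq = np.log10(segment_q)
    slope, _ = np.polyfit(t, logq, 1)                 # log cycles per day (negative)
    return -1.0 / slope, logq.mean()

def master_recession_curve(segments):
    """Build the master recession curve T(logQ) as a 2nd-order polynomial,
    assuming K varies linearly with log Q as in the abstract."""
    K, meanlogq = zip(*(storage_delay(s) for s in segments))
    b, a = np.polyfit(meanlogq, K, 1)                 # K(logQ) = a + b*logQ
    # dT/d(logQ) = -K(logQ)  =>  T(logQ) = -(a*logQ + 0.5*b*logQ**2) + const
    return np.poly1d([-0.5 * b, -a, 0.0])             # constant arbitrarily set to 0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three synthetic recession segments with different decay time scales.
    segs = [100 * np.exp(-np.arange(20) / tau) * (1 + rng.normal(0, 0.01, 20))
            for tau in (30.0, 45.0, 60.0)]
    print(master_recession_curve(segs))  # time (days) as a function of log10 Q
```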

  13. Archive of digital boomer and CHIRP seismic reflection data collected during USGS cruise 06FSH03 offshore of Fort Lauderdale, Florida, September 2006

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.

    2007-01-01

    In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  14. Comparative evaluation of calcium hypochlorite and sodium hypochlorite associated with passive ultrasonic irrigation on antimicrobial activity of a root canal system infected with Enterococcus faecalis: an in vitro study.

    PubMed

    de Almeida, Ana Paula; Souza, Matheus Albino; Miyagaki, Daniela Cristina; Dal Bello, Yuri; Cecchin, Doglas; Farina, Ana Paula

    2014-12-01

    The purpose of this study was to compare in vitro the effectiveness of calcium hypochlorite (Ca[OCl]2) and sodium hypochlorite (NaOCl) associated with passive ultrasonic irrigation in root canals of bovine teeth infected with Enterococcus faecalis. The root canals of 60 single-rooted bovine extracted teeth were enlarged up to a size 45 file, autoclaved, inoculated with Enterococcus faecalis, and incubated for 30 days. The samples were divided into 6 groups (n = 10) according to the protocol for decontamination: G1: no treatment; G2: distilled water; G3: 2.5% NaOCl; G4: 2.5% Ca(OCl)2; G5: 2.5% NaOCl with ultrasonic activation; and G6: 2.5% Ca(OCl)2 with ultrasonic activation (US). Microbiological testing (colony-forming unit [CFU] counting) was performed to evaluate the effectiveness of the proposed treatments. Data were subjected to 1-way analysis of variance followed by the post hoc Tukey test (α = 0.05). Groups 1 and 2 showed the highest mean contamination (3.26 log10 CFU/mL and 2.69 log10 CFU/mL, respectively), which was statistically different from all other groups (P < .05). Group 6 (Ca[OCl]2 + US) showed the lowest mean contamination (1.00 log10 CFU/mL), with no statistically significant difference found relative to groups 3 (NaOCl), 4 (Ca[OCl]2), and 5 (NaOCl + US) (P < .05). Ca(OCl)2 as well as passive ultrasonic irrigation can aid in chemomechanical preparation, contributing in a significant way to the reduction of microbial content during root canal treatment. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  15. Logging Student Learning via a Puerto Rico-based Geologic Mapping Game on the Google Earth Virtual Globe

    NASA Astrophysics Data System (ADS)

    Gobert, J.; Toto, E.; Wild, S. C.; Dordevic, M. M.; De Paor, D. G.

    2013-12-01

    A hindrance to migrating undergraduate geoscience courses online is the challenge of giving students a quasi-authentic field experience. As part of an NSF TUES Type 2 project (# NSF-DUE 1022755), we addressed this challenge by designing a Google Earth (GE) mapping game centered on Puerto Rico, a place we chose in order to connect with underrepresented minorities but also because its simple geologic divisions minimized map complexity. The game invites student groups to explore the island and draw a geological map with these divisions: Rugged Volcanic Terrain, Limestone Karst Topography, and Surficial Sands & Gravels. Students, represented as avatars via COLLADA models and the GE browser plugin, can move about, text fellow students, and click a 'drill here' button that tells them what lies underground. They need to learn to read the topography because the number of holes they can drill is limited to 30. Then, using the GE Polygon tool, they create a map, aided by a custom 'snapping' algorithm that stitches adjacent contacts, preventing gaps and overlaps, and they submit this map for evaluation by their instructor, an evaluation we purposefully did not automate. Initially we assigned students to groups of 4 and gave each group a field vehicle avatar with a designated driver; however, students hated the experience unless they were the designated driver, so we revised the game to allow all students to roam independently, while retaining the mutual texting feature amongst students in groups. We implemented the activity with undergraduates from a university in the southeastern USA. All student movements and actions on the GE terrain were logged. We wrote algorithms to evaluate student learning processes via log files, including, but not limited to, the number of places drilled and their locations. Pre-post gains were examined, as well as correlations between data from log files and pre-post data. There was a small but statistically significant post-pre gain, including a positive correlation between diagram-based post-test questions and: 1) the total number of drills; 2) the number of correct within-polygon identifications (evidently those who did more drilling inside polygons, and drew boundaries accordingly, learned more; drills mistakenly plotted outside formation polygons were negatively correlated with extra post-test questions, but this was not statistically significant, likely due to low statistical power because there were few students who did this); and 3) the average distance between drills (students whose drill holes were farther apart learned more; this makes sense since more information can be gleaned this way, and it may also be indicative of a skilled learning strategy because there is little point in doing close/overlapping drills when the permitted number is small and the region is large). No significant correlation between pre-test score and diagram-based post-test questions was found; this suggests that prior knowledge does not account for the above correlations. Data will be discussed with respect to GE's utility for conveying geoscience principles to geology undergraduates, as well as the affordances of analyzing students' log files in order to better understand their learning processes.

  16. Geologic Map of Prescott National Forest and the Headwaters of the Verde River, Yavapai and Coconino Counties, Arizona

    USGS Publications Warehouse

    DeWitt, Ed; Langenheim, V.E.; Force, Eric; Vance, R.K.; Lindberg, P.A.; Driscoll, R.L.

    2008-01-01

    This 1:100,000-scale digital geologic map details the complex Early Proterozoic metavolcanic and plutonic basement of north-central Arizona; shows the mildly deformed cover of Paleozoic rocks; reveals where Laramide to mid-Tertiary plutonic rocks associated with base- and precious-metals deposits are exposed; subdivides the Tertiary volcanic rocks according to chemically named units; and maps the Pliocene to Miocene fill of major basins. Associated digital files include more than 1,300 geochemical analyses of all rock units; 1,750 logs of water wells deeper than 300 feet; and interpreted logs of 300 wells that define the depth to basement in major basins. Geophysically interpreted buried features include normal faults defining previously unknown basins, mid-Tertiary intrusive rocks, and half-grabens within shallow basins.

  17. SU-E-T-479: IMRT Plan Recalculation in Patient Based On Dynalog Data and the Effect of a Single Failing MLC Motor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morcos, M; Mitrou, E

    2015-06-15

    Purpose: Using Linac dynamic logs (Dynalogs) we evaluate the impact of a single failing MLC motor on the deliverability of an IMRT plan by assessing the recalculated dose volume histograms (DVHs) taking the delivered MLC positions and beam hold-offs into consideration. Methods: This is a retrospective study based on a deteriorating MLC motor (leaf 36B) which was observed to be failing via Dynalog analysis. To investigate further, Eclipse-importable MLC files were generated from Dynalogs to recalculate the actual delivered dose and to assess the clinical impact through DVHs. All deliveries were performed on a Varian 21EX linear accelerator equipped with a Millennium-120 MLC. The analysis of Dynalog files and subsequent conversion to Eclipse-importable MLC files were all performed by in-house programming in Python. Effects on the plan DVH are presented in the following section for a particular brain-IMRT plan that was delivered with a failing MLC motor, which was then replaced. Results: Global max dose increased by 13.5%, max dose to the brainstem PRV increased by 8.2%, max dose to the optic chiasm increased by 7.6%, max dose to the optic nerve increased by 8.8%, and the mean dose to the PTV increased by 7.9% when comparing the original plan to the fraction with the failing MLC motor. The dose increased because the failure was on the B-bank, which is the lagging side in a sliding window delivery; any failure on this side will cause over-irradiation as the B-bank leaves struggle to keep the window from growing. Conclusion: Our findings suggest that a single failing MLC motor may jeopardize the entire delivery. This may be due to the bad MLC motor drawing too much current, causing all MLCs on the same bank to underperform. This hypothesis will be investigated in a future study.

  18. Thresholds of logging intensity to maintain tropical forest biodiversity.

    PubMed

    Burivalova, Zuzana; Sekercioğlu, Cağan Hakkı; Koh, Lian Pin

    2014-08-18

    Primary tropical forests are lost at an alarming rate, and much of the remaining forest is being degraded by selective logging. Yet, the impacts of logging on biodiversity remain poorly understood, in part due to the seemingly conflicting findings of case studies: about as many studies have reported increases in biodiversity after selective logging as have reported decreases. Consequently, meta-analytical studies that treat selective logging as a uniform land use tend to conclude that logging has negligible effects on biodiversity. However, selectively logged forests might not all be the same. Through a pantropical meta-analysis and using an information-theoretic approach, we compared and tested alternative hypotheses for key predictors of the richness of tropical forest fauna in logged forest. We found that the species richness of invertebrates, amphibians, and mammals decreases as logging intensity increases and that this effect varies with taxonomic group and continental location. In particular, mammals and amphibians would suffer a halving of species richness at logging intensities of 38 m(3) ha(-1) and 63 m(3) ha(-1), respectively. Birds exhibit an opposing trend as their total species richness increases with logging intensity. An analysis of forest bird species, however, suggests that this pattern is largely due to an influx of habitat generalists into heavily logged areas while forest specialist species decline. Our study provides a quantitative analysis of the nuanced responses of species along a gradient of logging intensity, which could help inform evidence-based sustainable logging practices from the perspective of biodiversity conservation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. SU-F-T-465: Two Years of Radiotherapy Treatments Analyzed Through MLC Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Defoor, D; Kabat, C; Papanikolaou, N

    Purpose: To present treatment statistics of a Varian Novalis Tx using more than 90,000 Varian Dynalog files collected over the past 2 years. Methods: Varian Dynalog files are recorded for every patient treated on our Varian Novalis Tx. The files are collected and analyzed daily to check interfraction agreement of treatment deliveries. This is accomplished by creating fluence maps from the data contained in the Dynalog files. From the Dynalog files we have also compiled statistics for treatment delivery times, MLC errors, gantry errors and collimator errors. Results: The mean treatment time for VMAT patients was 153 ± 86 seconds, while the mean treatment time for step & shoot was 256 ± 149 seconds. Patients’ treatment times showed a variation of 0.4% over their treatment course for VMAT and 0.5% for step & shoot. The average field sizes were 40 cm2 and 26 cm2 for VMAT and step & shoot respectively. VMAT beams had an average overall leaf travel of 34.17 meters, and step & shoot beams averaged less than half of that at 15.93 meters. When comparing planned and delivered fluence maps generated using the Dynalog files, VMAT plans showed an average gamma passing percentage of 99.85 ± 0.47. Step & shoot plans showed an average gamma passing percentage of 97.04 ± 0.04. 5.3% of beams contained an MLC error greater than 1 mm and 2.4% had an error greater than 2 mm. The mean gantry speed for VMAT plans was 1.01 degrees/s with a maximum of 6.5 degrees/s. Conclusion: Varian Dynalog files are useful for monitoring machine performance and treatment parameters. The Dynalog files have shown that the performance of the Novalis Tx is consistent over the course of a patient’s treatment, with only slight variations in patient treatment times and a low rate of MLC errors.

  20. Measurement, Modeling, and Analysis of a Large-scale Blog Server Workload

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeon, Myeongjae; Hwang, Jeaho; Kim, Youngjae

    2010-01-01

    Despite the growing popularity of Online Social Networks (OSNs), the workload characteristics of OSN servers, such as those hosting blog services, are not well understood. Understanding workload characteristics is important for optimizing and improving the performance of current systems and software based on observed trends. Thus, in this paper, we characterize the system workload of the largest blog hosting servers in South Korea, Tistory. In addition to understanding the system workload of the blog hosting server, we have developed synthesized workloads and obtained the following major findings: (i) the transfer size of non-multimedia files and blog articles can be modeled by a truncated Pareto distribution and a log-normal distribution, respectively, and (ii) user accesses to blog articles do not show temporal locality, but they are strongly biased toward articles posted along with images or audio.
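
    The two distributional findings can be illustrated on synthetic data. The Python sketch below, using NumPy and SciPy (assumed available), fits a log-normal distribution to simulated article transfer sizes and draws samples from a truncated Pareto by inverse-CDF sampling; the parameters are invented, not the Tistory measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# --- Log-normal model for blog-article transfer sizes (synthetic data) ---
article_bytes = rng.lognormal(mean=9.0, sigma=1.2, size=10_000)
shape, loc, scale = stats.lognorm.fit(article_bytes, floc=0)
print(f"log-normal fit: sigma={shape:.2f}, median={scale:.0f} bytes")

# --- Truncated Pareto model for non-multimedia file sizes ---
def truncated_pareto(alpha, lo, hi, size, rng):
    """Inverse-CDF sampling of a Pareto distribution truncated to [lo, hi]."""
    u = rng.uniform(size=size)
    lo_a, hi_a = lo ** -alpha, hi ** -alpha
    return (lo_a - u * (lo_a - hi_a)) ** (-1.0 / alpha)

file_bytes = truncated_pareto(alpha=1.1, lo=1e3, hi=1e7, size=10_000, rng=rng)
print(f"truncated Pareto sample: min={file_bytes.min():.0f}, "
      f"max={file_bytes.max():.0f}, mean={file_bytes.mean():.0f} bytes")
```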

  1. Use of Web-based library resources by medical students in community and ambulatory settings.

    PubMed

    Tannery, Nancy Hrinya; Foust, Jill E; Gregg, Amy L; Hartman, Linda M; Kuller, Alice B; Worona, Paul; Tulsky, Asher A

    2002-07-01

    The purpose was to evaluate the use of Web-based library resources by third-year medical students. The participants were 147 third-year medical students in a twelve-week multidisciplinary primary care rotation in community and ambulatory settings. Individual user surveys and log file analysis of the Website were used. Twenty resource topics were compiled into a Website to provide students with access to electronic library resources from any community-based clerkship location. These resource topics, covering subjects such as hypertension and back pain, linked to curriculum training problems, full-text journal articles, MEDLINE searches, electronic book chapters, and relevant Websites. More than half of the students (69%) accessed the Website on a daily or weekly basis. Over 80% thought the Website was a valuable addition to their clerkship. Web-based information resources can provide curriculum support to students for whom access to the library is difficult and time consuming.

  2. VizieR Online Data Catalog: CCD {Delta}a-photometry of 5 open clusters (Paunzen+, 2003)

    NASA Astrophysics Data System (ADS)

    Paunzen, E.; Pintado, O. I.; Maitzen, H. M.

    2004-01-01

    Observations of the five open clusters were performed with the Bochum 61cm (ESO-La Silla), the Helen-Sawyer-Hogg 61cm telescope (UTSO-Las Campanas Observatory), the 2.15m telescope at the Complejo Astronomico el Leoncito (CASLEO) and the L. Figl Observatory (FOA) with the 150cm telescope on Mt. Schopfl (Austria) using the multimode instrument OEFOSC (see the observation log in Table 1). (5 data files).

  3. Examining the Return on Investment of a Security Information and Event Management Solution in a Notional Department of Defense Network Environment

    DTIC Science & Technology

    2013-06-01

    collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms

  4. Ending the U.S. War in Iraq: The Final Transition, Operational Maneuver, and Disestablishment of United States Forces-Iraq

    DTIC Science & Technology

    2013-01-01

    management survey and ensure that all databases (military and contracted civilian), key leader engagement logs, assistance project files, and other...Princeton University Press, 2000; Michael I. Handel, War Termination—A Critical Survey, Jerusalem: Hebrew University, 1978; Jane Holl Lute, From the...DoS did not plan to install permanent and more costly security measures. Security surveys undertaken collaboratively by USF-I and multiple

  5. Data mining learning bootstrap through semantic thumbnail analysis

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Farinella, Giovanni Maria; Giuffrida, Giovanni; Tribulato, Giuseppe

    2007-01-01

    The rapid increase of technological innovations in the mobile phone industry induces the research community to develop new and advanced systems to optimize services offered by mobile phone operators (telcos) to maximize their effectiveness and improve their business. Data mining algorithms can run over data produced by mobile phone usage (e.g., image, video, text, and log files) to discover users' preferences and predict the most likely (to be purchased) offer for each individual customer. One of the main challenges is the reduction of the learning time and cost of these automatic tasks. In this paper we discuss an experiment where a commercial offer is composed of a small picture augmented with a short text describing the offer itself. Each customer's purchase is properly logged with all relevant information. Upon arrival of new items we need to learn who the best customers (prospects) for each item are, that is, the ones most likely to be interested in purchasing that specific item. Such learning activity is time consuming and, in our specific case, is not applicable given the large number of new items arriving every day. Basically, given the current customer base we are not able to learn on all new items. Thus, we need somehow to select among those new items to identify the best candidates. We do so by using a joint analysis of visual features and text to estimate how good each new item could be, that is, whether or not it is worth learning on it. Preliminary results show the effectiveness of the proposed approach to improve classical data mining techniques.

  6. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  7. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling-called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
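
    The edge and node files described above amount to counting calls between cost centers. A minimal sketch of that aggregation step in Python, assuming the call detail records have already been parsed into (caller, callee, duration) tuples and that a number-to-cost-center dictionary is available; both inputs and the output column names are invented for illustration.

```python
import csv
from collections import Counter

def build_graph_files(call_records, cost_center_of, edge_path, node_path):
    """Aggregate call detail records into node/edge CSVs for a graph tool.
    call_records: iterable of (caller, callee, duration_seconds).
    cost_center_of: dict mapping a phone number to its cost center name."""
    edge_counts, node_counts = Counter(), Counter()
    for caller, callee, _duration in call_records:
        src = cost_center_of.get(caller, "unknown")
        dst = cost_center_of.get(callee, "unknown")
        edge_counts[(src, dst)] += 1
        node_counts[src] += 1
        node_counts[dst] += 1

    with open(edge_path, "w", newline="") as fh:
        w = csv.writer(fh)
        w.writerow(["source", "target", "calls"])
        for (src, dst), n in edge_counts.most_common():
            w.writerow([src, dst, n])

    with open(node_path, "w", newline="") as fh:
        w = csv.writer(fh)
        w.writerow(["id", "calls"])
        for node, n in node_counts.most_common():
            w.writerow([node, n])

if __name__ == "__main__":
    centers = {"x1001": "Emergency Dept", "x2002": "Pharmacy", "x3003": "Radiology"}
    calls = [("x1001", "x2002", 25), ("x1001", "x2002", 31), ("x2002", "x3003", 140)]
    build_graph_files(calls, centers, "edges.csv", "nodes.csv")
```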

  8. Online molecular image repository and analysis system: A multicenter collaborative open-source infrastructure for molecular imaging research and application.

    PubMed

    Rahman, Mahabubur; Watabe, Hiroshi

    2018-05-01

    Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are used either individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement of reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to addressing these challenges, we developed a flexible, transparent, and secure infrastructure, named MIRA (Molecular Imaging Repository and Analysis), built primarily with the Python programming language and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. Its capability of dealing with metadata, normalizing image file formats, and storing and viewing different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an electronic laboratory notebook for effective knowledge management. In addition, the centralized approach gives on-the-fly access to all of MIRA's features remotely through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform supporting rapid advances in cancer diagnosis and therapeutics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. VizieR Online Data Catalog: Astron low resolution UV spectra (Boyarchuk+, 1994)

    NASA Astrophysics Data System (ADS)

    Boyarchuk, A. A.

    2017-05-01

    Astron was a Soviet spacecraft launched on 23 March 1983; it was operational for eight years and was the largest ultraviolet space telescope of its time. Astron's payload consisted of an 80 cm ultraviolet telescope, Spica, and an X-ray spectroscope. We present 159 low resolution spectra of stars obtained during the Astron space mission (Tables 4, 5; hereafter table numbers in Boyarchuk et al. 1994 are given). Table 4 (observational log, logs.dat) contains data on 142 sessions for 90 stars (sorted in ascending order of RA), where the SED was obtained by the scanning method, and then data on 17 sessions for 15 stars (also sorted in ascending order of RA), where multicolor photometry was done. Kilpio et al. (2016, Baltic Astronomy 25, 23) presented results of the comparison of Astron data to modern UV stellar data, discussed Astron precision and accuracy, and drew some conclusions on potential application areas of these data. Also presented are 34 sessions of observations of 27 stellar systems (galaxies and globular clusters); the observational log was published in Table 10 and the data in Table 11. Likewise, 16 sessions of observations of 12 nebulae are presented (Table 12 for the observational log and Table 13 for the data). Background radiation intensity data (observational log in Table 14) are presented in Table 15. Finally, data on comets are presented in different forms. Note that observational data for stars, stellar systems, nebulae, and comets are expressed in log [erg/s/cm^2/A], while for comet data units of 10^-13 erg/s/cm^2/A are used, hydroxyl band photometric data for comets are expressed in log [erg/s/cm^2], and the background data are radiation intensity expressed in log [erg/s/cm^2/A/sr]. A scanned (PDF) version of the Boyarchuk et al. (1994) book is available at http://www.inasan.ru/~astron/astron.pdf (12 data files).

  10. Usage analysis of user files in UNIX

    NASA Technical Reports Server (NTRS)

    Devarakonda, Murthy V.; Iyer, Ravishankar K.

    1987-01-01

    Presented is a user-oriented analysis of short-term file usage in a 4.2 BSD UNIX environment. The key aspect of this analysis is a characterization of users and files, which is a departure from the traditional approach of analyzing file references. Two characterization measures are employed: accesses-per-byte (combining the fraction of a file referenced with the number of references) and file size. This new approach is shown to distinguish differences in files as well as users, which can be used in efficient file system design and in creating realistic test workloads for simulations. A multi-stage gamma distribution is shown to closely model the file usage measures. Even though overall file sharing is small, some files belonging to a bulletin board system are accessed by many users, simultaneously and otherwise. Over 50% of users referenced files owned by other users, and over 80% of all files were involved in such references. Based on the differences in files and users, suggestions to improve system performance are also made.
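
    A minimal sketch of the characterization measure described above, assuming a simplified trace format in which each access record carries the file identifier and the number of bytes referenced; the trace values and file sizes here are invented.

        from collections import defaultdict

        # Hypothetical trace records: (file_id, bytes_referenced_in_this_access).
        trace = [("a", 512), ("a", 512), ("b", 40960), ("a", 1024)]
        file_sizes = {"a": 2048, "b": 81920}  # bytes, assumed known from the file system

        bytes_accessed = defaultdict(int)
        for file_id, nbytes in trace:
            bytes_accessed[file_id] += nbytes

        # Accesses-per-byte combines the number of references with the fraction
        # of the file actually referenced, normalized by file size.
        for file_id, total in bytes_accessed.items():
            print(file_id, file_sizes[file_id], total / file_sizes[file_id])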

  11. Optimal File-Distribution in Heterogeneous and Asymmetric Storage Networks

    NASA Astrophysics Data System (ADS)

    Langner, Tobias; Schindelhauer, Christian; Souza, Alexander

    We consider an optimisation problem which is motivated from storage virtualisation in the Internet. While storage networks make use of dedicated hardware to provide homogeneous bandwidth between servers and clients, in the Internet, connections between storage servers and clients are heterogeneous and often asymmetric with respect to upload and download. Thus, for a large file, the question arises how it should be fragmented and distributed among the servers to grant "optimal" access to the contents. We concentrate on the transfer time of a file, which is the time needed for one upload and a sequence of n downloads, using a set of m servers with heterogeneous bandwidths. We assume that fragments of the file can be transferred in parallel to and from multiple servers. This model yields a distribution problem that examines the question of how these fragments should be distributed onto those servers in order to minimise the transfer time. We present an algorithm, called FlowScaling, that finds an optimal solution within running time O(m log m). We formulate the distribution problem as a maximum flow problem, which involves a function that states whether a solution with a given transfer time bound exists. This function is then used with a scaling argument to determine an optimal solution within the claimed time complexity.
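
    The paper's FlowScaling algorithm attains O(m log m) via a maximum-flow formulation and a scaling argument. The toy sketch below illustrates only the underlying pattern of pairing a feasibility predicate with a search over the transfer-time bound, under a deliberately simplified model (an assumption of this sketch, not the paper's model) in which server i can absorb and re-serve at most min(up_i, down_i)·T bytes within time T.

        def feasible(T, file_size, up, down):
            # Simplified model: within time T, server i can move at most
            # min(up[i], down[i]) * T bytes of the file.
            return sum(min(u, d) * T for u, d in zip(up, down)) >= file_size

        def min_transfer_time(file_size, up, down, eps=1e-6):
            lo, hi = 0.0, 1.0
            while not feasible(hi, file_size, up, down):
                hi *= 2.0                      # exponential search for an upper bound
            while hi - lo > eps:               # bisection on the time bound
                mid = (lo + hi) / 2.0
                lo, hi = (lo, mid) if feasible(mid, file_size, up, down) else (mid, hi)
            return hi

        # Two servers, 100 units of data: optimum here is 100 / (3 + 2) = 20.
        print(min_transfer_time(100.0, up=[4.0, 2.0], down=[3.0, 5.0]))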

  12. D0 Superconducting Solenoid Quench Data and Slow Dump Data Acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markley, D. (Fermilab)

    1998-06-09

    This Dzero engineering note describes the method by which the 2 Tesla superconducting solenoid fast dump and slow dump data are accumulated, tracked, and stored. The 2 Tesla solenoid has eleven data points that need to be tracked and stored when a fast dump or a slow dump occurs. The TI555 (Texas Instruments) PLC (Programmable Logic Controller), which controls the DC power circuit that powers the solenoid, also has access to all the voltage taps and other equipment in the circuit. The TI555 constantly logs these eleven points in a rotating memory buffer. When either a fast dump (dump switch opens) or a slow dump (power supply turns off) occurs, the TI555 organizes the respective data and downloads the data to a file on DO-CCRS2. The data in this file are moved over Ethernet and stored in a CSV (comma-separated values) file, which can easily be examined by Microsoft Excel or any other spreadsheet. The 2 Tesla solenoid control system also locks in first-fault information. The TI555 decodes the first fault and passes it along to the program collecting the data and storing it on DO-CCRS2. This first-fault information is then part of the file.

  13. Archive of digital boomer seismic reflection data collected during USGS field activities 95LCA03 and 96LCA02 in the Peace River of West-Central Florida, 1995 and 1996

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.

    2006-01-01

    In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
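
    For readers who want a quick look at an archived file before committing to Seismic Unix, the sketch below reads the standard 3,200-byte textual header (40 cards of 80 characters, traditionally EBCDIC-encoded) from the start of a SEG-Y file; the file name is a placeholder, not one of the archive's actual file names.

        # Read the 3,200-byte textual header of a SEG-Y file (40 x 80-char cards).
        with open("line01.sgy", "rb") as f:
            raw = f.read(3200)

        # SEG-Y textual headers are traditionally EBCDIC; fall back to ASCII.
        try:
            text = raw.decode("cp037")   # cp037 is an EBCDIC code page
        except UnicodeDecodeError:
            text = raw.decode("ascii", errors="replace")

        for i in range(0, 3200, 80):
            print(text[i:i + 80].rstrip())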

  14. Pizza.py Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plimpton, Steve; Jones, Matt; Crozier, Paul

    2006-01-01

    Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3d, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers on software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
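
    To give a flavor of what "processing log files" involves here, the sketch below pulls thermodynamic columns out of a LAMMPS log file. It is a generic hand-rolled parser, not Pizza.py's own log tool, and it assumes the usual layout in which a header line beginning with "Step" precedes rows of whitespace-separated numbers.

        def read_thermo(path):
            """Collect thermo rows from a LAMMPS log: a 'Step ...' header line
            followed by numeric rows (assumed layout, not Pizza.py's API)."""
            columns, rows = None, []
            with open(path) as f:
                for line in f:
                    fields = line.split()
                    if fields[:1] == ["Step"]:
                        columns = fields            # start of a thermo block
                    elif columns and len(fields) == len(columns):
                        try:
                            rows.append([float(x) for x in fields])
                        except ValueError:
                            columns = None          # left the thermo block
            return columns, rows

        cols, data = read_thermo("log.lammps")      # placeholder file name
        print(cols, len(data), "rows")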

  15. Addressing fluorogenic real-time qPCR inhibition using the novel custom Excel file system 'FocusField2-6GallupqPCRSet-upTool-001' to attain consistently high fidelity qPCR reactions

    PubMed Central

    Ackermann, Mark R.

    2006-01-01

    The purpose of this manuscript is to discuss fluorogenic real-time quantitative polymerase chain reaction (qPCR) inhibition and to introduce/define a novel Microsoft Excel-based file system which provides a way to detect and avoid inhibition, and enables investigators to consistently design dynamically sound, truly LOG-linear qPCR reactions very quickly. The qPCR problems this invention solves are universal to all qPCR reactions, and it performs all necessary qPCR set-up calculations in about 52 seconds (using a Pentium 4 processor) for up to seven qPCR targets and seventy-two samples at a time, calculations that commonly take capable investigators days to finish. We have named this custom Excel-based file system "FocusField2-6GallupqPCRSet-upTool-001" (FF2-6-001 qPCR set-up tool), and we are in the process of transforming it into professional qPCR set-up software to be made available in 2007. The current prototype is already fully functional. PMID:17033699
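
    The "LOG-linear" requirement means that Ct values should fall on a straight line against the log of template quantity. A minimal check of that dynamic, independent of the Excel tool itself, can be done in a few lines of Python; the dilution series values below are invented for illustration, and the efficiency formula is the standard qPCR relation.

        import numpy as np

        # Hypothetical 10-fold dilution series: template quantity vs. measured Ct.
        quantity = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
        ct = np.array([14.1, 17.5, 20.9, 24.3, 27.7])

        slope, intercept = np.polyfit(np.log10(quantity), ct, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0   # standard qPCR efficiency formula

        r = np.corrcoef(np.log10(quantity), ct)[0, 1]
        print(f"slope={slope:.3f}, efficiency={efficiency:.1%}, R^2={r**2:.4f}")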

  16. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.

  17. An analysis of lecture video utilization in undergraduate medical education: associations with performance in the courses

    PubMed Central

    McNulty, John A; Hoyt, Amy; Gruener, Gregory; Chandrasekhar, Arcot; Espiritu, Baltazar; Price, Ron; Naheedy, Ross

    2009-01-01

    Background Increasing numbers of medical schools are providing videos of lectures to their students. This study sought to analyze the utilization of lecture videos by medical students in their basic science courses and to determine whether student utilization was associated with performance on exams. Methods Streaming videos of lectures (n = 149) to first-year and second-year medical students (n = 284) were made available through a password-protected server. Server logs were analyzed over a 10-week period for both classes. For each lecture, the logs recorded the time and location from which students accessed the file. A survey was administered at the end of the courses to obtain additional information about student use of the videos. Results There was a wide disparity in the level of use of lecture videos by medical students, with the majority of students accessing the lecture videos sparingly (60% of the students viewed less than 10% of the available videos). The anonymous student survey revealed that students tended to view the videos by themselves from home during weekends and prior to exams. Students who accessed lecture videos more frequently had significantly (p < 0.002) lower exam scores. Conclusion We conclude that videos of lectures are used by relatively few medical students and that individual use of videos is associated with the degree to which students are having difficulty with the subject matter. PMID:19173725
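
    A hedged sketch of the kind of association test reported above: given per-student counts of videos viewed and exam scores derived from server logs, compute a correlation. The data values are invented; a negative coefficient would mirror the study's finding.

        import numpy as np
        from scipy import stats

        # Invented per-student data: number of lecture videos viewed, exam score.
        videos_viewed = np.array([0, 2, 3, 5, 8, 12, 20, 35, 40, 55])
        exam_score = np.array([88, 85, 84, 80, 78, 74, 70, 66, 64, 60])

        r, p = stats.pearsonr(videos_viewed, exam_score)
        print(f"r={r:.2f}, p={p:.4f}")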

  18. A PC-based bus monitor program for use with the transport systems research vehicle RS-232 communication interfaces

    NASA Technical Reports Server (NTRS)

    Easley, Wesley C.

    1991-01-01

    Experiment-critical use of RS-232 data buses in the Transport Systems Research Vehicle (TSRV) operated by the Advanced Transport Operating Systems Program Office at the NASA Langley Research Center has recently increased. Each application utilizes a number of nonidentical computer and peripheral configurations and requires task-specific software development. To aid these development tasks, an IBM PC-based RS-232 bus monitoring system was produced. It can simultaneously monitor two communication ports of a PC or clone, including the nonstandard bus expansion of the TSRV Grid laptop computers. Input from each port is displayed in a separate window, with binary display being selectable. A number of other features, including binary log files, screen capture to files, and a full range of communication parameters, are provided.
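
    A modern analogue of such a two-port monitor with binary log files can be sketched with the pyserial package; the port names, baud rate, and log file names below are assumptions, and this is not the original PC program.

        import serial  # pyserial

        PORTS = ["COM1", "COM2"]   # assumed names; e.g. /dev/ttyUSB0 on Linux
        links = [serial.Serial(p, 9600, timeout=0) for p in PORTS]
        logs = [open(f"port{i}.bin", "wb") for i in range(len(PORTS))]

        try:
            while True:
                for link, log in zip(links, logs):
                    data = link.read(link.in_waiting or 1)  # non-blocking read
                    if data:
                        log.write(data)                 # binary log file per port
                        print(link.port, data.hex(" "))  # hex dump, one window each
        except KeyboardInterrupt:
            pass
        finally:
            for f in links + logs:
                f.close()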

  19. Archive of digital and digitized analog boomer seismic reflection data collected during USGS cruise 96CCT02 in Copano, Corpus Christi, and Nueces Bays and Corpus Christi Bayou, Texas, July 1996

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.

    2007-01-01

    In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  20. Archive of Digital Boomer Seismic Reflection Data Collected During USGS Field Activity 08LCA04 in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, Central Florida, September 2008

    USGS Publications Warehouse

    Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.

    2009-01-01

    From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.

  1. Archive of digital chirp subbottom profile data collected during USGS Cruise 13GFP01, Brownlee Dam and Hells Canyon Reservoir, Idaho and Oregon, 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Fosness, Ryan L.; Welcker, Chris; Kelso, Kyle W.

    2014-01-01

    From March 16 - 31, 2013, the U.S. Geological Survey in cooperation with the Idaho Power Company conducted a geophysical survey to investigate sediment deposits and long-term sediment transport within the Snake River from Brownlee Dam to Hells Canyon Reservoir, along the Idaho and Oregon border; this effort will help the USGS to better understand geologic processes. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansions of acronyms and abbreviations used in this report.

  2. Archive of digital chirp subbottom profile data collected during USGS cruise 11BIM01 Offshore of the Chandeleur Islands, Louisiana, June 2011

    USGS Publications Warehouse

    Forde, Arnell S.; Dadisman, Shawn V.; Miselis, Jennifer L.; Flocks, James G.; Wiese, Dana S.

    2013-01-01

    From June 3 to 13, 2011, the U.S. Geological Survey conducted a geophysical survey to investigate the geologic controls on barrier island framework and long-term sediment transport along the oil spill mitigation sand berm constructed at the north end and just offshore of the Chandeleur Islands, LA. This effort is part of a broader USGS study, which seeks to better understand barrier island evolution over medium time scales (months to years). This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are also provided.

  3. Understanding the usage of content in a mental health intervention for depression: an analysis of log data.

    PubMed

    Van Gemert-Pijnen, Julia Ewc; Kelders, Saskia M; Bohlmeijer, Ernst T

    2014-01-31

    Web-based interventions for the early treatment of depressive symptoms can be considered effective in reducing mental complaints. However, there is a limited understanding of which elements in an intervention contribute to effectiveness. For efficiency and effectiveness of interventions, insight is needed into the use of content and persuasive features. The aims of this study were (1) to illustrate how log data can be used to understand the uptake of the content of a Web-based intervention that is based on the acceptance and commitment therapy (ACT) and (2) to discover how log data can be of value for improving the incorporation of content in Web-based interventions. Data from 206 participants (out of the 239) who started the first nine lessons of the Web-based intervention, Living to the Full, were used for a secondary analysis of a subset of the log data of the parent study about adherence to the intervention. The log files used in this study were per lesson: login, start mindfulness, download mindfulness, view success story, view feedback message, start multimedia, turn on text-message coach, turn off text-message coach, and view text message. Differences in usage between lessons were explored with repeated measures ANOVAs (analysis of variance). Differences between groups were explored with one-way ANOVAs. To explore the possible predictive value of the login per lesson quartiles on the outcome measures, four linear regressions were used with login quartiles as predictor and with the outcome measures (Center for Epidemiologic Studies-Depression [CES-D] and the Hospital Anxiety and Depression Scale-Anxiety [HADS-A] on post-intervention and follow-up) as dependent variables. A significant decrease in logins and in the use of content and persuasive features over time was observed. The usage of features varied significantly during the treatment process. The usage of persuasive features increased during the third part of the ACT (commitment to value-based living), which might indicate that at that stage motivational support was relevant. Higher logins over time (9 weeks) corresponded with a higher usage of features (in most cases significant); when predicting depressive symptoms at post-intervention, the linear regression yielded a significant model with login quartile as a significant predictor (explained variance is 2.7%). A better integration of content and persuasive features in the design of the intervention and a better intra-usability of features within the system are needed to identify which combination of features works best for whom. Pattern recognition can be used to tailor the intervention based on usage patterns from the earlier lessons and to support the uptake of content essential for therapy. An adaptable interface for a modular composition of therapy features supposes a dynamic approach for Web-based treatment; not a predefined path for all, but a flexible way to go through all features that have to be used.
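
    To make the regression step concrete, here is a minimal sketch: bin total logins into quartiles and regress the post-intervention depression score on the quartile. All data values are invented stand-ins for the study's log files.

        import numpy as np
        import pandas as pd
        from scipy import stats

        # Invented data: total logins over 9 weeks and post-intervention CES-D score.
        rng = np.random.default_rng(0)
        df = pd.DataFrame({"logins": rng.integers(1, 60, size=206)})
        df["cesd_post"] = 25 - 0.08 * df["logins"] + rng.normal(0, 5, 206)

        # Quartile of logins (1..4) as predictor, CES-D as outcome.
        df["login_quartile"] = pd.qcut(df["logins"], 4, labels=[1, 2, 3, 4]).astype(int)
        res = stats.linregress(df["login_quartile"], df["cesd_post"])
        print(f"slope={res.slope:.2f}, R^2={res.rvalue**2:.3f}, p={res.pvalue:.4f}")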

  4. Understanding the Usage of Content in a Mental Health Intervention for Depression: An Analysis of Log Data

    PubMed Central

    2014-01-01

    Background Web-based interventions for the early treatment of depressive symptoms can be considered effective in reducing mental complaints. However, there is a limited understanding of which elements in an intervention contribute to effectiveness. For efficiency and effectiveness of interventions, insight is needed into the use of content and persuasive features. Objective The aims of this study were (1) to illustrate how log data can be used to understand the uptake of the content of a Web-based intervention that is based on the acceptance and commitment therapy (ACT) and (2) to discover how log data can be of value for improving the incorporation of content in Web-based interventions. Methods Data from 206 participants (out of the 239) who started the first nine lessons of the Web-based intervention, Living to the Full, were used for a secondary analysis of a subset of the log data of the parent study about adherence to the intervention. The log files used in this study were per lesson: login, start mindfulness, download mindfulness, view success story, view feedback message, start multimedia, turn on text-message coach, turn off text-message coach, and view text message. Differences in usage between lessons were explored with repeated measures ANOVAs (analysis of variance). Differences between groups were explored with one-way ANOVAs. To explore the possible predictive value of the login per lesson quartiles on the outcome measures, four linear regressions were used with login quartiles as predictor and with the outcome measures (Center for Epidemiologic Studies—Depression [CES-D] and the Hospital Anxiety and Depression Scale—Anxiety [HADS-A] on post-intervention and follow-up) as dependent variables. Results A significant decrease in logins and in the use of content and persuasive features over time was observed. The usage of features varied significantly during the treatment process. The usage of persuasive features increased during the third part of the ACT (commitment to value-based living), which might indicate that at that stage motivational support was relevant. Higher logins over time (9 weeks) corresponded with a higher usage of features (in most cases significant); when predicting depressive symptoms at post-intervention, the linear regression yielded a significant model with login quartile as a significant predictor (explained variance is 2.7%). Conclusions A better integration of content and persuasive features in the design of the intervention and a better intra-usability of features within the system are needed to identify which combination of features works best for whom. Pattern recognition can be used to tailor the intervention based on usage patterns from the earlier lessons and to support the uptake of content essential for therapy. An adaptable interface for a modular composition of therapy features supposes a dynamic approach for Web-based treatment; not a predefined path for all, but a flexible way to go through all features that have to be used. PMID:24486914

  5. New Mexico Play Fairway Analysis: Gamma Ray Logs and Heat Generation Calculations for SW New Mexico

    DOE Data Explorer

    Shari Kelley

    2015-10-23

    For the New Mexico Play Fairway Analysis project, gamma ray geophysical well logs from oil wells penetrating the Proterozoic basement in southwestern New Mexico were digitized. Only the portion of the log in the basement was digitized. The gamma ray logs are converted to heat production using the equation of Bücker and Rybach (1996): A [µW/m³] = 0.0158 × (Gamma Ray [API] − 0.8).
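
    Expressed as code, the conversion applied to each digitized gamma ray sample is a one-liner; the example readings are illustrative, not project data.

        def heat_production(gamma_ray_api: float) -> float:
            """Radiogenic heat production A [uW/m^3] from a gamma ray log [API],
            using the Bucker and Rybach (1996) relation cited above."""
            return 0.0158 * (gamma_ray_api - 0.8)

        for gr in (50.0, 150.0, 300.0):   # example basement gamma ray readings [API]
            print(gr, round(heat_production(gr), 2))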

  6. Collecting conditions usage metadata to optimize current and future ATLAS software and processing

    NASA Astrophysics Data System (ADS)

    Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration

    2017-10-01

    Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing its access requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
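
    A schematic of the log-mining step: scan processing log files for lines that record a conditions-database access and tally usage per conditions folder. The line format, folder names, and log file names here are invented placeholders, not ATLAS's actual log syntax.

        import re
        from collections import Counter

        # Invented log line format: '... ConditionsAccess folder=/ALIGN/L1 ...'
        FOLDER_RE = re.compile(r"ConditionsAccess\s+folder=(\S+)")

        usage = Counter()
        for path in ["reco_job_001.log", "reco_job_002.log"]:  # representative logs
            with open(path) as f:
                for line in f:
                    m = FOLDER_RE.search(line)
                    if m:
                        usage[m.group(1)] += 1

        for folder, count in usage.most_common(10):
            print(f"{count:8d}  {folder}")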

  7. Structure of the top of the Karnak Limestone Member (Ste. Genevieve) in Illinois

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bristol, H.M.; Howard, R.H.

    1976-01-01

    To facilitate petroleum exploration in Illinois, the Illinois State Geological Survey presents a structure map (for most of southern Illinois) of the Karnak Limestone Member--a relatively pure persistent limestone unit (generally 10 to 35 ft thick) in the Ste. Genevieve Limestone of Genevievian age. All available electric logs and selected studies of well cuttings were used in constructing the map. Oil and gas development maps containing Karnak-structure contours are on open file at the ISGS.

  8. Cloud-based processing of multi-spectral imaging data

    NASA Astrophysics Data System (ADS)

    Bernat, Amir S.; Bolton, Frank J.; Weiser, Reuven; Levitz, David

    2017-03-01

    Multispectral imaging holds great promise as a non-contact tool for the assessment of tissue composition. Performing multispectral imaging on a handheld mobile device would allow this technology, and with it state-of-the-art classification of tissue health, to be brought to low-resource settings. This modality, however, produces considerably larger data sets than white light imaging and requires preliminary image analysis before it can be used. The data then need to be analyzed and logged without demanding too much of the system resources, computation time, or battery of the end-point device. Cloud environments were designed to address these problems by allowing end-point devices (smartphones) to offload computationally hard tasks. To this end, we present a method in which a handheld device built around a smartphone captures a multispectral dataset in a movie file format (mp4), and we compare this format to other image formats in size, noise, and correctness. We present the cloud configuration used for segmenting movies into frames that can later be used for further analysis.
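
    The segmentation of the movie file into frames can be sketched with OpenCV; whether the actual cloud pipeline uses OpenCV is an assumption of this sketch, and the file name is a placeholder.

        import cv2  # OpenCV

        cap = cv2.VideoCapture("multispectral_capture.mp4")
        frame_index = 0
        while True:
            ok, frame = cap.read()       # returns False at end of stream
            if not ok:
                break
            # Assumption of the sketch: each frame corresponds to one spectral
            # band / illumination state in the acquisition scheme.
            cv2.imwrite(f"frame_{frame_index:04d}.png", frame)
            frame_index += 1
        cap.release()
        print(f"extracted {frame_index} frames")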

  9. VizieR Online Data Catalog: Solar analogs and twins rotation by Kepler (do Nascimento+, 2014)

    NASA Astrophysics Data System (ADS)

    Do Nascimento, J.-D. Jr; Garcia, R. A.; Mathur, S.; Anthony, F.; Barnes, S. A.; Meibom, S.; da Costa, J. S.; Castro, M.; Salabert, D.; Ceillier, T.

    2017-03-01

    Our sample of 75 stars consists of a seismic sample of 38 from Chaplin et al. (2014, J/ApJS/210/1), 35 additional stars selected from the Kepler Input Catalog (KIC), and 16 Cyg A and B. We selected 38 well-studied stars from the asteroseismic data with fundamental properties, including ages, estimated by Chaplin et al. (2014, J/ApJS/210/1), and with Teff and log g as close as possible to the Sun's values (5200 K < Teff < 6060 K and 3.63 < log g < 4.40). This seismic sample allows a direct comparison between gyro- and seismic-ages for a subset of eight stars. These seismic samples were observed in short cadence for one month each in survey mode. Stellar properties for these stars have been estimated using two global asteroseismic parameters and complementary photometric and spectroscopic observations as described by Chaplin et al. (2014, J/ApJS/210/1). The median final quoted uncertainties for the full Chaplin et al. (2014, J/ApJS/210/1) sample were approximately 0.020 dex in log g and 150 K in Teff. (1 data file).

  10. Finite Elements Analysis of a Composite Semi-Span Test Article With and Without Discrete Damage

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Jegley, Dawn C. (Technical Monitor)

    2000-01-01

    AS&M Inc. performed finite element analysis, with and without discrete damage, of a composite semi-span test article that represents the Boeing 220-passenger transport aircraft composite semi-span test article. A NASTRAN bulk data file and drawings of the test mount fixtures and semi-span components were utilized to generate the baseline finite element model. In this model, the stringer blades are represented by shell elements, and the stringer flanges are combined with the skin. Numerous modeling modifications and discrete source damage scenarios were applied to the test article model throughout the course of the study. This report details the analysis method and results obtained from the composite semi-span study. Analyses were carried out for three load cases: Braked Roll, 1.0G Down-Bending, and 2.5G Up-Bending. These analyses included linear and nonlinear static response, as well as linear and nonlinear buckling response. Results are presented in the form of stress and strain plots, factors of safety for failed elements, buckling loads and modes, deflection prediction tables and plots, and strain gage prediction tables and plots. The collected results are presented within this report for comparison to test results.

  11. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    PubMed

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in the environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics may be carried out using several public tools, many analytical pipelines offer too few options for optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from the Illumina, Pacific Biosciences, Ion Torrent, and Roche 454 sequencing platforms. We described the design and options of PipeCraft and evaluated its performance by analysing data sets from three different sequencing platforms. We demonstrated that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  12. Using Web Server Logs in Evaluating Instructional Web Sites.

    ERIC Educational Resources Information Center

    Ingram, Albert L.

    2000-01-01

    Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
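
    Since this entry concerns mining web server logs, a small parser for the NCSA Common Log Format (the de facto default for web servers of that era) may help; the regular expression covers the standard seven fields, and the log file name is a placeholder.

        import re
        from collections import Counter

        # NCSA Common Log Format:
        # host ident authuser [date] "request" status bytes
        CLF = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
            r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+)'
        )

        hits = Counter()
        with open("access.log") as f:       # placeholder log file name
            for line in f:
                m = CLF.match(line)
                if m and m.group("request").startswith("GET "):
                    page = m.group("request").split()[1]
                    hits[page] += 1

        for page, n in hits.most_common(10):
            print(f"{n:6d}  {page}")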

  13. A rule-based approach for the correlation of alarms to support Disaster and Emergency Management

    NASA Astrophysics Data System (ADS)

    Gloria, M.; Minei, G.; Lersi, V.; Pasquariello, D.; Monti, C.; Saitto, A.

    2009-04-01

    Key words: Simple Event Correlator, Agent Platform, Ontology, Semantic Web, Distributed Systems, Emergency Management

    The importance of recognizing the typology of an emergency in order to control critical situations and protect citizens has long been acknowledged, and this aspect is central to the proper management of a hazardous event. In this work we present a solution for the recognition of emergency typologies adopted by an Italian research project called CI6 (Centro Integrato per Servizi di Emergenza Innovativi). In our approach, CI6 receives alarms from citizens or from people involved in the response (for example: police, operators of 112, and so on). CI6 represents any alarm by a set of information, including a text that describes it, obtained when the user reports the danger, and a pair of coordinates for its location. The system analyses the text and automatically infers the type of emergency by means of a set of parsing rules and inference rules applied by an independent module: a correlator of events based on their logs, called Simple Event Correlator (SEC). SEC, integrated in CI6's platform, is an open-source and platform-independent event correlation tool. SEC accepts as input both files and text from standard input, which makes it flexible because it can be coupled to any application that can write its output to a file stream. The SEC configuration is stored in text files as rules, each rule specifying an event matching condition, an action list, and optionally a Boolean expression whose truth value decides whether the rule can be applied at a given moment. SEC can produce output events by executing user-specified shell scripts or programs, by writing messages to files, and by various other means. SEC has been successfully applied in various domains such as network management, system monitoring, data security, intrusion detection, and log file monitoring and analysis; it has been used with or integrated into many applications, such as CiscoWorks, HP OpenView NNM and Operations, and BMC Patrol. Analysis of an alarm's text can detect keywords that allow the particular event to be classified. The inference rules were developed through an analysis of news reports about real emergencies found by web searches. We have seen that a kind of emergency is often characterized by more than one keyword. Keywords are not uniquely associated with a specific emergency; they can be shared by different types of emergencies (e.g., the keyword "landslide" can be associated with both the emergency "landslide" and the emergency "flood"). However, the identification of two or more keywords associated with a particular type of emergency identifies, in most cases, the correct type. So, for example, if the text contains words such as "water", "flood", "overflowing", or "landslide", or other words belonging to the set of defined keywords (or sharing a keyword's root), the system "decides" that the alarm belongs to a specific typology, in this case the flood typology. The system keeps a memory of this information, so if a new alarm is reported that belongs to one of the typologies already identified, it proceeds with a comparison of coordinates. The comparison between the locations of the alarms shows whether they describe an area inscribed in an ideal circle centered on the first alarm, with a radius defined by the typology mentioned above. If this happens, CI6 creates an emergency centered on the centre of that area, with a typology equal to that of the alarms.
    It follows that an emergency is represented by at least two alarms. Thus, the system suggests to the manager (CI6's user) the possibility that several alarms concern the same event, and classifies that event; a minimal sketch of these two steps is given below. It is important to stress that CI6 is a decision-support system, so this service too is limited to providing advice that facilitates the user's task, leaving to the user the decision to accept it or not. REFERENCES: SEC (Simple Event Correlator), http://kodu.neti.ee/~risto/sec/; M. Gloria, V. Lersi, G. Minei, D. Pasquariello, C. Monti, A. Saitto, "A Semantic WEB Services Platform to support Disaster and Emergency Management", 4th Biennial Meeting of the International Environmental Modelling and Software Society (iEMSs), 2008.
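
    A minimal sketch of the two steps described above, keyword-based typology classification followed by a radius check between alarm coordinates; the keyword sets, radii, and planar distance approximation are simplifying assumptions, not CI6's actual rules.

        import math

        # Assumed keyword sets per emergency typology (shared keywords allowed).
        KEYWORDS = {
            "flood":     {"water", "flood", "overflowing", "landslide"},
            "landslide": {"landslide", "mudslide", "slope"},
        }
        RADIUS_KM = {"flood": 5.0, "landslide": 2.0}  # per-typology grouping radius

        def classify(text):
            words = set(text.lower().split())
            scores = {t: len(words & kw) for t, kw in KEYWORDS.items()}
            best = max(scores, key=scores.get)
            return best if scores[best] >= 2 else None  # require >= 2 keyword hits

        def same_event(alarm_a, alarm_b, typology, km_per_degree=111.0):
            # Planar approximation: adequate at city scale, an assumption here.
            d = math.dist(alarm_a, alarm_b) * km_per_degree
            return d <= RADIUS_KM[typology]

        t = classify("river water overflowing near the bridge, small landslide")
        print(t, same_event((40.85, 14.27), (40.86, 14.28), t))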

  14. Electrofacies analysis for coal lithotype profiling based on high-resolution wireline log data

    NASA Astrophysics Data System (ADS)

    Roslin, A.; Esterle, J. S.

    2016-06-01

    The traditional approach to coal lithotype analysis is based on a visual characterisation of coal in core, mine, or outcrop exposures. As not all wells are fully cored, the petroleum and coal mining industries increasingly use geophysical wireline logs for lithology interpretation. This study demonstrates a method for interpreting coal lithotypes from geophysical wireline logs, and in particular for discriminating between bright or banded coal and dull coal at similar densities, down to the decimetre level. The study explores the optimum combination of geophysical log suites for training the coal electrofacies interpretation, using a neural network approach, and then propagating the results to wells with fewer wireline data. This approach is objective and has a recordable reproducibility and rule set. In addition to conventional gamma ray and density logs, laterolog resistivity, microresistivity, and PEF data were used in the study. Array resistivity data from a compact micro imager (CMI tool) were processed into a single microresistivity curve and integrated with the conventional resistivity data in the cluster analysis. Microresistivity data were included to test the hypothesis that the improved vertical resolution of the microresistivity curve can enhance the accuracy of the clustering analysis. The addition of the PEF log allowed discrimination between low-density bright to banded coal electrofacies and low-density inertinite-rich dull electrofacies. The clustering results were validated statistically, and the resulting electrofacies were compared to manually derived coal lithotype logs.
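
    The clustering step can be sketched with scikit-learn: standardize the log curves, cluster the depth samples, and treat each cluster as a candidate electrofacies. The curve names, the choice of k, and the use of plain k-means (the study used a neural-network-style clustering) are assumptions of this sketch.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Invented log matrix: one row per depth sample, columns are curves
        # (gamma ray, density, deep resistivity, microresistivity, PEF).
        rng = np.random.default_rng(42)
        logs = rng.normal(size=(500, 5))

        X = StandardScaler().fit_transform(logs)   # put curves on a common scale
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

        # Each label is a candidate electrofacies, to be compared against
        # manually derived coal lithotype logs.
        print(np.bincount(labels))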

  15. Lithology and mineralogy recognition from geochemical logging tool data using multivariate statistical analysis.

    PubMed

    Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques

    2017-10-01

    The availability of a deep well that penetrates Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study these rocks. One such borehole, the Chinese Continental Scientific Drilling main hole, is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of eastern China. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ gamma ray spectroscopy measurements of major and trace elements in the borehole. The dry-weight-percent oxide concentration logs obtained for this study were SiO2, K2O, TiO2, H2O, CO2, Na2O, Fe2O3, FeO, CaO, MnO, MgO, P2O5, and Al2O3. Cross-plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description, respectively. Cross-plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data are accurate and detailed enough to be tremendously useful in the analysis of UHP metamorphic rocks. Copyright © 2017 Elsevier Ltd. All rights reserved.
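
    A sketch of the Principal Component Analysis step on the oxide logs, using scikit-learn; the data matrix here is random stand-in data with one column per oxide, not the borehole measurements.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        oxides = ["SiO2", "K2O", "TiO2", "H2O", "CO2", "Na2O", "Fe2O3",
                  "FeO", "CaO", "MnO", "MgO", "P2O5", "Al2O3"]
        data = np.random.default_rng(7).normal(size=(300, len(oxides)))  # stand-in

        pca = PCA(n_components=2).fit(StandardScaler().fit_transform(data))
        print("explained variance ratios:", pca.explained_variance_ratio_)

        # Loadings link each component back to oxides, and hence to mineral
        # groups (e.g., feldspar vs. hydrous minerals in the study's result).
        for comp in pca.components_:
            top = sorted(zip(oxides, comp), key=lambda t: -abs(t[1]))[:3]
            print(top)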

  16. Web-based pathology practice examination usage.

    PubMed

    Klatt, Edward C

    2014-01-01

    General and subject-specific practice examinations for students in the health sciences studying pathology were placed onto a free public internet web site entitled WebPath, accessed four clicks from the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl script with the Common Gateway Interface for web page forms scored examinations and placed results into a log file on an internet server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each, with accompanying images, could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and ≥50% by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. A score of 100% was achieved by 20% of users, ≥90% by 37%, and ≥50% by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. The scores achieved suggest that a cohort of serious users who fully completed the examinations had sufficient preparation to use them to support their pathology education.

  17. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Li, X; Li, H

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report providing all information required for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method that uses 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of delivered treatment plans and beams, beam segments, and, for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report to include all information required for MR-IGRT and weekly physics review: the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g., MLC positions) were independently verified and deemed accurate and trustworthy. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection and to expedite physician daily IGRT review and physicist weekly chart review.
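
    The fluence comparison reduces to array arithmetic once planned and delivered fluence maps exist on a common grid. The sketch below computes simple difference statistics on invented maps; it is a schematic illustration, not the commissioned program's implementation.

        import numpy as np

        rng = np.random.default_rng(3)
        planned = rng.uniform(0.2, 1.0, size=(64, 64))          # stand-in fluence map
        delivered = planned + rng.normal(0.0, 0.005, (64, 64))  # small delivery noise

        diff = delivered - planned
        rel = diff / planned.max()   # normalize to the maximum planned fluence

        print(f"mean diff      : {diff.mean():+.4f}")
        print(f"max |rel diff| : {np.abs(rel).max():.4%}")
        print(f"pixels > 1%    : {(np.abs(rel) > 0.01).mean():.2%}")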

  18. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and who or what influenced results are explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed-provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find the relevant information, which is not efficient. Existing disclosed-provenance system APIs do not work well in distributed environments and have trouble determining where the individual pieces of provenance information fit. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called the Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach adds metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance: it can be sent to a triple store using REST services or logged to a file.

  19. Meta-Analysis of the Reduction of Norovirus and Male-Specific Coliphage Concentrations in Wastewater Treatment Plants.

    PubMed

    Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R

    2015-07-01

    Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  20. Development of Head Injury Assessment Reference Values Based on NASA Injury Modeling

    NASA Technical Reports Server (NTRS)

    Somers, Jeffrey T.; Melvin, John W.; Tabiei, Ala; Lawrence, Charles; Ploutz-Snyder, Robert; Granderson, Bradley; Feiveson, Alan; Gernhardt, Michael; Patalak, John

    2011-01-01

    NASA is developing a new capsule-based, crewed vehicle that will land in the ocean, and the space agency desires to reduce the risk of injury from impact during these landings. Because landing impact occurs for each flight and the crew might need to perform egress tasks, current injury assessment reference values (IARV) were deemed insufficient. Because NASCAR occupant restraint systems are more effective than the systems used to determine the current IARVs and are similar to NASA's proposed restraint system, an analysis of NASCAR impacts was performed to develop new IARVs that may be more relevant to NASA's context of vehicle landing operations. Head IARVs associated with race car impacts were investigated by completing a detailed analysis of all of the 2002-2008 NASCAR impact data. Specific inclusion and exclusion criteria were used to select 4071 impacts from the 4015 recorder files provided (each file could contain multiple impact events). Of the 4071 accepted impacts, 274 were selected for numerical simulation using a custom NASCAR restraint system and Humanetics Hybrid-III 50th percentile numerical dummy model in LS-DYNA. Injury had occurred in 32 of the 274 selected impacts, and 27 of those injuries involved the head. A majority of the head injuries were mild concussions with or without brief loss of consciousness. The 242 non-injury impacts were randomly selected and representative of the range of crash dynamics present in the total set of 4071 impacts. Head dynamics data (head translational acceleration, translational change in velocity, rotational acceleration, rotational velocity, HIC-15, HIC-36, and the Head 3ms clip) were filtered according to SAE J211 specifications and then transformed to a log scale. The probability of head injury was estimated using a separate logistic regression analysis for each log-transformed predictor candidate. Using the log transformation constrains the estimated probability of injury to become negligible as IARVs approach zero. For the parameters head translational acceleration, head translational velocity change, head rotational acceleration, HIC-15, and HIC-36, conservative values (in the lower 95% confidence interval) that gave rise to a 5% risk of any injury occurring were estimated as 40.0 G, 7.9 m/s, 2200 rad/s^2, 98.4, and 77.4 respectively. Because NASA is interested in the consequence of any particular injury on the ability of the crew to perform egress tasks, the head injuries that occurred in the NASCAR dataset were classified according to a NASA-developed scale (Classes I - III) for operationally relevant injuries, which classifies injuries on the basis of their operational significance. Additional analysis of the data was performed to determine the probability of each injury class occurring, and this was estimated using an ordered probit model. For head translational acceleration, head translational velocity change, head rotational acceleration, head rotational velocity, HIC-36, and head 3ms clip, conservative values of IARVs that produced a 5% risk of Class II injury were estimated as 50.7 G, 9.5 m/s, 2863 rad/s^2, 11.0 rad/s, 30.3, and 46.4 G respectively. The results indicate that head IARVs developed from the NASCAR dataset may be useful to protect crews during landing impact.
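
    The IARV derivation reduces to fitting a logistic model on a log-transformed predictor and inverting it at the 5% risk level. The sketch below shows that inversion for one predictor with invented coefficients; it does not reproduce the study's fitting procedure or confidence-interval treatment.

        import math

        # Logistic model on a log-transformed predictor x (e.g., head acceleration):
        #   p(injury) = 1 / (1 + exp(-(b0 + b1 * ln(x))))
        b0, b1 = -12.0, 2.5   # invented fitted coefficients

        def risk(x):
            return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(x))))

        def iarv(p_target=0.05):
            # Invert the model: ln(x) = (logit(p) - b0) / b1.
            logit = math.log(p_target / (1.0 - p_target))
            return math.exp((logit - b0) / b1)

        x5 = iarv(0.05)
        print(f"5% risk at x = {x5:.1f}; check: risk({x5:.1f}) = {risk(x5):.3f}")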

  1. Conjoin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjaardema, Gregory

    2010-08-06

    Conjoin is a code for joining, sequentially in time, multiple exodusII database files. It is used to create a single results or restart file from multiple results or restart files, which typically arise as the result of multiple restarted analyses. The output file will be the union of the input files, with a status variable indicating the status of each element at the various time planes. Typical applications are combining multiple exodusII files arising from a restarted analysis, or combining multiple exodusII files arising from a finite element analysis with dynamic topology changes.

  2. Interpretation of well logs in a carbonate aquifer

    USGS Publications Warehouse

    MacCary, L.M.

    1978-01-01

    This report describes the log analysis of the Randolph and Sabinal core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superior to others for the analysis of limestone and dolomite aquifers. Three such systems are the density, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to compute the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures, and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs. With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh-water zones.
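
    As a concrete instance of the porosity calculations such log suites support, density-log porosity is a one-line formula; the matrix and fluid densities below are textbook limestone and fresh-water defaults, not calibrations from the report.

        def density_porosity(rho_bulk, rho_matrix=2.71, rho_fluid=1.00):
            """Density-log porosity: phi = (rho_ma - rho_b) / (rho_ma - rho_f).
            Defaults assume a limestone matrix (2.71 g/cm^3) and fresh water."""
            return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

        # A bulk density reading of 2.45 g/cm^3 implies roughly 15% porosity.
        print(f"{density_porosity(2.45):.1%}")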

  3. Archive of digital chirp subbottom profile data collected during USGS Cruise 13CCT04 offshore of Petit Bois Island, Mississippi, August 2013

    USGS Publications Warehouse

    Forde, Arnell S.; Flocks, James G.; Kindinger, Jack G.; Bernier, Julie C.; Kelso, Kyle W.; Wiese, Dana S.

    2015-01-01

    From August 13-23, 2013, the U.S. Geological Survey (USGS), in cooperation with the U.S. Army Corps of Engineers (USACE), conducted geophysical surveys to investigate the geologic controls on barrier island framework and long-term sediment transport offshore of Petit Bois Island, Mississippi. This investigation is part of a broader USGS study on Coastal Change and Transport (CCT). These surveys were funded through the Mississippi Coastal Improvements Program (MsCIP), with partial funding provided by the Northern Gulf of Mexico Ecosystem Change and Hazard Susceptibility Project. This report serves as an archive of unprocessed digital chirp subbottom data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (showing a relative increase in signal amplitude) digital images of the seismic profiles are provided.

  4. Remote Environmental Monitoring and Diagnostics in the Perishables Supply Chain - Phase 1

    DTIC Science & Technology

    2011-12-12

    The table below displays the raw data from the tests. Each cell contains a number between 0 and 5 corresponding to the number of successful... along with the raw temperature data to the email addresses specified in the configuration file. As mentioned previously, for the CAEN... the Intelleflex system. The user also has the option to save the data log, which contains the raw temperature data, to a file on the Windows

  5. Archive of Digital Chirp Sub-bottom Profile Data Collected During USGS Cruises 08CCT02 and 08CCT03, Mississippi Gulf Islands, July and September 2008

    USGS Publications Warehouse

    Barry, K.M.; Cavers, D.A.; Kneale, C.W.

    2011-01-01

    In July and September of 2008, the U.S. Geological Survey (USGS) conducted geophysical surveys to investigate the geologic controls on island framework from Ship Island to Horn Island, MS, for the Northern Gulf of Mexico (NGOM) Ecosystem Change and Hazard Susceptibility project. This project is also part of a broader USGS study on Coastal Change and Transport (CCT). This report serves as an archive of unprocessed digital Chirp sub-bottom profile data, trackline maps, navigation files, Geographic Information System (GIS) files, Field Activity Collection System (FACS) logs, observer's logbook, and formal Federal Geographic Data Committee (FGDC) metadata. Gained (a relative increase in signal amplitude) digital images of the sub-bottom profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report.

  6. A Kinect-based system for automatic recording of some pigeon behaviors.

    PubMed

    Lyons, Damian M; MacDonall, James S; Cunningham, Kelly M

    2015-12-01

    Contact switches and touch screens are the state of the art for recording pigeons' pecking behavior. Recording other behavior, however, requires a different sensor for each behavior, and some behaviors cannot easily be recorded. We present a flexible and inexpensive image-based approach to detecting and counting pigeon behaviors that is based on the Kinect sensor from Microsoft. Although the system is as easy to set up and use as the standard approaches, it is more flexible because it can record behaviors in addition to key pecking. In this article, we show how both the fast, fine motion of key pecking and the gross body activity of feeding can be measured. Five pigeons were trained to peck at a lighted contact switch, a pigeon key, to obtain food reward. The timing of the pecks and the food reward signals were recorded in a log file using standard equipment. The Kinect-based system, called BehaviorWatch, also measured the pecking and feeding behavior and generated a different log file. For key pecking, BehaviorWatch had an average sensitivity of 95% and a precision of 91%, which were very similar to the pecking measurements from the standard equipment. For detecting feeding activity, BehaviorWatch had a sensitivity of 95% and a precision of 97%. These results demonstrate an advantage of the Kinect-based approach: it can also be used reliably to measure activity other than key pecking.

  7. Methodology for locating defects within hardwood logs and determining their impact on lumber-value yield

    Treesearch

    Thomas Harless; Francis G. Wagner; Phillip Steele; Fred Taylor; Vikram Yadama; Charles W. McMillin

    1991-01-01

    A precise research methodology is described by which internal log-defect locations may help select hardwood log orientation and sawing procedure to improve lumber value. Procedures for data collection, data handling, simulated sawing, and data analysis are described. A single test log verified the methodology. Results from this log showed significant differences in...

  8. The computerized OMAHA system in microsoft office excel.

    PubMed

    Lai, Xiaobin; Wong, Frances K Y; Zhang, Peiqiang; Leung, Carenx W Y; Lee, Lai H; Wong, Jessica S Y; Lo, Yim F; Ching, Shirley S Y

    2014-01-01

    The OMAHA System was adopted as the documentation system in an interventional study. To systematically record client care and facilitate data analysis, two Office Excel files were developed. The first Excel file (File A) was designed to record problems, care procedures, and outcomes for individual clients according to the OMAHA System. It was used by the intervention nurses in the study. The second Excel file (File B) was the summary of all clients, extracted automatically from File A. Data in File B can be analyzed directly in Excel or imported into PASW for further analysis. Both files have four parts to record basic information and the three parts of the OMAHA System. The computerized OMAHA System simplified the documentation procedure and facilitated the management and analysis of data.
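
    The extract-and-summarize pattern the two workbooks implement can be sketched with pandas; the file names, sheet layout, and column names below are hypothetical, since the paper does not publish its schema.

        import pandas as pd

        # Hypothetical layout: one sheet per client in File A, each listing
        # OMAHA problems with Knowledge/Behavior/Status outcome ratings.
        file_a = pd.read_excel("FileA.xlsx", sheet_name=None)  # dict of frames

        # Stack the client sheets into a File B style summary and aggregate.
        summary = pd.concat(
            (df.assign(client=name) for name, df in file_a.items()),
            ignore_index=True)
        by_problem = summary.groupby("problem")[
            ["knowledge", "behavior", "status"]].mean()
        by_problem.to_excel("FileB.xlsx")  # ready for Excel or PASW/SPSS import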

  9. Web Log Analysis: A Study of Instructor Evaluations Done Online

    ERIC Educational Resources Information Center

    Klassen, Kenneth J.; Smith, Wayne

    2004-01-01

    This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…
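
    A "relatively simple" web-log analysis typically begins by parsing the server's access log; the sketch below counts page views per path from an Apache-style common log format file. The regex and file name are generic placeholders, not the authors' code.

        import re
        from collections import Counter

        # Common log format: host ident user [time] "request" status bytes
        LINE = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
        )

        hits = Counter()
        with open("access.log") as f:          # hypothetical log file
            for line in f:
                m = LINE.match(line)
                if m:
                    hits[m.group("path")] += 1

        for path, n in hits.most_common(10):   # ten most-viewed pages
            print(n, path)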

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, Jason

    WLS gathers all known relevant contextual data along with standard event log information, processes it into an easily consumable format for analysis by third-party tools, and forwards the logs to any compatible log server.
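
    The gather-normalize-forward pattern described here can be illustrated with Python's standard syslog handler; the server address and event fields are placeholders, not WLS internals.

        import logging
        import logging.handlers

        # Forward normalized event records to a compatible syslog server.
        handler = logging.handlers.SysLogHandler(
            address=("logserver.example.org", 514))
        handler.setFormatter(logging.Formatter(
            "%(asctime)s host=%(hostname)s event=%(message)s"))

        log = logging.getLogger("forwarder-sketch")
        log.addHandler(handler)
        log.setLevel(logging.INFO)

        # Contextual data travels as extra fields merged into the record.
        log.info("user_logon id=42", extra={"hostname": "workstation-7"})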

  11. SU-E-J-150: Impact of Intrafractional Prostate Motion On the Accuracy and Efficiency of Prostate SBRT Delivery: A Retrospective Analysis of Prostate Tracking Log Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiang, H; Hirsch, A; Willins, J

    2014-06-01

    Purpose: To measure intrafractional prostate motion by time-based stereotactic x-ray imaging and investigate the impact on the accuracy and efficiency of prostate SBRT delivery. Methods: Prostate tracking log files with 1,892 x-ray image registrations from 18 SBRT fractions for 6 patients were retrospectively analyzed. Patient setup and beam delivery sessions were reviewed to identify extended periods of large prostate motion that caused delays in setup or interruptions in beam delivery. The 6D prostate motions were compared to the clinically used PTV margin of 3–5 mm (3 mm posterior, 5 mm all other directions), a hypothetical PTV margin of 2–3 mm (2 mm posterior, 3 mm all other directions), and the rotation correction limits (roll ±2°, pitch ±5°, and yaw ±3°) of CyberKnife to quantify beam delivery accuracy. Results: Significant incidents of treatment start delay and beam delivery interruption were observed, mostly related to large pitch rotations of ≥±5°. Optimal setup time of 5–15 minutes was recorded in 61% of the fractions, and optimal beam delivery time of 30–40 minutes in 67% of the fractions. At a default imaging interval of 15 seconds, the percentage of prostate motion beyond the PTV margin of 3–5 mm varied among patients, with a mean of 12.8% (range 0.0%–31.1%); the percentage beyond the PTV margin of 2–3 mm had a mean of 36.0% (range 3.3%–83.1%). These timely detected offsets were all corrected in real time by the robotic manipulator or by operator intervention at the time of treatment interruptions. Conclusion: The durations of patient setup and beam delivery were directly affected by the occurrence of large prostate motion. Frequent imaging at intervals down to 15 seconds is necessary for certain patients. Techniques for reducing prostate motion, such as using an endorectal balloon, can be considered to assure consistently higher accuracy and efficiency of prostate SBRT delivery.
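
    The margin-exceedance percentages above amount to checking each logged 3D offset against an asymmetric margin. A minimal sketch, with invented offsets and an assumed sign convention (positive z = anterior):

        import numpy as np

        # Columns: left-right, superior-inferior, anterior-posterior (mm).
        offsets = np.array([[1.2, -0.8, 2.0],
                            [4.6, 0.3, -3.4],
                            [0.1, 5.2, -1.0]])   # illustrative registrations

        def beyond_margin(xyz, lat=5.0, si=5.0, ant=5.0, post=3.0):
            """True if any component exceeds the asymmetric PTV margin
            (3 mm posterior, 5 mm in all other directions)."""
            x, y, z = xyz
            return abs(x) > lat or abs(y) > si or z > ant or z < -post

        frac = np.mean([beyond_margin(row) for row in offsets])
        print(f"{frac:.1%} of registrations beyond the 3-5 mm margin")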

  12. User-composable Electronic Health Record Improves Efficiency of Clinician Data Viewing for Patient Case Appraisal: A Mixed-Methods Study.

    PubMed

    Senathirajah, Yalini; Kaufman, David; Bakken, Suzanne

    2016-01-01

    Challenges in the design of electronic health records (EHRs) include designing usable systems that must meet the complex, rapidly changing, and high-stakes information needs of clinicians. The ability to move and assemble elements together on the same page has significant human-computer interaction (HCI) and efficiency advantages, and can mitigate the problems of negotiating multiple fixed screens and the associated cognitive burdens. We compare MedWISE, a novel EHR that supports user-composable displays, with a conventional EHR in terms of the number of repeat views of data elements during patient case appraisal. The study used mixed methods to examine clinical data viewing in four patient cases, comparing an experimental user-composable EHR with a conventional EHR for case appraisal. Eleven clinicians used the user-composable EHR in a case appraisal task in the laboratory setting. This was compared with log file analysis of the same patient cases in the conventional EHR. We investigated the number of repeat views of the same clinical information during a session and across these two contexts, and compared them using Fisher's exact test. There was a significant difference (p<.0001) in the proportion of cases with repeat data element viewing between the user-composable EHR (14.6 percent) and the conventional EHR (72.6 percent). Users of conventional EHRs repeatedly viewed the same information elements in the same session, as revealed by log files. Our findings are consistent with the hypothesis that conventional systems require the user to view many screens and remember information between screens, causing the user to forget information and to have to access it a second time. Other mechanisms (such as reduction in navigation over a population of users due to interface sharing, and information selection) may also contribute to increased efficiency in the experimental system. Systems that allow a composable approach, enabling the user to gather any desired information elements together on the same screen, may confer cognitive support benefits that increase productive use of systems by reducing fragmented information. By reducing cognitive overload, this can also enhance the user experience.
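
    The repeat-view comparison reduces to a 2x2 table tested with Fisher's exact test. The counts below are hypothetical stand-ins chosen only to reproduce the reported proportions, not the study's raw data:

        from scipy.stats import fisher_exact

        # Rows: user-composable EHR, conventional EHR.
        # Columns: cases with repeat data-element views, cases without.
        table = [[6, 35],    # 6/41  = 14.6% (illustrative counts)
                 [45, 17]]   # 45/62 = 72.6%

        odds_ratio, p_value = fisher_exact(table)
        print(f"OR = {odds_ratio:.3f}, p = {p_value:.2e}")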

  13. AnalyzeHOLE - An Integrated Wellbore Flow Analysis Tool

    USGS Publications Warehouse

    Halford, Keith

    2009-01-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically displaying pertinent results.

  14. Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.

    PubMed

    Huth, Andreas; Drechsler, Martin; Köhler, Peter

    2004-07-01

    Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ regarding the logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of logged forest; the ecological state of the logged forests can only be improved by reducing yields and enlarging the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.

  15. Mechanical reduction of the intracanal Enterococcus faecalis population by Hyflex CM, K3XF, ProTaper Next, and two manual instrument systems: an in vitro comparative study.

    PubMed

    Tewari, Rajendra K; Ali, Sajid; Mishra, Surendra K; Kumar, Ashok; Andrabi, Syed Mukhtar-Un-Nisar; Zoya, Asma; Alam, Sharique

    2016-05-01

    In the present study, the effectiveness of three rotary and two manual nickel-titanium instrument systems on mechanical reduction of the intracanal Enterococcus faecalis population was evaluated. Mandibular premolars with straight roots were selected. Teeth were decoronated, instrumented up to a size 20 K-file, and irrigated with physiological saline. After sterilization by ethylene oxide gas, root canals were inoculated with Enterococcus faecalis. The specimens were randomly divided into five groups for canal instrumentation: manual Nitiflex and Hero Shaper nickel-titanium files, and rotary Hyflex CM, ProTaper Next, and K3XF nickel-titanium files. Intracanal bacterial sampling was done before and after instrumentation. After serial dilution, samples were plated onto Mitis Salivarius agar. The c.f.u. grown were counted, and the log10 transformation was calculated. All instrumentation systems significantly reduced the intracanal bacterial population after root canal preparation. ProTaper Next was found to be significantly more effective than Hyflex CM and the manual Nitiflex and Hero Shaper files. However, ProTaper Next showed no significant difference with K3XF. Canal instrumentation by all the file systems significantly reduced the intracanal Enterococcus faecalis counts. ProTaper Next was more effective in reducing the number of bacteria than the other rotary or hand instruments. © 2014 Wiley Publishing Asia Pty Ltd.
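
    The quantitative endpoint here is the difference of log10-transformed colony counts before and after instrumentation; a two-line sketch with invented plate counts:

        import math

        cfu_before, cfu_after = 2.4e6, 1.1e3           # hypothetical counts
        reduction = math.log10(cfu_before) - math.log10(cfu_after)
        print(f"{reduction:.2f} log10 reduction")      # ~3.34 logs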

  16. Modeling and evaluating user behavior in exploratory visual analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
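
    The Markov-chain formulation can be made concrete by estimating a transition matrix from a coded sequence of analyst states; the state labels and session below are invented for illustration.

        import numpy as np

        states = ["mental", "interaction", "computational"]
        idx = {s: i for i, s in enumerate(states)}

        # A coded session, e.g. deduced from transcripts and log files.
        session = ["mental", "interaction", "computational", "mental",
                   "interaction", "interaction", "computational", "mental"]

        # Count observed transitions, then row-normalize to probabilities.
        counts = np.zeros((len(states), len(states)))
        for a, b in zip(session, session[1:]):
            counts[idx[a], idx[b]] += 1
        P = counts / counts.sum(axis=1, keepdims=True)
        print(np.round(P, 2))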

  17. Financial and Economic Analysis of Reduced Impact Logging

    Treesearch

    Tom Holmes

    2016-01-01

    Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...

  18. An Analysis of the Differences among Log Scaling Methods and Actual Log Volume

    Treesearch

    R. Edward Thomas; Neal D. Bennett

    2017-01-01

    Log rules estimate the volume of green lumber that can be expected to result from the sawing of a log. As such, this ability to reliably predict lumber recovery forms the foundation of log sales and purchase. The more efficient a sawmill, the less the scaling methods reflect the actual volume recovery and the greater the overrun factor. Using high-resolution scanned...
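
    To illustrate how a scaling estimate diverges from actual recovery, the sketch below uses the Doyle rule (one common log rule; the abstract does not name the rules it compares) and computes overrun from an invented mill tally.

        def doyle_bf(diam_in, length_ft):
            """Doyle log rule: board feet = ((D - 4)^2 * L) / 16,
            with D the small-end diameter (inches), L the length (feet)."""
            return (diam_in - 4) ** 2 * length_ft / 16.0

        scaled = doyle_bf(diam_in=14, length_ft=16)   # 100 bf scaled volume
        actual = 145.0                                # hypothetical mill tally
        overrun = (actual - scaled) / scaled
        print(f"scaled {scaled:.0f} bf, overrun {overrun:.0%}")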

  19. Creative Analytics of Mission Ops Event Messages

    NASA Technical Reports Server (NTRS)

    Smith, Dan

    2017-01-01

    Historically, tremendous effort has been put into processing and displaying mission health and safety telemetry data, and relatively little attention has been paid to extracting information from missions' time-tagged event log messages. Today's missions may log tens of thousands of messages per day, and the numbers are expected to increase dramatically as satellite fleets and constellations are launched, as security monitoring continues to evolve, and as the overall complexity of ground system operations increases. The logs may contain information about orbital events, scheduled and actual observations, device status and anomalies, when operators were logged on, when commands were resent, when there were data drop-outs or system failures, and much more. When dealing with distributed space missions or operational fleets, it becomes even more important to systematically analyze this data. Several advanced information systems technologies make it appropriate to now develop analytic capabilities which can increase mission situational awareness, reduce mission risk, enable better event-driven automation and cross-mission collaborations, and lead to improved operations strategies: Industry Standard for Log Messages. The Object Management Group (OMG) Space Domain Task Force (SDTF) standards organization is in the process of creating a formal industry standard for event log messages. The format is based on work at NASA GSFC. Open System Architectures. The DoD, NASA, and others are moving towards common open system architectures for mission ground data systems based on work at NASA GSFC, with the full support of the commercial product industry and major integration contractors. Text Analytics. A specific area of data analytics which applies statistical, linguistic, and structural techniques to extract and classify information from textual sources. This presentation describes work now underway at NASA to increase situational awareness through the collection of non-telemetry mission operations information into a common log format, along with display and analytics tools that provide in-depth assessment of the log contents. The work includes: common interface formats for acquiring time-tagged text messages; conversion of common files for schedules, orbital events, and stored commands to the common log format; innovative displays to depict thousands of messages on a single display; structured English text queries against the log message data store, extensible to a more mature natural language query capability; and the goal of speech-to-text and text-to-speech additions to create a personal mission operations assistant to aid on-console operations. A wide variety of planned uses identified by the mission operations teams will be discussed.
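
    A first analytic pass over time-tagged event messages of the kind described is easy to sketch: parse timestamp, source, and text, then aggregate. The line format below is a generic placeholder, not the OMG/GSFC standard.

        import re
        from collections import Counter

        # Placeholder format: "2017-01-01T12:34:56Z SOURCE SEVERITY text..."
        MSG = re.compile(r"(?P<ts>\S+)\s+(?P<src>\S+)\s+(?P<sev>\S+)\s+(?P<text>.*)")

        counts = Counter()
        with open("ops_events.log") as f:       # hypothetical message log
            for line in f:
                m = MSG.match(line)
                if m:
                    counts[(m["src"], m["sev"])] += 1

        # Situational-awareness style summary: busiest source/severity pairs.
        for (src, sev), n in counts.most_common(5):
            print(f"{src:12s} {sev:8s} {n}")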

  20. The association in a two-way contingency table through log odds ratio analysis: the case of Sarno river pollution.

    PubMed

    Camminatiello, Ida; D'Ambra, Antonello; Sarnacchiaro, Pasquale

    2014-01-01

    In this paper we propose a general framework for the analysis of the complete set of log Odds Ratios (ORs) generated by a two-way contingency table. Starting from the RC(M) association model and hypothesizing a Poisson distribution for the counts of the two-way contingency table, we obtain the weighted Log Ratio Analysis, which we extend to the study of log ORs. In particular, we obtain an indirect representation of the log ORs and some synthesis measures. To study the matrix of log ORs, we perform a generalized Singular Value Decomposition that allows us to obtain a direct representation of the log ORs, together with summary measures of association. We consider the matrix of the complete set of ORs because it is linked to the two-way contingency table in terms of variance and allows all the ORs to be represented on a factorial plane. Finally, a two-way contingency table that crosses pollution of the Sarno river with sampling points is analyzed to illustrate the proposed framework.
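
    The core construction, forming log odds ratios from a two-way table and decomposing the resulting matrix, can be sketched as follows. This uses the local (adjacent row and column) log ORs and a plain SVD; the paper's direct representation rests on a weighted, generalized SVD, and the counts here are invented.

        import numpy as np

        N = np.array([[25, 10, 5],
                      [12, 30, 8],
                      [6, 9, 20]], dtype=float)   # illustrative counts

        # Local log odds ratios for adjacent row/column pairs:
        # log( n[i,j] * n[i+1,j+1] / (n[i,j+1] * n[i+1,j]) )
        L = np.log(N[:-1, :-1] * N[1:, 1:] / (N[:-1, 1:] * N[1:, :-1]))

        # Unweighted SVD of the log-OR matrix for a factorial representation.
        U, s, Vt = np.linalg.svd(L)
        print("log ORs:\n", np.round(L, 3))
        print("singular values:", np.round(s, 3))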

  1. Assessment of feasibility of running RSNA's MIRC on a Raspberry Pi: a cost-effective solution for teaching files in radiology.

    PubMed

    Pereira, Andre; Atri, Mostafa; Rogalla, Patrik; Huynh, Thien; O'Malley, Martin E

    2015-11-01

    The value of a teaching case repository in radiology training programs is immense. The allocation of resources for putting one together is a complex issue, given the factors that have to be coordinated: hardware, software, infrastructure, administration, and ethics. Costs may be significant, and cost-effective solutions are desirable. We chose Medical Imaging Resource Center (MIRC) to build our teaching file; it is offered by RSNA for free. For the hardware, we chose the Raspberry Pi, developed by the Raspberry Pi Foundation: a small control board developed as a low-cost computer for schools, also used in alternative projects such as robotics and environmental data collection. Its performance and reliability as a file server were unknown to us. For the operating system, we chose Raspbian, a variant of Debian Linux, along with Apache (web server), MySQL (database server), and PHP, which enhance the functionality of the server. A USB hub and an external hard drive completed the setup. Installation of software was smooth. The Raspberry Pi handled very well the task of hosting the teaching file repository for our division. Uptime was logged at 100%, and loading times were similar to other MIRC sites available online. We set up two servers (one for backup), each costing just below $200.00 including external storage and USB hub. It is feasible to run RSNA's MIRC off a low-cost control board (Raspberry Pi). Performance and reliability are comparable to full-size servers for the intended purpose of hosting a teaching file within an intranet environment.

  2. Archive of digital Boomer seismic reflection data collected during USGS Cruises 94CCT01 and 95CCT01, eastern Texas and western Louisiana, 1994 and 1995

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Morton, Robert A.; Wiese, Dana S.

    2004-01-01

    In June of 1994 and August and September of 1995, the U.S. Geological Survey, in cooperation with the University of Texas Bureau of Economic Geology, conducted geophysical surveys of the Sabine and Calcasieu Lake areas and the Gulf of Mexico offshore eastern Texas and western Louisiana. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  3. Archive of digital Boomer and Chirp seismic reflection data collected during USGS Cruises 01RCE05 and 02RCE01 in the Lower Atchafalaya River, Mississippi River Delta, and offshore southeastern Louisiana, October 23-30, 2001, and August 18-19, 2002

    USGS Publications Warehouse

    Calderon, Karynna; Dadisman, Shawn V.; Kindinger, Jack G.; Flocks, James G.; Ferina, Nicholas F.; Wiese, Dana S.

    2004-01-01

    In October of 2001 and August of 2002, the U.S. Geological Survey conducted geophysical surveys of the Lower Atchafalaya River, the Mississippi River Delta, Barataria Bay, and the Gulf of Mexico south of East Timbalier Island, Louisiana. This report serves as an archive of unprocessed digital marine seismic reflection data, trackline maps, navigation files, observers' logbooks, GIS information, and formal FGDC metadata. In addition, a filtered and gained GIF image of each seismic profile is provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Examples of SU processing scripts and in-house (USGS) software for viewing SEG-Y files (Zihlman, 1992) are also provided. Processed profile images, trackline maps, navigation files, and formal metadata may be viewed with a web browser. Scanned handwritten logbooks and Field Activity Collection System (FACS) logs may be viewed with Adobe Reader.

  4. Petrophysical analysis of geophysical logs of the National Drilling Company-U.S. Geological Survey ground-water research project for Abu Dhabi Emirate, United Arab Emirates

    USGS Publications Warehouse

    Jorgensen, Donald G.; Petricola, Mario

    1994-01-01

    A program of borehole-geophysical logging was implemented to supply geologic and geohydrologic information for a regional ground-water investigation of Abu Dhabi Emirate. Analysis of geophysical logs was essential to provide information on geohydrologic properties because drill cuttings were not always adequate to define lithologic boundaries. The standard suite of logs obtained at most project test holes consisted of caliper, spontaneous potential, gamma ray, dual induction, microresistivity, compensated neutron, compensated density, and compensated sonic. Ophiolitic detritus from the nearby Oman Mountains has unusual petrophysical properties that complicated the interpretation of geophysical logs. The density of coarse ophiolitic detritus is typically greater than 3.0 grams per cubic centimeter, porosity values are large, often exceeding 45 percent, and the clay fraction included unusual clays, such as lizardite. Neither the spontaneous-potential log nor the natural gamma-ray log was a usable clay indicator. Because intrinsic permeability is a function of clay content, additional research in determining clay content was critical. A research program of geophysical logging was conducted to determine the petrophysical properties of the shallow subsurface formations. The logging included spectral-gamma and thermal-decay-time logs. These logs, along with the standard geophysical logs, were correlated to mineralogy and whole-rock chemistry as determined from sidewall cores. Thus, interpretation of lithology and fluids was accomplished. Permeability and specific yield were calculated from geophysical-log data and correlated to results from an aquifer test. On the basis of results from the research logging, a method of lithologic and water-resistivity interpretation was developed for the test holes at which the standard suite of logs was obtained. In addition, a computer program was developed to assist in the analysis of log data. Geohydrologic properties were estimated, including volume of clay matrix, volume of matrix other than clay, density of matrix other than clay, density of matrix, intrinsic permeability, specific yield, and specific storage. Geophysical logs were used to (1) determine lithology, (2) correlate lithologic and permeable zones, (3) calibrate seismic reprocessing, (4) calibrate transient-electromagnetic surveys, and (5) calibrate uphole-survey interpretations. Logs were used at the drill site to (1) determine permeability zones, (2) determine dissolved-solids content, which is a function of water resistivity, and (3) design wells accordingly. Data and properties derived from logs were used to determine transmissivity and specific yield of aquifer materials.
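
    One standard calculation behind such water-resistivity work is Archie's equation; the constants a, m, and n below are conventional defaults, and the inputs are invented, since the report's calibrations are site-specific.

        def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
            """Archie water saturation: Sw = ((a*Rw) / (phi^m * Rt))^(1/n)."""
            return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

        # Hypothetical inputs: Rw = 0.5 ohm-m, Rt = 20 ohm-m, porosity 25%.
        print(f"Sw = {archie_sw(rw=0.5, rt=20.0, phi=0.25):.2f}")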

  5. 75 FR 24718 - Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...

  6. Keystroke Analysis: Reflections on Procedures and Measures

    ERIC Educational Resources Information Center

    Baaijen, Veerle M.; Galbraith, David; de Glopper, Kees

    2012-01-01

    Although keystroke logging promises to provide a valuable tool for writing research, it can often be difficult to relate logs to underlying processes. This article describes the procedures and measures that the authors developed to analyze a sample of 80 keystroke logs, with a view to achieving a better alignment between keystroke-logging measures…

  7. Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis

    DTIC Science & Technology

    2017-04-12

    LOGGING STREAM The goal of this report is to investigate logging of EW simulations not at the level of implementation in a database management... differences of the logging stream and relational models. A hierarchical navigation query style appears very natural for our application. Yet the
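
    The contrast drawn between a flat logging stream and hierarchical navigation maps naturally onto HDF5 groups with extendable datasets; the h5py sketch below is a generic illustration, not the report's schema.

        import h5py
        import numpy as np

        with h5py.File("ew_log.h5", "w") as f:
            # One group per emitter gives the hierarchical navigation style.
            grp = f.create_group("simulation/emitter_01")
            ds = grp.create_dataset("pulse_times", shape=(0,),
                                    maxshape=(None,), dtype="f8", chunks=True)

            # Append one batch of records from the logging stream.
            batch = np.array([0.010, 0.021, 0.035])
            old = ds.shape[0]
            ds.resize(old + batch.size, axis=0)
            ds[old:] = batch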

  8. Selective logging in the Brazilian Amazon.

    Treesearch

    G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva

    2005-01-01

    Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...

  9. Technoeconomic analysis of conventional logging systems operating from stump to landing

    Treesearch

    Raymond L. Sarles; William G. Luppold; William G. Luppold

    1986-01-01

    Analyzes technical and economic factors for six conventional logging systems suitable for operation in eastern forests. Discusses financial risks and business implications for loggers investing in high-production, state-of-the-art logging systems. Provides logging contractors with information useful as a preliminary guide for selection of equipment and systems....

  10. VizieR Online Data Catalog: Nearby B-type stars abundances (Morel+, 2008)

    NASA Astrophysics Data System (ADS)

    Morel, T.; Butler, K.

    2008-06-01

    This Table gives the adopted loggf values, EW measurements (in mA) and line-by-line abundances (on the scale in which log[epsilon(H)]=12). A blank indicates that the EW was not reliably measurable, the line was considered blended for the relevant temperature range or yielded a discrepant abundance. The accuracy of the EW measurements is discussed in Sect.3 of the paper. The wing of HeI 4387.9 was taken as pseudo continuum in the case of NeII 4391.99. (2 data files).

  11. VizieR Online Data Catalog: Hubble Tarantula Treasury Project (HTTP). III. (Sabbi+, 2016)

    NASA Astrophysics Data System (ADS)

    Sabbi, E.; Lennon, D. J.; Anderson, J.; Cignoni, M.; van der Marel, R. P.; Zaritsky, D.; de Marchi, G.; Panagia, N.; Gouliermis, D. A.; Grebel, E. K.; Gallagher, J. S., III; Smith, L. J.; Sana, H.; Aloisi, A.; Tosi, M.; Evans, C. J.; Arab, H.; Boyer, M.; de Mink, S. E.; Gordon, K.; Koekemoer, A. M.; Larsen, S. S.; Ryon, J. E.; Zeidler, P.

    2016-02-01

    Hubble Tarantula Treasury Project (HTTP; HST 12939, PI Elena Sabbi + HST 12499, PI Danny Lennon) was awarded 60 orbits of HST time in cycle 20 to survey the entire Tarantula Nebula (30 Doradus), using both the UVIS and the IR channels of the Wide Field Camera 3 (WFC3), and, in parallel, the Wide Field Channel (WFC) of the Advanced Camera for Surveys (ACS). See log of the observations (from 2011 Oct 03 to 2013 Sep 17) in table 1. (2 data files).

  12. N2C2M2 Experimentation and Validation: Understanding Its C2 Approaches and Implications

    DTIC Science & Technology

    2010-06-01

    [Table fragments in source: C2 approach levels (Conflicted, Deconflicted, Coordinated, Collaborative) crossed with log-file interaction counts (shares and posts) per factoid set and trial, with percentage scales.]

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Ayan, A; Woollard, J

    Purpose: To automate the daily verification of each patient’s treatment by utilizing the trajectory log files (TLs) written by the Varian TrueBeam linear accelerator, while reducing the number of false positives, including jaw and gantry positioning errors, that are displayed in the Treatment History tab of Varian’s Chart QA module. Methods: Small deviations in treatment parameters are difficult to detect in weekly chart checks, but may be significant in reducing delivery errors, and would be critical if detected daily. Software was developed in-house to read TLs. Multiple functions were implemented within the software that allow it to operate via a GUI to analyze TLs, or as a script to run on a regular basis. In order to determine tolerance levels for the scripted analysis, 15,241 TLs from seven TrueBeams were analyzed. The maximum error of each axis for each TL was written to a CSV file and statistically analyzed to determine the tolerance for each axis accessible in the TLs to flag for manual review. The software/scripts developed were tested by varying the tolerance values to ensure veracity. After tolerances were determined, multiple weeks of manual chart checks were performed simultaneously with the automated analysis to ensure validity. Results: The tolerance values for the major axes were determined to be 0.025 degrees for the collimator, 1.0 degree for the gantry, 0.002 cm for the y-jaws, 0.01 cm for the x-jaws, and 0.5 MU for the MU. The automated verification of treatment parameters has been in clinical use for 4 months. During that time, no errors in machine delivery of the patient treatments were found. Conclusion: The process detailed here is a viable and effective alternative to manually checking treatment parameters during weekly chart checks.
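
    The scripted check reduces to comparing each axis's maximum deviation in a trajectory log against the tolerances quoted above. Parsing of the binary log format is omitted, and the deviations dict is a placeholder:

        # Tolerances from the abstract (per-axis flags for manual review).
        TOLERANCES = {
            "collimator_deg": 0.025,
            "gantry_deg": 1.0,
            "jaw_y_cm": 0.002,
            "jaw_x_cm": 0.01,
            "mu": 0.5,
        }

        def flag_axes(max_deviations):
            """Return axes whose max |expected - actual| exceeds tolerance."""
            return [axis for axis, dev in max_deviations.items()
                    if abs(dev) > TOLERANCES[axis]]

        # Placeholder deviations as if read from one trajectory log file.
        devs = {"collimator_deg": 0.01, "gantry_deg": 1.3,
                "jaw_y_cm": 0.001, "jaw_x_cm": 0.004, "mu": 0.2}
        print(flag_axes(devs))   # -> ['gantry_deg']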

  14. The costs of heparin-induced thrombocytopenia: a patient-based cost of illness analysis.

    PubMed

    Wilke, T; Tesch, S; Scholz, A; Kohlmann, T; Greinacher, A

    2009-05-01

    BACKGROUND AND OBJECTIVES: Due to the complexity of heparin-induced thrombocytopenia (HIT), currently available cost analyses are rough estimates. The objectives of this study were quantification of the costs involved in HIT and identification of the main cost drivers, based on a patient-oriented approach. Patients diagnosed with HIT (1995-2004, University Hospital Greifswald, Germany) based on a positive functional assay (HIPA test) were retrieved from the laboratory records and scored (4T-score) by two medical experts using the patient file. For the cost of illness analysis, predefined HIT-relevant cost parameters (medication costs, prolonged in-hospital stay, diagnostic and therapeutic interventions, laboratory tests, blood transfusions) were retrieved from the patient files. The data were analysed by linear regression estimates with the log of costs and a gamma regression model. Mean length of stay data of non-HIT patients were obtained from the German Federal Statistical Office, adjusted for patient characteristics, comorbidities, and year of treatment. Hospital costs were provided by the controlling department. One hundred and thirty HIT cases with a 4T-score ≥4 and a positive HIPA test were analyzed. Mean additional costs of a HIT case were 9008 euro. The main cost drivers were prolonged in-hospital stay (70.3%) and the costs of alternative anticoagulants (19.7%). HIT was more costly in surgical patients compared with medical patients and in patients with thrombosis. Early start of alternative anticoagulation did not increase HIT costs despite the high medication costs, indicating prevention of costly complications. An HIT cost calculator is provided, allowing online calculation of HIT costs based on local cost structures and different currencies.
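
    The two cost models mentioned, a linear regression on log costs and a gamma regression, can be sketched with statsmodels; the covariates and cost data below are synthetic placeholders, not the study's records.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 130
        surgical = rng.integers(0, 2, n)              # 1 = surgical patient
        thrombosis = rng.integers(0, 2, n)            # 1 = thrombosis present
        mu = np.exp(8.5 + 0.4 * surgical + 0.3 * thrombosis)
        costs = rng.gamma(shape=2.0, scale=mu / 2.0)  # skewed cost data

        X = sm.add_constant(np.column_stack([surgical, thrombosis]))

        # (1) OLS on log-transformed costs; (2) gamma GLM with a log link.
        ols = sm.OLS(np.log(costs), X).fit()
        glm = sm.GLM(costs, X, family=sm.families.Gamma(
            sm.families.links.Log())).fit()
        print(ols.params, glm.params, sep="\n")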

  15. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS

    PubMed Central

    Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2016-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions. PMID:27896971

  16. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    PubMed

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  17. SPR online: creating, maintaining, and distributing a virtual professional society on the Internet.

    PubMed

    D'Alessandro, M P; Galvin, J R

    1998-01-01

    SPR Online (http://www.pedrad.org) is a recently developed digital representation of the Society for Pediatric Radiology (SPR) that enables physicians to access pertinent information and services on the Internet. SPR Online was organized on the basis of the five main services of the SPR, which include Administration, Patient Care, Education, Research, and Meetings. For each service, related content from the SPR was digitized and placed onto SPR Online. Usage over a 12-month period was evaluated with server log file analysis. A total of 3,209 users accessed SPR Online, viewing 11,246 pages of information. A wide variety of information was accessed, with that from the Education, Administration, and Meetings services being the most popular. Fifteen percent of users came from foreign countries. As a virtual professional society, SPR Online greatly enhances the power and scope of the SPR and has proved to be a popular resource, meeting the diverse information needs of an international community of pediatric radiologists.

  18. Activity Catalog Tool (ACT) user manual, version 2.0

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.; Andre, Anthony D.

    1994-01-01

    This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.

  19. A Comparison of Reasoning Processes in a Collaborative Modelling Environment: Learning about genetics problems using virtual chat

    NASA Astrophysics Data System (ADS)

    Pata, Kai; Sarapuu, Tago

    2006-09-01

    This study investigated the possible activation of different types of model-based reasoning processes in two learning settings, and the influence of various types of reasoning on the learners’ problem representation development. Changes in 53 students’ problem representations about a genetics issue were analysed while they worked with different modelling tools in a synchronous network-based environment. The discussion log-files were used for the “microgenetic” analysis of reasoning types. For studying the stages of students’ problem representation development, individual pre-essays and post-essays and their utterances during two reasoning phases were used. An approach for mapping problem representations was developed. Characterizing the elements of mental models and their reasoning level enabled the description of five hierarchical categories of problem representations. Learning in exploratory and experimental settings was registered as the shift towards more complex stages of problem representations in genetics. The effect of different types of reasoning could be observed as the divergent development of problem representations within hierarchical categories.

  20. Use of Web-based library resources by medical students in community and ambulatory settings*

    PubMed Central

    Tannery, Nancy Hrinya; Foust, Jill E.; Gregg, Amy L.; Hartman, Linda M.; Kuller, Alice B.; Worona, Paul; Tulsky, Asher A.

    2002-01-01

    Purpose: The purpose was to evaluate the use of Web-based library resources by third-year medical students. Setting/Participants/Resources: Third-year medical students (147) in a twelve-week multidisciplinary primary care rotation in community and ambulatory settings. Methodology: Individual user surveys and log file analysis of Website were used. Results/Outcomes: Twenty resource topics were compiled into a Website to provide students with access to electronic library resources from any community-based clerkship location. These resource topics, covering subjects such as hypertension and back pain, linked to curriculum training problems, full-text journal articles, MEDLINE searches, electronic book chapters, and relevant Websites. More than half of the students (69%) accessed the Website on a daily or weekly basis. Over 80% thought the Website was a valuable addition to their clerkship. Discussion/Conclusion: Web-based information resources can provide curriculum support to students for whom access to the library is difficult and time consuming. PMID:12113515
