Development of Cross-Platform Software for Well Logging Data Visualization
NASA Astrophysics Data System (ADS)
Akhmadulin, R. K.; Miraev, A. I.
2017-07-01
Well logging data processing is one of the main sources of information in the oil-gas field analysis and is of great importance in the process of its development and operation. Therefore, it is important to have the software which would accurately and clearly provide the user with processed data in the form of well logs. In this work, there have been developed a software product which not only has the basic functionality for this task (loading data from .las files, well log curves display, etc.), but can be run in different operating systems and on different devices. In the article a subject field analysis and task formulation have been performed, and the software design stage has been considered. At the end of the work the resulting software product interface has been described.
Computer planning tools applied to a cable logging research study
Chris B. LeDoux; Penn A. Peters
1985-01-01
Contemporary harvest planning software was used in planning the layout of cable logging units for a production study of the Clearwater Yarder in upstate New York. Planning software, including payload analysis and digital terrain models, allowed researchers to identify layout and yarding problems before the experiment. Analysis of proposed ground profiles pinpointed the...
Who Goes There? Measuring Library Web Site Usage.
ERIC Educational Resources Information Center
Bauer, Kathleen
2000-01-01
Discusses how libraries can gather data on the use of their Web sites. Highlights include Web server log files, including the common log file, referrer log file, and agent log file; log file limitations; privacy concerns; and choosing log analysis software, both free and commercial. (LRW)
ERIC Educational Resources Information Center
Avouris, N.; Fiotakis, G.; Kahrimanis, G.; Margaritis, M.; Komis, V.
2007-01-01
In this article, we discuss key requirements for collecting behavioural data concerning technology-supported collaborative learning activities. It is argued that the common practice of analysis of computer generated log files of user interactions with software tools is not enough for building a thorough view of the activity. Instead, more…
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex
2012-01-01
LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL s Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
Using recurrence plot analysis for software execution interpretation and fault detection
NASA Astrophysics Data System (ADS)
Mosdorf, M.
2015-09-01
This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In in the proposed approach recurrence plot analysis is applied to software execution trace that contains executed assembly instructions. Results of this analysis are subject to further processing with PCA (Principal Component Analysis) method that simplifies number coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
Modeling strategic use of human computer interfaces with novel hidden Markov models
Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.
2015-01-01
Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. PMID:26191026
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
Using Malware Analysis to Tailor SQUARE for Mobile Platforms
2014-11-01
identification data (SIM card and International Mobile Station Equipment Identity Number [IMEI]) to duplicate the phone in another device so that it can...applications. Key logging software can be used to steal passwords for financial websites and credit card information [Sophos 2014]. Data theft...for consumption. Apple provides a limited set of APIs and provides the iTunes store as the only ave- nue to install new software. All software
1988-09-01
software programs capable of being used on a microcomputer will be considered for analysis. No software intended for use on a miniframe or mainframe...Dial-A-Log consists of a program written in a computer language called L-10 that is run on a DEC-20 miniframe . The combination of the specific...proliferation of software dealing with microcomputers. Instead, they were geared more towards managing the use of miniframe or mainframe computer
SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerns, J; Yaldo, D
Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the timemore » of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.« less
Statistical Analysis of an Infrared Thermography Inspection of Reinforced Carbon-Carbon
NASA Technical Reports Server (NTRS)
Comeaux, Kayla
2011-01-01
Each piece of flight hardware being used on the shuttle must be analyzed and pass NASA requirements before the shuttle is ready for launch. One tool used to detect cracks that lie within flight hardware is Infrared Flash Thermography. This is a non-destructive testing technique which uses an intense flash of light to heat up the surface of a material after which an Infrared camera is used to record the cooling of the material. Since cracks within the material obstruct the natural heat flow through the material, they are visible when viewing the data from the Infrared camera. We used Ecotherm, a software program, to collect data pertaining to the delaminations and analyzed the data using Ecotherm and University of Dayton Log Logistic Probability of Detection (POD) Software. The goal was to reproduce the statistical analysis produced by the University of Dayton software, by using scatter plots, log transforms, and residuals to test the assumption of normality for the residuals.
Paleomagnetic dating: Methods, MATLAB software, example
NASA Astrophysics Data System (ADS)
Hnatyshin, Danny; Kravchinsky, Vadim A.
2014-09-01
A MATLAB software tool has been developed to provide an easy to use graphical interface for the plotting and interpretation of paleomagnetic data. The tool takes either paleomagnetic directions or paleopoles and compares them to a user defined apparent polar wander path or secular variation curve to determine the age of a paleomagnetic sample. Ages can be determined in two ways, either by translating the data onto the reference curve, or by rotating it about a set location (e.g. sampling location). The results are then compiled in data tables which can be exported as an excel file. This data can also be plotted using variety of built-in stereographic projections, which can then be exported as an image file. This software was used to date the giant Sukhoi Log gold deposit in Russia. Sukhoi Log has undergone a complicated history of faulting, folding, metamorphism, and is the vicinity of many granitic bodies. Paleomagnetic analysis of Sukhoi Log allowed for the timing of large scale thermal or chemical events to be determined. Paleomagnetic analysis from gold mineralized black shales was used to define the natural remanent magnetization recorded at Sukhoi Log. The obtained paleomagnetic direction from thermal demagnetization produced a paleopole at 61.3°N, 155.9°E, with the semi-major axis and semi-minor axis of the 95% confidence ellipse being 16.6° and 15.9° respectively. This paleopole is compared to the Siberian apparent polar wander path (APWP) by translating the paleopole to the nearest location on the APWP. This produced an age of 255.2- 31.0+ 32.0Ma and is the youngest well defined age known for Sukhoi Log. We propose that this is the last major stage of activity at Sukhoi Log, and likely had a role in determining the present day state of mineralization seen at the deposit.
Hardwood log defect photographic database, software and user's guide
R. Edward Thomas
2009-01-01
Computer software and user's guide for Hardwood Log Defect Photographic Database. The database contains photographs and information on external hardwood log defects and the corresponding internal characteristics. This database allows users to search for specific defect types, sizes, and locations by tree species. For every defect, the database contains photos of...
Analysis of 3D Modeling Software Usage Patterns for K-12 Students
ERIC Educational Resources Information Center
Wu, Yi-Chieh; Liao, Wen-Hung; Chi, Ming-Te; Li, Tsai-Yen
2016-01-01
In response to the recent trend in maker movement, teachers are learning 3D techniques actively and bringing 3D printing into the classroom to enhance variety and creativity in designing lectures. This study investigates the usage pattern of a 3D modeling software, Qmodel Creator, which is targeted at K-12 students. User logs containing…
Lenehan, Claire E.; Lewis, Simon W.
2002-01-01
LabVIEW®-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 × 10-10 to 5 × 10-6 M) with a line of best fit of y=1.05x+8.9164 (R2 =0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 × 10-8 M). The limit of detection (3σ) was determined as 5 × 10-11 M morphine. PMID:18924729
Lenehan, Claire E; Barnett, Neil W; Lewis, Simon W
2002-01-01
LabVIEW-based software for the automation of a sequential injection analysis instrument for the determination of morphine is presented. Detection was based on its chemiluminescence reaction with acidic potassium permanganate in the presence of sodium polyphosphate. The calibration function approximated linearity (range 5 x 10(-10) to 5 x 10(-6) M) with a line of best fit of y=1.05(x)+8.9164 (R(2) =0.9959), where y is the log10 signal (mV) and x is the log10 morphine concentration (M). Precision, as measured by relative standard deviation, was 0.7% for five replicate analyses of morphine standard (5 x 10(-8) M). The limit of detection (3sigma) was determined as 5 x 10(-11) M morphine.
Mass Storage Performance Information System
NASA Technical Reports Server (NTRS)
Scheuermann, Peter
2000-01-01
The purpose of this task is to develop a data warehouse to enable system administrators and their managers to gather information by querying the data logs of the MDSDS. Currently detailed logs capture the activity of the MDSDS internal to the different systems. The elements to be included in the data warehouse are requirements analysis, data cleansing, database design, database population, hardware/software acquisition, data transformation, query and report generation, and data mining.
Spreadsheet log analysis in subsurface geology
Doveton, J.H.
2000-01-01
Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and thee location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
Capricorn-A Web-Based Automatic Case Log and Volume Analytics for Diagnostic Radiology Residents.
Chen, Po-Hao; Chen, Yin Jie; Cook, Tessa S
2015-10-01
On-service clinical learning is a mainstay of radiology education. However, an accurate and timely case log is difficult to keep, especially in the absence of software tools tailored to resident education. Furthermore, volume-related feedback from the residency program sometimes occurs months after a rotation ends, limiting the opportunity for meaningful intervention. We surveyed the residents of a single academic institution to evaluate the current state of and the existing need for tracking interpretation volume. Using the results of the survey, we created an open-source automated case log software. Finally, we evaluated the effect of the software tool on the residency in a 1-month, postimplementation survey. Before implementation of the system, 89% of respondents stated that volume is an important component of training, but 71% stated that volume data was inconvenient to obtain. Although the residency program provides semiannual reviews, 90% preferred reviewing interpretation volumes at least once monthly. After implementation, 95% of the respondents stated that the software is convenient to access, 75% found it useful, and 88% stated they would use the software at least once a month. The included analytics module, which benchmarks the user using historical aggregate average volumes, is the most often used feature of the software. Server log demonstrates that, on average, residents use the system approximately twice a week. An automated case log software system may fulfill a previously unmet need in diagnostic radiology training, making accurate and timely review of volume-related performance analytics a convenient process. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
SU-E-T-142: Automatic Linac Log File: Analysis and Reporting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gainey, M; Rothe, T
Purpose: End to end QA for IMRT/VMAT is time consuming. Automated linac log file analysis and recalculation of daily recorded fluence, and hence dose, distribution bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight, and are archived on a backed-up network drive (figure). A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to easemore » navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within +/−0.10mm: 57% within +/−0.01mm; 89% within 0.05mm. Mean leaf position deviation is 0.02mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, mean 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between planned and treated dose distribution, the derived from log files, is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.« less
Introducing high performance distributed logging service for ACS
NASA Astrophysics Data System (ADS)
Avarias, Jorge A.; López, Joao S.; Maureira, Cristián; Sommer, Heiko; Chiozzi, Gianluca
2010-07-01
The ALMA Common Software (ACS) is a software framework that provides the infrastructure for the Atacama Large Millimeter Array and other projects. ACS, based on CORBA, offers basic services and common design patterns for distributed software. Every properly built system needs to be able to log status and error information. Logging in a single computer scenario can be as easy as using fprintf statements. However, in a distributed system, it must provide a way to centralize all logging data in a single place without overloading the network nor complicating the applications. ACS provides a complete logging service infrastructure in which every log has an associated priority and timestamp, allowing filtering at different levels of the system (application, service and clients). Currently the ACS logging service uses an implementation of the CORBA Telecom Log Service in a customized way, using only a minimal subset of the features provided by the standard. The most relevant feature used by ACS is the ability to treat the logs as event data that gets distributed over the network in a publisher-subscriber paradigm. For this purpose the CORBA Notification Service, which is resource intensive, is used. On the other hand, the Data Distribution Service (DDS) provides an alternative standard for publisher-subscriber communication for real-time systems, offering better performance and featuring decentralized message processing. The current document describes how the new high performance logging service of ACS has been modeled and developed using DDS, replacing the Telecom Log Service. Benefits and drawbacks are analyzed. A benchmark is presented comparing the differences between the implementations.
Crangle, Robert D.
2007-01-01
Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
Contemporary computerized methods for logging planning
Chris B. LeDoux
1988-01-01
Contemporary harvest planning graphic software is highlighted with practical applications. Planning results from a production study of the Clearwater Cable Yarder are summarized. Application of the planning methods to evaluation of proposed silvicultural treatments is included. Results show that 3-dimensional graphic analysis of proposed harvesting or silvicultural...
2008-10-23
for Public Release, Distribution Unlimited, GDLS approved, log 2008-98, dated 10/13/08 Analysis Analysis from DFSS axiomatic design methods indicate the...solutions AA enables a comprehensive analysis across different force configurations and dynamic situations October 26, 2006 slide 25 Approved for public ...analyzed by Software Engineering Institute. Analysis results reviewed by NDIA SE Effectiveness Committee. Reports 1. Public NDIA/SEI report released
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac
2016-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
NASA Astrophysics Data System (ADS)
2010-09-01
WE RECOMMEND Enjoyable Physics Mechanics book makes learning more fun SEP Colorimeter Box A useful and inexpensive colorimeter for the classroom Pursuing Power and Light Account of the development of science in the 19th centuary SEP Bottle Rocket Launcher An excellent resource for teaching about projectiles GLE Datalogger GPS software is combined with a datalogger EDU Logger Remote datalogger has greater sensing abilities Logotron Insight iLog Studio Software enables datlogging, data analysis and modelling iPhone Apps Mobile phone games aid study of gravity WORTH A LOOK Physics of Sailing Book journeys through the importance of physics in sailing The Lightness of Being Study of what the world is made from LECTURE The 2010 IOP Schools and Colleges Lecture presents the physics of fusion WEB WATCH Planet Scicast pushes boundaries of pupil creativity
NASA Astrophysics Data System (ADS)
2012-07-01
WE RECOMMEND Data logger Fourier NOVA LINK: data logging and analysis To Engineer is Human Engineering: essays and insights Soap, Science, & Flat-Screen TVs People, politics, business and science overlap uLog sensors and sensor adapter A new addition to the LogIT range offers simplicity and ease of use WORTH A LOOK Imagined Worlds Socio-scientific predictions for the future Mini light data logger and mini temperature data logger Small-scale equipment for schools SensorLab Plus LogIT's supporting software, with extra features HANDLE WITH CARE CAXE110P PICAXE-18M2 data logger Data logger 'on view' but disappoints Engineering: A Very Short Introduction A broad-brush treatment fails to satisfy WEB WATCH Two very different websites for students: advanced physics questions answered and a more general BBC science resource
ELOPTA: a novel microcontroller-based operant device.
Hoffman, Adam M; Song, Jianjian; Tuttle, Elaina M
2007-11-01
Operant devices have been used for many years in animal behavior research, yet such devices a regenerally highly specialized and quite expensive. Although commercial models are somewhat adaptable and resilient, they are also extremely expensive and are controlled by difficult to learn proprietary software. As an alternative to commercial devices, we have designed and produced a fully functional, programmable operant device, using a PICmicro microcontroller (Microchip Technology, Inc.). The electronic operant testing apparatus (ELOPTA) is designed to deliver food when a study animal, in this case a bird, successfully depresses the correct sequence of illuminated keys. The device logs each keypress and can detect and log whenever a test animal i spositioned at the device. Data can be easily transferred to a computer and imported into any statistical analysis software. At about 3% the cost of a commercial device, ELOPTA will advance behavioral sciences, including behavioral ecology, animal learning and cognition, and ethology.
An Integrated System for Wildlife Sensing
2014-08-14
design requirement. “Sensor Controller” software. A custom Sensor Controller application was developed for the Android device in order to collect...and log readings from that device’s sensors. “Camera Controller” software. A custom Camera Controller application was developed for the Android device...into 2 separate Android applications (Figure 4). The Sensor Controller logs readings periodically from the Android device’s organic sensors, and
ERIC Educational Resources Information Center
Jones, S.; And Others
1997-01-01
Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We present g-PRIME, a software based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone, windows application coded and "compiled" in Matlab (does not require a Matlab license). g-PRIME supports many data acquisition interfaces from the PC sound card to expensive high throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
An experiment in software reliability
NASA Technical Reports Server (NTRS)
Dunham, J. R.; Pierce, J. L.
1986-01-01
The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.
NASA Technical Reports Server (NTRS)
Liaw, Morris; Evesson, Donna
1988-01-01
This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.
Comparison of Survival Models for Analyzing Prognostic Factors in Gastric Cancer Patients
Habibi, Danial; Rafiei, Mohammad; Chehrei, Ali; Shayan, Zahra; Tafaqodi, Soheil
2018-03-27
Objective: There are a number of models for determining risk factors for survival of patients with gastric cancer. This study was conducted to select the model showing the best fit with available data. Methods: Cox regression and parametric models (Exponential, Weibull, Gompertz, Log normal, Log logistic and Generalized Gamma) were utilized in unadjusted and adjusted forms to detect factors influencing mortality of patients. Comparisons were made with Akaike Information Criterion (AIC) by using STATA 13 and R 3.1.3 softwares. Results: The results of this study indicated that all parametric models outperform the Cox regression model. The Log normal, Log logistic and Generalized Gamma provided the best performance in terms of AIC values (179.2, 179.4 and 181.1, respectively). On unadjusted analysis, the results of the Cox regression and parametric models indicated stage, grade, largest diameter of metastatic nest, largest diameter of LM, number of involved lymph nodes and the largest ratio of metastatic nests to lymph nodes, to be variables influencing the survival of patients with gastric cancer. On adjusted analysis, according to the best model (log normal), grade was found as the significant variable. Conclusion: The results suggested that all parametric models outperform the Cox model. The log normal model provides the best fit and is a good substitute for Cox regression. Creative Commons Attribution License
SedMob: A mobile application for creating sedimentary logs in the field
NASA Astrophysics Data System (ADS)
Wolniewicz, Pawel
2014-05-01
SedMob is an open-source, mobile software package for creating sedimentary logs, targeted for use in tablets and smartphones. The user can create an unlimited number of logs, save data from each bed in the log as well as export and synchronize the data with a remote server. SedMob is designed as a mobile interface to SedLog: a free multiplatform package for drawing graphic logs that runs on PC computers. Data entered into SedMob are saved in the CSV file format, fully compatible with SedLog.
A mathematical model of salmonid spawning habitat
Robert N. Havis; Carlos V. Alonzo; Keith E Woeste; Russell F. Thurow
1993-01-01
A simulation model [Salmonid Spawning Analysis Model (SSAM)I was developed as a management tool to evaluate the relative impacts of stream sediment load and water temperature on salmonid egg survival. The model is usefi.il for estimating acceptable sediment loads to spawning habitat that may result from upland development, such as logging and agriculture. Software in...
Automating linear accelerator quality assurance.
Eckhause, Tobias; Al-Hallaq, Hania; Ritter, Timothy; DeMarco, John; Farrey, Karl; Pawlicki, Todd; Kim, Gwe-Ya; Popple, Richard; Sharma, Vijeshwar; Perez, Mario; Park, SungYong; Booth, Jeremy T; Thorwarth, Ryan; Moran, Jean M
2015-10-01
The purpose of this study was 2-fold. One purpose was to develop an automated, streamlined quality assurance (QA) program for use by multiple centers. The second purpose was to evaluate machine performance over time for multiple centers using linear accelerator (Linac) log files and electronic portal images. The authors sought to evaluate variations in Linac performance to establish as a reference for other centers. The authors developed analytical software tools for a QA program using both log files and electronic portal imaging device (EPID) measurements. The first tool is a general analysis tool which can read and visually represent data in the log file. This tool, which can be used to automatically analyze patient treatment or QA log files, examines the files for Linac deviations which exceed thresholds. The second set of tools consists of a test suite of QA fields, a standard phantom, and software to collect information from the log files on deviations from the expected values. The test suite was designed to focus on the mechanical tests of the Linac to include jaw, MLC, and collimator positions during static, IMRT, and volumetric modulated arc therapy delivery. A consortium of eight institutions delivered the test suite at monthly or weekly intervals on each Linac using a standard phantom. The behavior of various components was analyzed for eight TrueBeam Linacs. For the EPID and trajectory log file analysis, all observed deviations which exceeded established thresholds for Linac behavior resulted in a beam hold off. In the absence of an interlock-triggering event, the maximum observed log file deviations between the expected and actual component positions (such as MLC leaves) varied from less than 1% to 26% of published tolerance thresholds. The maximum and standard deviations of the variations due to gantry sag, collimator angle, jaw position, and MLC positions are presented. Gantry sag among Linacs was 0.336 ± 0.072 mm. The standard deviation in MLC position, as determined by EPID measurements, across the consortium was 0.33 mm for IMRT fields. With respect to the log files, the deviations between expected and actual positions for parameters were small (<0.12 mm) for all Linacs. Considering both log files and EPID measurements, all parameters were well within published tolerance values. Variations in collimator angle, MLC position, and gantry sag were also evaluated for all Linacs. The performance of the TrueBeam Linac model was shown to be consistent based on automated analysis of trajectory log files and EPID images acquired during delivery of a standardized test suite. The results can be compared directly to tolerance thresholds. In addition, sharing of results from standard tests across institutions can facilitate the identification of QA process and Linac changes. These reference values are presented along with the standard deviation for common tests so that the test suite can be used by other centers to evaluate their Linac performance against those in this consortium.
New generation of exploration tools: interactive modeling software and microcomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krajewski, S.A.
1986-08-01
Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly,more » these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.« less
Building analytical platform with Big Data solutions for log files of PanDA infrastructure
NASA Astrophysics Data System (ADS)
Alekseev, A. A.; Barreiro Megino, F. G.; Klimentov, A. A.; Korchuganova, T. A.; Maendo, T.; Padolski, S. V.
2018-05-01
The paper describes the implementation of a high-performance system for the processing and analysis of log files for the PanDA infrastructure of the ATLAS experiment at the Large Hadron Collider (LHC), responsible for the workload management of order of 2M daily jobs across the Worldwide LHC Computing Grid. The solution is based on the ELK technology stack, which includes several components: Filebeat, Logstash, ElasticSearch (ES), and Kibana. Filebeat is used to collect data from logs. Logstash processes data and export to Elasticsearch. ES are responsible for centralized data storage. Accumulated data in ES can be viewed using a special software Kibana. These components were integrated with the PanDA infrastructure and replaced previous log processing systems for increased scalability and usability. The authors will describe all the components and their configuration tuning for the current tasks, the scale of the actual system and give several real-life examples of how this centralized log processing and storage service is used to showcase the advantages for daily operations.
Optimization of analytical laboratory work using computer networking and databasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upp, D.L.; Metcalf, R.A.
1996-06-01
The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, liquids, to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs.more » All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide for online historical results. This system has greatly reduced the work required to provide for analysis results as well as improving the quality of the work performed.« less
Bennett, Erin R; Clausen, Jay; Linkov, Eugene; Linkov, Igor
2009-11-01
Reliable, up-front information on physical and biological properties of emerging materials is essential before making a decision and investment to formulate, synthesize, scale-up, test, and manufacture a new material for use in both military and civilian applications. Multiple quantitative structure-activity relationships (QSARs) software tools are available for predicting a material's physical/chemical properties and environmental effects. Even though information on emerging materials is often limited, QSAR software output is treated without sufficient uncertainty analysis. We hypothesize that uncertainty and variability in material properties and uncertainty in model prediction can be too large to provide meaningful results. To test this hypothesis, we predicted octanol water partitioning coefficients (logP) for multiple, similar compounds with limited physical-chemical properties using six different commercial logP calculators (KOWWIN, MarvinSketch, ACD/Labs, ALogP, CLogP, SPARC). Analysis was done for materials with largely uncertain properties that were similar, based on molecular formula, to military compounds (RDX, BTTN, TNT) and pharmaceuticals (Carbamazepine, Gemfibrizol). We have also compared QSAR modeling results for a well-studied pesticide and pesticide breakdown product (Atrazine, DDE). Our analysis shows variability due to structural variations of the emerging chemicals may be several orders of magnitude. The model uncertainty across six software packages was very high (10 orders of magnitude) for emerging materials while it was low for traditional chemicals (e.g. Atrazine). Thus the use of QSAR models for emerging materials screening requires extensive model validation and coupling QSAR output with available empirical data and other relevant information.
2015-04-01
report is to examine how a computer forensic investigator/incident handler, without specialised computer memory or software reverse engineering skills ...The skills amassed by incident handlers and investigators alike while using Volatility to examine Windows memory images will be of some help...bin/pulseaudio --start --log-target=syslog 1362 1000 1000 nautilus 1366 1000 1000 /usr/lib/pulseaudio/pulse/gconf- helper 1370 1000 1000 nm-applet
Simulation Control Graphical User Interface Logging Report
NASA Technical Reports Server (NTRS)
Hewling, Karl B., Jr.
2012-01-01
One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. Also I was able to assist hardware modelers in verifying the configuration of models after they have been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any error were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
Predicting volumes and numbers of logs by grade from hardwood cruise data
Daniel A. Yaussy; Robert L. Brisbin; Mary J. Humphreys; Mary J. Humphreys
1988-01-01
The equations presented allow the estimation of quality and quantity of logs produced from a hardwood stand based on cruise data. When packaged in appropriate computer software, the information will provide the mill manager with the means to estimate the value of logs that would be added to a mill yard inventory from a timber sale.
Reservoir characterization using core, well log, and seismic data and intelligent software
NASA Astrophysics Data System (ADS)
Soto Becerra, Rodolfo
We have developed intelligent software, Oilfield Intelligence (OI), as an engineering tool to improve the characterization of oil and gas reservoirs. OI integrates neural networks and multivariate statistical analysis. It is composed of five main subsystems: data input, preprocessing, architecture design, graphics design, and inference engine modules. More than 1,200 lines of programming code as M-files using the language MATLAB been written. The degree of success of many oil and gas drilling, completion, and production activities depends upon the accuracy of the models used in a reservoir description. Neural networks have been applied for identification of nonlinear systems in almost all scientific fields of humankind. Solving reservoir characterization problems is no exception. Neural networks have a number of attractive features that can help to extract and recognize underlying patterns, structures, and relationships among data. However, before developing a neural network model, we must solve the problem of dimensionality such as determining dominant and irrelevant variables. We can apply principal components and factor analysis to reduce the dimensionality and help the neural networks formulate more realistic models. We validated OI by obtaining confident models in three different oil field problems: (1) A neural network in-situ stress model using lithology and gamma ray logs for the Travis Peak formation of east Texas, (2) A neural network permeability model using porosity and gamma ray and a neural network pseudo-gamma ray log model using 3D seismic attributes for the reservoir VLE 196 Lamar field located in Block V of south-central Lake Maracaibo (Venezuela), and (3) Neural network primary ultimate oil recovery (PRUR), initial waterflooding ultimate oil recovery (IWUR), and infill drilling ultimate oil recovery (IDUR) models using reservoir parameters for San Andres and Clearfork carbonate formations in west Texas. In all cases, we compared the results from the neural network models with the results from regression statistical and non-parametric approach models. The results show that it is possible to obtain the highest cross-correlation coefficient between predicted and actual target variables, and the lowest average absolute errors using the integrated techniques of multivariate statistical analysis and neural networks in our intelligent software.
Correlations between chromatographic parameters and bioactivity predictors of potential herbicides.
Janicka, Małgorzata
2014-08-01
Different liquid chromatography techniques, including reversed-phase liquid chromatography on Purosphere RP-18e, IAM.PC.DD2 and Cosmosil Cholester columns and micellar liqud chromatography with a Purosphere RP-8e column and using buffered sodium dodecyl sulfate-acetonitrile as the mobile phase, were applied to study the lipophilic properties of 15 newly synthesized phenoxyacetic and carbamic acid derivatives, which are potential herbicides. Chromatographic lipophilicity descriptors were used to extrapolate log k parameters (log kw and log km) and log k values. Partitioning lipophilicity descriptors, i.e., log P coefficients in an n-octanol-water system, were computed from the molecular structures of the tested compounds. Bioactivity descriptors, including partition coefficients in a water-plant cuticle system and water-human serum albumin and coefficients for human skin partition and permeation were calculated in silico by ACD/ADME software using the linear solvation energy relationship of Abraham. Principal component analysis was applied to describe similarities between various chromatographic and partitioning lipophilicities. Highly significant, predictive linear relationships were found between chromatographic parameters and bioactivity descriptors. © The Author [2013]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
WE-D-204-06: An Open Source ImageJ CatPhan Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, G
2015-06-15
Purpose: The CatPhan is a popular QA device for assessing CT image quality. There are a number of software options which perform analysis of the CatPhan. However, there is often little ability for the user to adjust the analysis if it isn’t running properly, and these are all expensive options. An open source tool is an effective solution. Methods: To use the software, the user imports the CT as an image sequence in ImageJ. The user then scrolls to the slice with the lateral dots. The user then runs the plugin. If tolerance constraints are not already created, the usermore » is prompted to enter them or to use generic tolerances. Upon completion of the analysis, the plugin calls pdfLaTex to compile the pdf report. There is a csv version of the report as well. A log of the results from all CatPhan scans is kept as a csv file. The user can use this to baseline the machine. Results: The tool is capable of detecting the orientation of the phantom. If the CatPhan was scanned backwards, one can simply flip the stack of images horizontally and proceed with the analysis. The analysis includes Sensitometry (estimating the effective beam energy), HU values and linearity, Low Contrast Visibility (using LDPE & Polystyrene), Contrast Scale, Geometric Accuracy, Slice Thickness Accuracy, Spatial resolution (giving the MTF using the line pairs as well as the point spread function), CNR, Low Contrast Detectability (including the raw data), Uniformity (including the Cupping Effect). Conclusion: This is a robust tool that analyzes more components of the CatPhan than other software options (with the exception of ImageOwl). It produces an elegant pdf and keeps a log of analyses for long-term tracking of the system. Because it is open source, users are able to customize any component of it.« less
MAIL LOG, program theory, volume 1. [Scout project automatic data system
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
1982-10-01
BACK_ GVRATON; (geserate log me"ses from wather and wind planning log screenJ PERFORN AIPORT.PLANNIG LOGISSA ORItATION; (eSamrate log message from...ISSAGN H31.TAL(COUNT) .26 -kAUX imesase is constructed with wather imforimetiom ad stated) IF(VILO.TAM(J).DIl A(S)’ ’)1 (VXLOG.TABL(J).VU.(S)’ THEN
New radio meteor detecting and logging software
NASA Astrophysics Data System (ADS)
Kaufmann, Wolfgang
2017-08-01
A new piece of software ``Meteor Logger'' for the radio observation of meteors is described. It analyses an incoming audio stream in the frequency domain to detect a radio meteor signal on the basis of its signature, instead of applying an amplitude threshold. For that reason the distribution of the three frequencies with the highest spectral power are considered over the time (3f method). An auto notch algorithm is developed to prevent the radio meteor signal detection from being jammed by a present interference line. The results of an exemplary logging session are discussed.
Ryder, Robert T.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
The appendixes in chapters E.4.1 and E.4.2 include (1) Log ASCII Standard (LAS) files, which encode gamma-ray, neutron, density, and other logs in text files that can be used by most well-logging software programs; and (2) graphic well-log traces. In the appendix to chapter E.4.1, the well-log traces are accompanied by lithologic descriptions with formation tops.
Measurement and Analysis of Failures in Computer Systems
NASA Technical Reports Server (NTRS)
Thakur, Anshuman
1997-01-01
This thesis presents a study of software failures spanning several different releases of Tandem's NonStop-UX operating system running on Tandem Integrity S2(TMR) systems. NonStop-UX is based on UNIX System V and is fully compliant with industry standards, such as the X/Open Portability Guide, the IEEE POSIX standards, and the System V Interface Definition (SVID) extensions. In addition to providing a general UNIX interface to the hardware, the operating system has built-in recovery mechanisms and audit routines that check the consistency of the kernel data structures. The analysis is based on data on software failures and repairs collected from Tandem's product report (TPR) logs for a period exceeding three years. A TPR log is created when a customer or an internal developer observes a failure in a Tandem Integrity system. This study concentrates primarily on those TPRs that report a UNIX panic that subsequently crashes the system. Approximately 200 of the TPRs fall into this category. Approximately 50% of the failures reported are from field systems, and the rest are from the testing and development sites. It has been observed by Tandem developers that fewer cases are encountered from the field than from the test centers. Thus, the data selection mechanism has introduced a slight skew.
Yu, S; Gao, S; Gan, Y; Zhang, Y; Ruan, X; Wang, Y; Yang, L; Shi, J
2016-04-01
Quantitative structure-property relationship modelling can be a valuable alternative method to replace or reduce experimental testing. In particular, some endpoints such as octanol-water (KOW) and organic carbon-water (KOC) partition coefficients of polychlorinated biphenyls (PCBs) are easier to predict and various models have been already developed. In this paper, two different methods, which are multiple linear regression based on the descriptors generated using Dragon software and hologram quantitative structure-activity relationships, were employed to predict suspended particulate matter (SPM) derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of 209 PCBs. The predictive ability of the derived models was validated using a test set. The performances of all these models were compared with EPI Suite™ software. The results indicated that the proposed models were robust and satisfactory, and could provide feasible and promising tools for the rapid assessment of the SPM derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of PCBs.
powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks
NASA Astrophysics Data System (ADS)
Murray, Steven G.
2018-05-01
powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
NASA Astrophysics Data System (ADS)
Valentini, S.
2013-12-01
A search of variable stars was carried out, using a new software specifically created by the author, on a series of images acquired at the Astronomical Observatory of Santa Lucia di Stroncone (Terni, Italy) between October 2010 and March 2012. This research, named Fast Variable Stars Survey (FVSS), arose from the idea to verify if the log files pr oduced by the software Astrometrica (H. Raab), could be used as a basis for rapid detection of short-period variable stars. The r esults obtained showed that the idea is very valid, so that the new software has allowed the identification and the correct determination of the period of thirty-two new variable stars in the six stellar fields subjected to analysis.
SU-E-T-784: Using MLC Log Files for Daily IMRT Delivery Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stathakis, S; Defoor, D; Linden, P
2015-06-15
Purpose: To verify daily intensity modulated radiation therapy (IMRT) treatments using multi-leaf collimator (MLC) log files. Methods: The MLC log files from a NovalisTX Varian linear accelerator were used in this study. The MLC files were recorded daily for all patients undergoing IMRT or volumetric modulated arc therapy (VMAT). The first record of each patient was used as reference and all records for subsequent days were compared against the reference. An in house MATLAB software code was used for the comparisons. Each MLC log file was converted to a fluence map (FM) and a gamma index (γ) analysis was usedmore » for the evaluation of each daily delivery for every patient. The tolerance for the gamma index was set to 2% dose difference and 2mm distance to agreement while points with signal of 10% or lower of the maximum value were excluded from the comparisons. Results: The γ between each of the reference FMs and the consecutive daily fraction FMs had an average value of 99.1% (ranged from 98.2 to 100.0%). The FM images were reconstructed at various resolutions in order to study the effect of the resolution on the γ and at the same time reduce the time for processing the images. We found that the comparison of images with the highest resolution (768×1024) yielded on average a lower γ (99.1%) than the ones with low resolution (192×256) (γ 99.5%). Conclusion: We developed an in-house software that allows us to monitor the quality of daily IMRT and VMAT treatment deliveries using information from the MLC log files of the linear accelerator. The information can be analyzed and evaluated as early as after the completion of each daily treatment. Such tool can be valuable to assess the effect of MLC positioning on plan quality, especially in the context of adaptive radiotherapy.« less
Code of Federal Regulations, 2013 CFR
2013-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
Code of Federal Regulations, 2014 CFR
2014-04-01
... limited to software, files, data, and prize schedules. (2) Downloads must use secure methodologies that... date of the completion of the download; (iii) The Class II gaming system components to which software was downloaded; (iv) The version(s) of download package and any software downloaded. Logging of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bore II, co-developed by Berkeley Lab researchers Frank Hale, Chin-Fu Tsang, and Christine Doughty, provides vital information for solving water quality and supply problems and for improving remediation of contaminated sites. Termed "hydrophysical logging," this technology is based on the concept of measuring repeated depth profiles of fluid electric conductivity in a borehole that is pumping. As fluid enters the wellbore, its distinct electric conductivity causes peaks in the conductivity log that grow and migrate upward with time. Analysis of the evolution of the peaks enables characterization of groundwater flow distribution more quickly, more cost effectively, and with higher resolutionmore » than ever before. Combining the unique interpretation software Bore II with advanced downhole instrumentation (the hydrophysical logging tool), the method quantifies inflow and outflow locations, their associated flow rates, and the basic water quality parameters of the associated formation waters (e.g., pH, oxidation-reduction potential, temperature). In addition, when applied in conjunction with downhole fluid sampling, Bore II makes possible a complete assessment of contaminant concentration within groundwater.« less
NASA Astrophysics Data System (ADS)
Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios
2018-01-01
Quality Assurance (QA) for medical linear accelerator (linac) is one of the primary concerns in external beam radiation Therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming which often, adequate software accompanied with specific phantoms is required. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MALAB toolkit which aims to simplify the quantitative analysis of QA for linac which includes Star-Shot analysis, Picket Fence test, Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of light & radiation field coincidence test.
Chen, Ying; Cai, Xiaoyu; Jiang, Long; Li, Yu
2016-02-01
Based on the experimental data of octanol-air partition coefficients (KOA) for 19 polychlorinated biphenyl (PCB) congeners, two types of QSAR methods, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA), are used to establish 3D-QSAR models using the structural parameters as independent variables and using logKOA values as the dependent variable with the Sybyl software to predict the KOA values of the remaining 190 PCB congeners. The whole data set (19 compounds) was divided into a training set (15 compounds) for model generation and a test set (4 compounds) for model validation. As a result, the cross-validation correlation coefficient (q(2)) obtained by the CoMFA and CoMSIA models (shuffled 12 times) was in the range of 0.825-0.969 (>0.5), the correlation coefficient (r(2)) obtained was in the range of 0.957-1.000 (>0.9), and the SEP (standard error of prediction) of test set was within the range of 0.070-0.617, indicating that the models were robust and predictive. Randomly selected from a set of models, CoMFA analysis revealed that the corresponding percentages of the variance explained by steric and electrostatic fields were 23.9% and 76.1%, respectively, while CoMSIA analysis by steric, electrostatic and hydrophobic fields were 0.6%, 92.6%, and 6.8%, respectively. The electrostatic field was determined as a primary factor governing the logKOA. The correlation analysis of the relationship between the number of Cl atoms and the average logKOA values of PCBs indicated that logKOA values gradually increased as the number of Cl atoms increased. Simultaneously, related studies on PCB detection in the Arctic and Antarctic areas revealed that higher logKOA values indicate a stronger PCB migration ability. From CoMFA and CoMSIA contour maps, logKOA decreased when substituents possessed electropositive groups at the 2-, 3-, 3'-, 5- and 6- positions, which could reduce the PCB migration ability. These results are expected to be beneficial in predicting logKOA values of PCB homologues and derivatives and in providing a theoretical foundation for further elucidation of the global migration behaviour of PCBs. Copyright © 2015 Elsevier Inc. All rights reserved.
Automatic Tools for Enhancing the Collaborative Experience in Large Projects
NASA Astrophysics Data System (ADS)
Bourilkov, D.; Rodriquez, J. L.
2014-06-01
With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
Minimizing bias in biomass allometry: Model selection and log transformation of data
Joseph Mascaro; undefined undefined; Flint Hughes; Amanda Uowolo; Stefan A. Schnitzer
2011-01-01
Nonlinear regression is increasingly used to develop allometric equations for forest biomass estimation (i.e., as opposed to the raditional approach of log-transformation followed by linear regression). Most statistical software packages, however, assume additive errors by default, violating a key assumption of allometric theory and possibly producing spurious models....
Current and future trends in marine image annotation software
NASA Astrophysics Data System (ADS)
Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.
2016-12-01
Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level, may involve human interactions and computer-assisted solutions. Marine image annotation software (MIAS) have enabled over 500 publications to date. We review the functioning, application trends and developments, by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input are basically a graphical user interface, with a video player or image browser that recognizes a specific time code or image code, allowing to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software by the capability of integrating data associated to video collection, the most simple being the position coordinates of the video recording platform. MIAS have three main characteristics: annotating events in real time, posteriorly to annotation and interact with a database. These range from simple annotation interfaces, to full onboard data management systems, with a variety of toolboxes. Advanced packages allow to input and display data from multiple sensors or multiple annotators via intranet or internet. Posterior human-mediated annotation often include tools for data display and image analysis, e.g. length, area, image segmentation, point count; and in a few cases the possibility of browsing and editing previous dive logs or to analyze the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, browsing and querying of data. Progress in the field of automated annotation is mostly in post processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated processes of pixel recognition through computer-vision modules that compile expert-based knowledge. Important topics aiding the choice of a specific software are outlined, the ideal software is discussed and future trends are presented.
Boland, Mary Regina; Rusanov, Alexander; So, Yat; Lopez-Jimenez, Carlos; Busacca, Linda; Steinman, Richard C; Bakken, Suzanne; Bigger, J Thomas; Weng, Chunhua
2014-12-01
Underspecified user needs and frequent lack of a gold standard reference are typical barriers to technology evaluation. To address this problem, this paper presents a two-phase evaluation framework involving usability experts (phase 1) and end-users (phase 2). In phase 1, a cross-system functionality alignment between expert-derived user needs and system functions was performed to inform the choice of "the best available" comparison system to enable a cognitive walkthrough in phase 1 and a comparative effectiveness evaluation in phase 2. During phase 2, five quantitative and qualitative evaluation methods are mixed to assess usability: time-motion analysis, software log, questionnaires - System Usability Scale and the Unified Theory of Acceptance of Use of Technology, think-aloud protocols, and unstructured interviews. Each method contributes data for a unique measure (e.g., time motion analysis contributes task-completion-time; software log contributes action transition frequency). The measures are triangulated to yield complementary insights regarding user-perceived ease-of-use, functionality integration, anxiety during use, and workflow impact. To illustrate its use, we applied this framework in a formative evaluation of a software called Integrated Model for Patient Care and Clinical Trials (IMPACT). We conclude that this mixed-methods evaluation framework enables an integrated assessment of user needs satisfaction and user-perceived usefulness and usability of a novel design. This evaluation framework effectively bridges the gap between co-evolving user needs and technology designs during iterative prototyping and is particularly useful when it is difficult for users to articulate their needs for technology support due to the lack of a baseline. Copyright © 2013 Elsevier Inc. All rights reserved.
Spectroscopy Made Easy: A New Tool for Fitting Observations with Synthetic Spectra
NASA Technical Reports Server (NTRS)
Valenti, J. A.; Piskunov, N.
1996-01-01
We describe a new software package that may be used to determine stellar and atomic parameters by matching observed spectra with synthetic spectra generated from parameterized atmospheres. A nonlinear least squares algorithm is used to solve for any subset of allowed parameters, which include atomic data (log gf and van der Waals damping constants), model atmosphere specifications (T(sub eff, log g), elemental abundances, and radial, turbulent, and rotational velocities. LTE synthesis software handles discontiguous spectral intervals and complex atomic blends. As a demonstration, we fit 26 Fe I lines in the NSO Solar Atlas (Kurucz et al.), determining various solar and atomic parameters.
Estimation of octanol/water partition coefficients using LSER parameters
Luehrs, Dean C.; Hickey, James P.; Godbole, Kalpana A.; Rogers, Tony N.
1998-01-01
The logarithms of octanol/water partition coefficients, logKow, were regressed against the linear solvation energy relationship (LSER) parameters for a training set of 981 diverse organic chemicals. The standard deviation for logKow was 0.49. The regression equation was then used to estimate logKow for a test of 146 chemicals which included pesticides and other diverse polyfunctional compounds. Thus the octanol/water partition coefficient may be estimated by LSER parameters without elaborate software but only moderate accuracy should be expected.
Log sampling methods and software for stand and landscape analyses.
Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough
2008-01-01
We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...
How Intrusion Detection Can Improve Software Decoy Applications
2003-03-01
THIS PAGE INTENTIONALLY LEFT BLANK 41 V. DISCUSSION Military history suggests it is best to employ a layered, defense-in...database: alert, postgresql , user=snort dbname=snort # output database: log, unixodbc, user=snort dbname=snort # output database: log, mssql, dbname...Threat Monitoring and Surveillance, James P. Anderson Co., Fort Washington. PA, April 1980. URL http://csrc.nist.gov/publications/ history /ande80
Implementation of an ADME enabling selection and visualization tool for drug discovery.
Stoner, Chad L; Gifford, Eric; Stankovic, Charles; Lepsy, Christopher S; Brodfuehrer, Joanne; Prasad, J V N Vara; Surendran, Narayanan
2004-05-01
The pharmaceutical industry has large investments in compound library enrichment, high throughput biological screening, and biopharmaceutical (ADME) screening. As the number of compounds submitted for in vitro ADME screens increases, data analysis, interpretation, and reporting will become rate limiting in providing ADME-structure-activity relationship information to guide the synthetic strategy for chemical series. To meet these challenges, a software tool was developed and implemented that enables scientists to explore in vitro and in silico ADME and chemistry data in a multidimensional framework. The present work integrates physicochemical and ADME data, encompassing results for Caco-2 permeability, human liver microsomal half-life, rat liver microsomal half-life, kinetic solubility, measured log P, rule of 5 descriptors (molecular weight, hydrogen bond acceptors, hydrogen bond donors, calculated log P), polar surface area, chemical stability, and CYP450 3A4 inhibition. To facilitate interpretation of this data, a semicustomized software solution using Spotfire was designed that allows for multidimensional data analysis and visualization. The solution also enables simultaneous viewing and export of chemical structures with the corresponding ADME properties, enabling a more facile analysis of ADME-structure-activity relationship. In vitro and in silico ADME data were generated for 358 compounds from a series of human immunodeficiency virus protease inhibitors, resulting in a data set of 5370 experimental values which were subsequently analyzed and visualized using the customized Spotfire application. Implementation of this analysis and visualization tool has accelerated the selection of molecules for further development based on optimum ADME characteristics, and provided medicinal chemistry with specific, data driven structural recommendations for improvements in the ADME profile. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association J Pharm Sci 93: 1131-1141, 2004
MAIL LOG, program summary and specifications
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The summary and specifications to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
P1198: software for tracing decision behavior in lending to small businesses.
Andersson, P
2001-05-01
This paper describes a process-tracing software program specially designed to capture decision behavior in lending to small businesses. The source code was written in Lotus Notes. The software runs in a Web browser and consists of two interacting systems: a database and a user interface. The database includes three realistic loan applications. The user interface consists of different but interacting screens that enable the participant to operate the software. Log files register the decision behavior of the participant. An empirical example is presented in order to show the software's potential in providing insights into judgment and decision making. The implications of the software are discussed.
Using Web Metric Software to Drive: Mobile Website Development
ERIC Educational Resources Information Center
Tidal, Junior
2011-01-01
Many libraries have developed mobile versions of their websites. In order to understand their users, web developers have conducted both usability tests and focus groups, yet analytical software and web server logs can also be used to better understand users. Using data collected from these tools, the Ursula C. Schwerin Library has made informed…
DOE Office of Scientific and Technical Information (OSTI.GOV)
HUBER, J.H.
An Enraf Densitometer is installed on tank 241-AY-102. The Densitometer will frequently be tasked to obtain and log density profiles. The activity can be effected a number of ways. Enraf Incorporated provides a software package called ''Logger18'' to its customers for the purpose of in-shop testing of their gauges. Logger18 is capable of accepting an input file which can direct the gauge to obtain a density profile for a given tank level and bottom limit. Logger18 is a complex, DOS based program which will require trained technicians and/or tank farm entries to obtain the data. ALARA considerations have prompted themore » development of a more user-friendly, computer-based interface to the Enraf densitometers. This document records the plan by which this new Enraf data acquisition software will be developed, reviewed, verified, and released. This plan applies to the development and implementation of a one-time-use software program, which will be called ''Enraf Control Panel.'' The software will be primarily used for remote operation of Enraf Densitometers for the purpose of obtaining and logging tank product density profiles.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandoval, D. M.; Strittmatter, R. B.; Abeyta, J. D.
2004-01-01
The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The Media Tracker system software assists classified media custodian in managing vault access logging and Media Tracking to prevent the inadvertent violation of rules or policies for the access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high consequence security assets and high value items. The overall benefits include: (1) real-time access tomore » the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the Media Tracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The Media Tracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component of the MediaTracker software suite provides an efficient and effective means to meet current Laboratory requirements and provides new-engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate several methods of positive identification including smart cards and biometrics. Currently, we have three mechanisms that provide added security for accountability and tracking purposes. One mechanism consists of a portable, hand-held inventory scanner, which allows the custodian to physically track the items that are not accessible within a particular area. The second mechanism is a radio frequency identification (RFID) consisting of a monitoring portal, which tracks and logs in a database all activity tagged of items that pass through the portals. The third mechanism consists of an electronic tagging of a flash memory device for automated inventory of CREM in storage. By modifying this USB device the user is provided with added assurance, limiting the data from being obtained from any other computer.« less
Meta-analysis of Odds Ratios: Current Good Practices
Chang, Bei-Hung; Hoaglin, David C.
2016-01-01
Background Many systematic reviews of randomized clinical trials lead to meta-analyses of odds ratios. The customary methods of estimating an overall odds ratio involve weighted averages of the individual trials’ estimates of the logarithm of the odds ratio. That approach, however, has several shortcomings, arising from assumptions and approximations, that render the results unreliable. Although the problems have been documented in the literature for many years, the conventional methods persist in software and applications. A well-developed alternative approach avoids the approximations by working directly with the numbers of subjects and events in the arms of the individual trials. Objective We aim to raise awareness of methods that avoid the conventional approximations, can be applied with widely available software, and produce more-reliable results. Methods We summarize the fixed-effect and random-effects approaches to meta-analysis; describe conventional, approximate methods and alternative methods; apply the methods in a meta-analysis of 19 randomized trials of endoscopic sclerotherapy in patients with cirrhosis and esophagogastric varices; and compare the results. We demonstrate the use of SAS, Stata, and R software for the analysis. Results In the example, point estimates and confidence intervals for the overall log-odds-ratio differ between the conventional and alternative methods, in ways that can affect inferences. Programming is straightforward in the three software packages; an appendix gives the details. Conclusions The modest additional programming required should not be an obstacle to adoption of the alternative methods. Because their results are unreliable, use of the conventional methods for meta-analysis of odds ratios should be discontinued. PMID:28169977
Debugging and Logging Services for Defence Service Oriented Architectures
2012-02-01
Service A software component and callable end point that provides a logically related set of operations, each of which perform a logical step in a...important to note that in some cases when the fault is identified to lie in uneditable code such as program libraries, or outsourced software services ...debugging is limited to characterisation of the fault, reporting it to the software or service provider and development of work-arounds and management
Requirements-Driven Log Analysis Extended Abstract
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2012-01-01
Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear, that having an impact is diffcult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i; o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in case where some testing technology has been developed that might improve current practice, it is then likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full edged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system for analyzing its behavior. It is not meant as an alternative to testing since it does not address the important in- put generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
A preliminary analysis of quantifying computer security vulnerability data in "the wild"
NASA Astrophysics Data System (ADS)
Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George
2016-05-01
A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
Rowe, Steven P; Siddiqui, Adeel; Bonekamp, David
2014-07-01
To create novel radiology key image software that is easy to use for novice users, incorporates elements adapted from social networking Web sites, facilitates resident and fellow education, and can serve as the engine for departmental sharing of interesting cases and follow-up studies. Using open-source programming languages and software, radiology key image software (the key image and case log application, KICLA) was developed. This system uses a lightweight interface with the institutional picture archiving and communications systems and enables the storage of key images, image series, and cine clips. It was designed to operate with minimal disruption to the radiologists' daily workflow. Many features of the user interface have been inspired by social networking Web sites, including image organization into private or public folders, flexible sharing with other users, and integration of departmental teaching files into the system. We also review the performance, usage, and acceptance of this novel system. KICLA was implemented at our institution and achieved widespread popularity among radiologists. A large number of key images have been transmitted to the system since it became available. After this early experience period, the most commonly encountered radiologic modalities are represented. A survey distributed to users revealed that most of the respondents found the system easy to use (89%) and fast at allowing them to record interesting cases (100%). Hundred percent of respondents also stated that they would recommend a system such as KICLA to their colleagues. The system described herein represents a significant upgrade to the Digital Imaging and Communications in Medicine teaching file paradigm with efforts made to maximize its ease of use and inclusion of characteristics inspired by social networking Web sites that allow the system additional functionality such as individual case logging. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
SU-F-T-462: Lessons Learned From a Machine Incident Reporting System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutlief, S; Hoisak, J
Purpose: Linear accelerators must operate with minimal downtime. Machine incident logs are a crucial tool to meet this requirement. They providing a history of service and demonstrate whether a fix is working. This study investigates the information content of a large department linear accelerator incident log. Methods: Our department uses an electronic reporting system to provide immediate information to both key department staff and the field service department. This study examines reports for five linac logs during 2015. The report attributes for analysis include frequency, level of documentation, who solved the problem, and type of fix used. Results: Of themore » reports, 36% were documented as resolved. In another 25% the resolution allowed treatment to proceed although the reported problem recurred within days. In 5% only intermediate troubleshooting was documented. The remainder lacked documentation. In 60% of the reports, radiation therapists resolved the problem, often by clearing the appropriate faults or reinitializing a software or hardware service. 22% were resolved by physics and 10% by field service engineers. The remaining 8% were resolved by IT, Facilities, or resolved spontaneously. Typical fixes, in order of scope, included clearing the fault and moving on, closing and re-opening the patient session or software, cycling power to a sub-unit, recalibrating a device (e.g., optical surface imaging), and calling in Field Service (usually resolving the problem through maintenance or component replacement). Conclusion: The reports with undocumented resolution represent a missed opportunity for learning. Frequency of who resolves a problem scales with the proximity of the person’s role (therapist, physicist, or service engineer), which is inversely related to the permanence of the resolution. Review of lessons learned from machine incident logs can form the basis for guidance to radiation therapists and medical physicists to minimize equipment downtime and ensure safe operation.« less
Chris B. LeDoux; Gary W. Miller
2008-01-01
In this study we used data from 16 Appalachian hardwood stands, a growth and yield computer simulation model, and stump-to-mill logging cost-estimating software to evaluate the optimal economic timing of crop tree release (CTR) treatments. The simulated CTR treatments consisted of one-time logging operations at stand age 11, 23, 31, or 36 years, with the residual...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujii, T; Fujii, Y; Shimizu, S
Purpose: To acquire correct information for inside the body in patient positioning of Real-time-image Gated spot scanning Proton Therapy (RGPT), utilization of tomographic image at exhale phase of patient respiration obtained from 4-dimensional Cone beam CT (4D-CBCT) has been desired. We developed software named “Image Analysis Platform” for 4D-CBCT researches which has technique to segment projection-images based on 3D marker position in the body. The 3D marker position can be obtained by using two axes CBCT system at Hokkaido University Hospital Proton Therapy Center. Performance verification of the software was implemented. Methods: The software calculates 3D marker position retrospectively bymore » using matching positions on pair projection-images obtained by two axes fluoroscopy mode of CBCT system. Log data of 3D marker tracking are outputted after the tracking. By linking the Log data and gantry-angle file of projection-image, all projection-images are equally segmented to spatial five-phases according to marker 3D position of SI direction and saved to specified phase folder. Segmented projection-images are used for CBCT reconstruction of each phase. As performance verification of the software, test of segmented projection-images was implemented for sample CT phantom (Catphan) image acquired by two axes fluoroscopy mode of CBCT. Dummy marker was added on the images. Motion of the marker was modeled to move in 3D space. Motion type of marker is sin4 wave function has amplitude 10.0 mm/5.0 mm/0 mm, cycle 4 s/4 s/0 s for SI/AP/RL direction. Results: The marker was tracked within 0.58 mm accuracy in 3D for all images, and it was confirmed that all projection-images were segmented and saved to each phase folder correctly. Conclusion: We developed software for 4D-CBCT research which can segment projection-image based on 3D marker position. It will be helpful to create high quality of 4D-CBCT reconstruction image for RGPT.« less
Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.
2010-01-01
The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792
Global Combat Support System-Marine Corps Proof-of-Concept for Dashboard Analytics
2014-12-01
The core is modern, commercial-off-the-shelf enterprise resource planning ( ERP ) software (Oracle 11i e-Business Suite). GCSS-MCs design is focused...factor in the decision to implement this new software . GCSS-MC is the technology centerpiece of the Logistics Modernization (LogMod) Program...GCSS-MC is based on the implementation of Oracle e-Business Suite 11i as the core software package. This is the same infrastructure that Oracle
Rapid estimation of aquifer salinity structure from oil and gas geophysical logs
NASA Astrophysics Data System (ADS)
Shimabukuro, D.; Stephens, M.; Ducart, A.; Skinner, S. M.
2016-12-01
We describe a workflow for creating aquifer salinity maps using Archie's equation for areas that have geophysical data from oil and gas wells. We apply this method in California, where geophysical logs are available in raster format from the Division of Oil, Gas, and Geothermal Resource (DOGGR) online archive. This method should be applicable to any region where geophysical logs are readily available. Much of the work is controlled by computer code, allowing salinity estimates for new areas to be rapidly generated. For a region of interest, the DOGGR online database is scraped for wells that were logged with multi-tool suites, such as the Platform Express or Triple Combination Logging Tools. Then, well construction metadata, such as measured depth, spud date, and well orientation, is attached. The resultant local database allows a weighted criteria selection of wells that are most likely to have the shallow resistivity, deep resistivity, and density porosity measurements necessary to calculate salinity over the longest depth interval. The algorithm can be adjusted for geophysical log availability for older well fields and density of sampling. Once priority wells are identified, a student researcher team uses Neuralog software to digitize the raster geophysical logs. Total dissolved solid (TDS) concentration is then calculated in clean, wet sand intervals using the resistivity-porosity method, a modified form of Archie's equation. These sand intervals are automatically selected using a combination of spontaneous potential and the difference in shallow resistivity and deep resistivity measurements. Gamma ray logs are not used because arkosic sands common in California make it difficult to distinguish sand and shale. Computer calculation allows easy adjustment of Archie's parameters. The result is a semi-continuous TDS profile for the wells of interest. These profiles are combined and contoured using standard 3-d visualization software to yield preliminary salinity maps for the region of interest. We present results for select well fields in the Southern San Joaquin Valley, California.
ERIC Educational Resources Information Center
Pendzick, Richard E.; Downs, Robert L.
2002-01-01
Describes software for electronic visitor management (EVM) called EasyLobbyTM, currently in use in thousands of federal and corporate installations throughout the world and its application for school and campus environments. Explains EasyLobbyTM's use to replace visitor logs, capture and store visitor data electronically, and provide badges that…
Users guide for FRCS: fuel reduction cost simulator software.
Roger D. Fight; Bruce R. Hartsough; Peter Noordijk
2006-01-01
The Fuel Reduction Cost Simulator (FRCS) spreadsheet application is public domain software used to estimate costs for fuel reduction treatments involving removal of trees of mixed sizes in the form of whole trees, logs, or chips from a forest. Equipment production rates were developed from existing studies. Equipment operating cost rates are from December 2002 prices...
Adaptable Computing Environment/Self-Assembling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.
Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software,more » able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Som key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouose cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (sort of a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.« less
Analysis of survival in breast cancer patients by using different parametric models
NASA Astrophysics Data System (ADS)
Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti
2017-09-01
In biomedical applications or clinical trials, right censoring was often arising when studying the time to event data. In this case, some individuals are still alive at the end of the study or lost to follow up at a certain time. It is an important issue to handle the censoring data in order to prevent any bias information in the analysis. Therefore, this study was carried out to analyze the right censoring data with three different parametric models; exponential model, Weibull model and log-logistic models. Data of breast cancer patients from Hospital Sultan Ismail, Johor Bahru from 30 December 2008 until 15 February 2017 was used in this study to illustrate the right censoring data. Besides, the covariates included in this study are the time of breast cancer infection patients survive t, age of each patients X1 and treatment given to the patients X2 . In order to determine the best parametric models in analysing survival of breast cancer patients, the performance of each model was compare based on Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC) and log-likelihood value using statistical software R. When analysing the breast cancer data, all three distributions were shown consistency of data with the line graph of cumulative hazard function resembles a straight line going through the origin. As the result, log-logistic model was the best fitted parametric model compared with exponential and Weibull model since it has the smallest value in AIC and BIC, also the biggest value in log-likelihood.
Software framework for the upcoming MMT Observatory primary mirror re-aluminization
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Clark, Dusty; Porter, Dallan
2014-07-01
Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUI's) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUI's to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
NASA Technical Reports Server (NTRS)
Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.
1990-01-01
A variety of instructions to be used in the development of implementations of software for the Guidance and Control Software (GCS) project is described. This document fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification' requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards that are applicable to the software development and testing process. Information on the following subjects is contained: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.
Enhancing DSN Operations Efficiency with the Discrepancy Reporting Management System (DRMS)
NASA Technical Reports Server (NTRS)
Chatillon, Mark; Lin, James; Cooper, Tonja M.
2003-01-01
The DRMS is the Discrepancy Reporting Management System used by the Deep Space Network (DSN). It uses a web interface and is a management tool designed to track and manage: data outage incidents during spacecraft tracks against equipment and software known as DRs (discrepancy Reports), to record "out of pass" incident logs against equipment and software in a Station Log, to record instances where equipment has be restarted or reset as Reset records, and to electronically record equipment readiness status across the DSN. Tracking and managing these items increases DSN operational efficiency by providing: the ability to establish the operational history of equipment items, data on the quality of service provided to the DSN customers, the ability to measure service performance, early insight into processes, procedures and interfaces that may need updating or changing, and the capability to trace a data outage to a software or hardware change. The items listed above help the DSN to focus resources on areas of most need.
Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies
Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton
2016-01-01
Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store application software for instant messaging, namely Facebook and Skype on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files. PMID:26982207
Windows Instant Messaging App Forensics: Facebook and Skype as Case Studies.
Yang, Teing Yee; Dehghantanha, Ali; Choo, Kim-Kwang Raymond; Muda, Zaiton
2016-01-01
Instant messaging (IM) has changed the way people communicate with each other. However, the interactive and instant nature of these applications (apps) made them an attractive choice for malicious cyber activities such as phishing. The forensic examination of IM apps for modern Windows 8.1 (or later) has been largely unexplored, as the platform is relatively new. In this paper, we seek to determine the data remnants from the use of two popular Windows Store application software for instant messaging, namely Facebook and Skype on a Windows 8.1 client machine. This research contributes to an in-depth understanding of the types of terrestrial artefacts that are likely to remain after the use of instant messaging services and application software on a contemporary Windows operating system. Potential artefacts detected during the research include data relating to the installation or uninstallation of the instant messaging application software, log-in and log-off information, contact lists, conversations, and transferred files.
Ryder, Robert T.; Swezey, Christopher S.; Crangle, Robert D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 2985, of the same title, by Ryder and others (2008). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section E–E'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces from each drill hole.
WE-G-213CD-03: A Dual Complementary Verification Method for Dynamic Tumor Tracking on Vero SBRT.
Poels, K; Depuydt, T; Verellen, D; De Ridder, M
2012-06-01
to use complementary cine EPID and gimbals log file analysis for in-vivo tracking accuracy monitoring. A clinical prototype of dynamic tracking (DT) was installed on the Vero SBRT system. This prototype version allowed tumor tracking by gimballed linac rotations using an internal-external correspondence model. The DT prototype software allowed the detailed logging of all applied gimbals rotations during tracking. The integration of an EPID on the vero system allowed the acquisition of cine EPID images during DT. We quantified the tracking error on cine EPID (E-EPID) by subtracting the target center (fiducial marker detection) and the field centroid. Dynamic gimbals log file information was combined with orthogonal x-ray verification images to calculate the in-vivo tracking error (E-kVLog). The correlation between E-kVLog and E-EPID was calculated for validation of the gimbals log file. Further, we investigated the sensitivity of the log file tracking error by introducing predefined systematic tracking errors. As an application we calculate gimbals log file tracking error for dynamic hidden target tests to investigate gravity effects and decoupled gimbals rotation from gantry rotation. Finally, calculating complementary cine EPID and log file tracking errors evaluated the clinical accuracy of dynamic tracking. A strong correlation was found between log file and cine EPID tracking error distribution during concurrent measurements (R=0.98). We found sensitivity in the gimbals log files to detect a systematic tracking error up to 0.5 mm. Dynamic hidden target tests showed no gravity influence on tracking performance and high degree of decoupled gimbals and gantry rotation during dynamic arc dynamic tracking. A submillimetric agreement between clinical complementary tracking error measurements was found. Redundancy of the internal gimbals log file with x-ray verification images with complementary independent cine EPID images was implemented to monitor the accuracy of gimballed tumor tracking on Vero SBRT. Research was financially supported by the Flemish government (FWO), Hercules Foundation and BrainLAB AG. © 2012 American Association of Physicists in Medicine.
NASA Technical Reports Server (NTRS)
Fletcher, Daryl P.; Alena, Richard L.; Akkawi, Faisal; Duncavage, Daniel P.
2004-01-01
This paper presents some of the challenges associated with bringing software projects from the research world into an operationa1 environment. While the core functional components of research-oriented software applications can have great utility in an operational setting, these applications often lack aspects important in an operational environment such as logging and security. Furthermore, these stand-alone applications, sometimes developed in isolation from one another, can produce data products useful to other applications in a software ecosystem.
MODIS. Volume 1: MODIS level 1A software baseline requirements
NASA Technical Reports Server (NTRS)
Masuoka, Edward; Fleig, Albert; Ardanuy, Philip; Goff, Thomas; Carpenter, Lloyd; Solomon, Carl; Storey, James
1994-01-01
This document describes the level 1A software requirements for the moderate resolution imaging spectroradiometer (MODIS) instrument. This includes internal and external requirements. Internal requirements include functional, operational, and data processing as well as performance, quality, safety, and security engineering requirements. External requirements include those imposed by data archive and distribution systems (DADS); scheduling, control, monitoring, and accounting (SCMA); product management (PM) system; MODIS log; and product generation system (PGS). Implementation constraints and requirements for adapting the software to the physical environment are also included.
A software for managing chemical processes in a multi-user laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Camino, Fernando E.
Here, we report a software for logging chemical processes in a multi-user laboratory, which implements a work flow designed to reduce hazardous situations associated with the disposal of chemicals in incompatible waste containers. The software allows users to perform only those processes displayed in their list of authorized chemical processes and provides the location and label code of waste containers, among other useful information. The software has been used for six years in the cleanroom of the Center for Functional Nanomaterials at Brookhaven National Laboratory and has been an important factor for the excellent safety record of the Center.
A software for managing chemical processes in a multi-user laboratory
Camino, Fernando E.
2016-10-26
Here, we report a software for logging chemical processes in a multi-user laboratory, which implements a work flow designed to reduce hazardous situations associated with the disposal of chemicals in incompatible waste containers. The software allows users to perform only those processes displayed in their list of authorized chemical processes and provides the location and label code of waste containers, among other useful information. The software has been used for six years in the cleanroom of the Center for Functional Nanomaterials at Brookhaven National Laboratory and has been an important factor for the excellent safety record of the Center.
NASA Technical Reports Server (NTRS)
Kavelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. Trace - Contract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala s elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
Vaas, Lea A I; Sikorski, Johannes; Michael, Victoria; Göker, Markus; Klenk, Hans-Peter
2012-01-01
The Phenotype MicroArray (OmniLog® PM) system is able to simultaneously capture a large number of phenotypes by recording an organism's respiration over time on distinct substrates. This technique targets the object of natural selection itself, the phenotype, whereas previously addressed '-omics' techniques merely study components that finally contribute to it. The recording of respiration over time, however, adds a longitudinal dimension to the data. To optimally exploit this information, it must be extracted from the shapes of the recorded curves and displayed in analogy to conventional growth curves. The free software environment R was explored for both visualizing and fitting of PM respiration curves. Approaches using either a model fit (and commonly applied growth models) or a smoothing spline were evaluated. Their reliability in inferring curve parameters and confidence intervals was compared to the native OmniLog® PM analysis software. We consider the post-processing of the estimated parameters, the optimal classification of curve shapes and the detection of significant differences between them, as well as practically relevant questions such as detecting the impact of cultivation times and the minimum required number of experimental repeats. We provide a comprehensive framework for data visualization and parameter estimation according to user choices. A flexible graphical representation strategy for displaying the results is proposed, including 95% confidence intervals for the estimated parameters. The spline approach is less prone to irregular curve shapes than fitting any of the considered models or using the native PM software for calculating both point estimates and confidence intervals. These can serve as a starting point for the automated post-processing of PM data, providing much more information than the strict dichotomization into positive and negative reactions. Our results form the basis for a freely available R package for the analysis of PM data.
Vaas, Lea A. I.; Sikorski, Johannes; Michael, Victoria; Göker, Markus; Klenk, Hans-Peter
2012-01-01
Background The Phenotype MicroArray (OmniLog® PM) system is able to simultaneously capture a large number of phenotypes by recording an organism's respiration over time on distinct substrates. This technique targets the object of natural selection itself, the phenotype, whereas previously addressed ‘-omics’ techniques merely study components that finally contribute to it. The recording of respiration over time, however, adds a longitudinal dimension to the data. To optimally exploit this information, it must be extracted from the shapes of the recorded curves and displayed in analogy to conventional growth curves. Methodology The free software environment R was explored for both visualizing and fitting of PM respiration curves. Approaches using either a model fit (and commonly applied growth models) or a smoothing spline were evaluated. Their reliability in inferring curve parameters and confidence intervals was compared to the native OmniLog® PM analysis software. We consider the post-processing of the estimated parameters, the optimal classification of curve shapes and the detection of significant differences between them, as well as practically relevant questions such as detecting the impact of cultivation times and the minimum required number of experimental repeats. Conclusions We provide a comprehensive framework for data visualization and parameter estimation according to user choices. A flexible graphical representation strategy for displaying the results is proposed, including 95% confidence intervals for the estimated parameters. The spline approach is less prone to irregular curve shapes than fitting any of the considered models or using the native PM software for calculating both point estimates and confidence intervals. These can serve as a starting point for the automated post-processing of PM data, providing much more information than the strict dichotomization into positive and negative reactions. Our results form the basis for a freely available R package for the analysis of PM data. PMID:22536335
PAD_AUDIT -- PAD Auditing Package
NASA Astrophysics Data System (ADS)
Clayton, C. A.
The PAD (Packet Assembler Disassembler) utility is the part of the VAX/VMS Coloured Book Software (CBS) which allows a user to log onto remote computers from a local VAX. Unfortunately, logging into a computer via either the Packet SwitchStream (PSS) or the International Packet SwitchStream (IPSS) costs real money. Some users either do not appreciate this or do not care and have been known to clock up rather large quarterly bills. This software package allows a system manager to determine who has used PAD to call where and (most importantly) how much it has cost. The system manager can then take appropriate action - either charging the individuals, warning them to use the facility with more care or even denying access to a greedy user to one or more sites.
Shim, Heejung; Chasman, Daniel I.; Smith, Joshua D.; Mora, Samia; Ridker, Paul M.; Nickerson, Deborah A.; Krauss, Ronald M.; Stephens, Matthew
2015-01-01
We conducted a genome-wide association analysis of 7 subfractions of low density lipoproteins (LDLs) and 3 subfractions of intermediate density lipoproteins (IDLs) measured by gradient gel electrophoresis, and their response to statin treatment, in 1868 individuals of European ancestry from the Pharmacogenomics and Risk of Cardiovascular Disease study. Our analyses identified four previously-implicated loci (SORT1, APOE, LPA, and CETP) as containing variants that are very strongly associated with lipoprotein subfractions (log10Bayes Factor > 15). Subsequent conditional analyses suggest that three of these (APOE, LPA and CETP) likely harbor multiple independently associated SNPs. Further, while different variants typically showed different characteristic patterns of association with combinations of subfractions, the two SNPs in CETP show strikingly similar patterns - both in our original data and in a replication cohort - consistent with a common underlying molecular mechanism. Notably, the CETP variants are very strongly associated with LDL subfractions, despite showing no association with total LDLs in our study, illustrating the potential value of the more detailed phenotypic measurements. In contrast with these strong subfraction associations, genetic association analysis of subfraction response to statins showed much weaker signals (none exceeding log10Bayes Factor of 6). However, two SNPs (in APOE and LPA) previously-reported to be associated with LDL statin response do show some modest evidence for association in our data, and the subfraction response proles at the LPA SNP are consistent with the LPA association, with response likely being due primarily to resistance of Lp(a) particles to statin therapy. An additional important feature of our analysis is that, unlike most previous analyses of multiple related phenotypes, we analyzed the subfractions jointly, rather than one at a time. Comparisons of our multivariate analyses with standard univariate analyses demonstrate that multivariate analyses can substantially increase power to detect associations. Software implementing our multivariate analysis methods is available at http://stephenslab.uchicago.edu/software.html. PMID:25898129
Testing a Low-Interaction Honeypot against Live Cyber Attackers
2011-09-01
to run Snort were the Sourcefire Vulnerability Research Team ( VRT ) rules, which are the official rules available for the program. We used the...latest VRT rules that were available free to registered users, rules an average of 30 days old when released. The software provides a detailed alert log...although it is production system. On the Windows machine we installed Snort 2.9 with the VRT rules. Snort was configured to log comma-separated
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
Sptrace is a general-purpose space utilization tracing system that is conceptually similar to the commercial Purify product used to detect leaks and other memory usage errors. It is designed to monitor space utilization in any sort of heap, i.e., a region of data storage on some device (nominally memory; possibly shared and possibly persistent) with a flat address space. This software can trace usage of shared and/or non-volatile storage in addition to private RAM (random access memory). Sptrace is implemented as a set of C function calls that are invoked from within the software that is being examined. The function calls fall into two broad classes: (1) functions that are embedded within the heap management software [e.g., JPL's SDR (Simple Data Recorder) and PSM (Personal Space Management) systems] to enable heap usage analysis by populating a virtual time-sequenced log of usage activity, and (2) reporting functions that are embedded within the application program whose behavior is suspect. For ease of use, these functions may be wrapped privately inside public functions offered by the heap management software. Sptrace can be used for VxWorks or RTEMS realtime systems as easily as for Linux or OS/X systems.
Maintenance of Automated Library Systems.
ERIC Educational Resources Information Center
Epstein, Susan Baerg
1983-01-01
Discussion of the maintenance of both the software and hardware in an automated library system highlights maintenance by the vendor, contracts and costs, the maintenance log, downtime, and planning for trouble. (EJS)
Dorazio, Robert M; Hunter, Margaret E
2015-11-03
Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
Ryder, Robert T.; Crangle, Robert D.; Trippi, Michael H.; Swezey, Christopher S.; Lentz, Erika E.; Rowan, Elisabeth L.; Hope, Rebecca S.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
This chapter is a re-release of U.S. Geological Survey Scientific Investigations Map 3067, of the same title, by Ryder and others (2009). For this chapter, two appendixes have been added that do not appear with the original version. Appendix A provides Log ASCII Standard (LAS) files for each drill hole along cross-section D-D'; they are text files which encode gamma-ray, neutron, density, and other logs that can be used by most well-logging software. Appendix B provides graphic well-log traces and lithologic descriptions with formation tops from each drill hole.
Mail LOG: Program operating instructions
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base (2) REVISE - changing or modifying existing records in the data base (3) SEARCH - finding special records existing in the data base (4) ARCHIVE - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: Incoming and outgoing mail correspondence; Design Information Releases and Releases and Reports; and Drawings and Engineering orders.
NASA Astrophysics Data System (ADS)
Gülşen, Esra; Kurtulus, Bedri; Necati Yaylim, Tolga; Avsar, Ozgur
2017-04-01
In groundwater studies, quantification and detection of fluid flows in borehole is an important part of assessment aquifer characteristic at different depths. Monitoring wells disturbs the natural flow field and this disturbance creates different flow paths to an aquifer. Vertical flow fluid analyses are one of the important techniques to deal with the detection and quantification of these vertical flows in borehole/monitoring wells. Liwa region is located about 146 km to the south west of Abu Dhabi city and about 36 km southwest of Madinat Zayed. SWSR (Strategic Water Storage & Recovery Project) comprises three Schemes (A, B and C) and each scheme contains an infiltration basin in the center, 105 recovery wells, 10 clusters and each cluster comprises 3 monitoring wells with different depths; shallow ( 50 m), intermediate ( 75 m) and deep ( 100 m). The scope of this study is to calculate the transmissivity values at different depth and evaluate the Fluid Flow Log (FFL) data for Scheme A (105 recovery wells) in order to understand the aquifer characteristic at different depths. The transmissivity values at different depth levels are calculated using Razack and Huntley (1991) equation for vertical flow rates of 30 m3 /h, 60 m3 /h, 90 m3 /h, 120 m3 /h and then Empirical Bayesian Kriging is used for interpolation in Scheme A using ArcGIS 10.2 software. FFL are drawn by GeODin software. Derivative analysis of fluid flow data are done by Microsoft Office: Excel software. All statistical analyses are calculated by IBMSPSS software. The interpolation results show that the transmissivity values are higher at the top of the aquifer. In other word, the aquifer is found more productive at the upper part of the Liwa aquifer. We are very grateful for financial support and providing us the data to ZETAS Dubai Inc.
NASA Technical Reports Server (NTRS)
Moore, Andrew J.; Schubert, Matthew; Nicholas Rymer
2017-01-01
The report details test and measurement flights to demonstrate autonomous UAV inspection of high voltage electrical transmission structures. A UAV built with commercial, off-the-shelf hardware and software, supplemented with custom sensor logging software, measured ultraviolet emissions from a test generator placed on a low-altitude substation and a medium-altitude switching tower. Since corona discharge precedes catastrophic electrical faults on high-voltage structures, detection and geolocation of ultraviolet emissions is needed to develop a UAV-based self-diagnosing power grid. Signal readings from an onboard ultraviolet sensor were validated during flight with a commercial corona camera. Geolocation was accomplished with onboard GPS; the UAV position was logged to a local ground station and transmitted in real time to a NASA server for tracking in the national airspace.
High-performance computing on GPUs for resistivity logging of oil and gas wells
NASA Astrophysics Data System (ADS)
Glinskikh, V.; Dudaev, A.; Nechaev, O.; Surodina, I.
2017-10-01
We developed and implemented into software an algorithm for high-performance simulation of electrical logs from oil and gas wells using high-performance heterogeneous computing. The numerical solution of the 2D forward problem is based on the finite-element method and the Cholesky decomposition for solving a system of linear algebraic equations (SLAE). Software implementations of the algorithm used the NVIDIA CUDA technology and computing libraries are made, allowing us to perform decomposition of SLAE and find its solution on central processor unit (CPU) and graphics processor unit (GPU). The calculation time is analyzed depending on the matrix size and number of its non-zero elements. We estimated the computing speed on CPU and GPU, including high-performance heterogeneous CPU-GPU computing. Using the developed algorithm, we simulated resistivity data in realistic models.
The Wettzell System Monitoring Concept and First Realizations
NASA Technical Reports Server (NTRS)
Ettl, Martin; Neidhardt, Alexander; Muehlbauer, Matthias; Ploetz, Christian; Beaudoin, Christopher
2010-01-01
Automated monitoring of operational system parameters for the geodetic space techniques is becoming more important in order to improve the geodetic data and to ensure the safety and stability of automatic and remote-controlled observations. Therefore, the Wettzell group has developed the system monitoring software, SysMon, which is based on a reliable, remotely-controllable hardware/software realization. A multi-layered data logging system based on a fanless, robust industrial PC with an internal database system is used to collect data from several external, serial, bus, or PCI-based sensors. The internal communication is realized with Remote Procedure Calls (RPC) and uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. Each data monitoring stream can be configured individually via configuration files to define the logging rates or analog-digital-conversion parameters. First realizations are currently installed at the new laser ranging system at Wettzell to address safety issues and at the VLBI station O Higgins as a meteorological data logger. The system monitoring concept should be realized for the Wettzell radio telescope in the near future.
NASA Astrophysics Data System (ADS)
Irawan, R.; Yong, B.; Kristiani, F.
2017-02-01
Bandung, one of the cities in Indonesia, is vulnerable to dengue disease for both early-stage (Dengue Fever) and severe-stage (Dengue Haemorrhagic Fever and Dengue Shock Syndrome). In 2013, there were 5,749 patients in Bandung and 2,032 of the patients were hospitalized in Santo Borromeus Hospital. In this paper, there are two models, Poisson-gamma and Log-normal models, that use Bayesian inference to estimate the value of the relative risk. The calculation is done by Markov Chain Monte Carlo method which is the simulation using Gibbs Sampling algorithm in WinBUGS 1.4.3 software. The analysis results for dengue disease of 30 sub-districts in Bandung in 2013 based on Santo Borromeus Hospital’s data are Coblong and Bandung Wetan sub-districts had the highest relative risk using both models for the early-stage, severe-stage, and all stages. Meanwhile, Cinambo sub-district had the lowest relative risk using both models for the severe-stage and all stages and BojongloaKaler sub-district had the lowest relative risk using both models for the early-stage. For the model comparison using DIC (Deviance Information Criterion) method, the Log-normal model is a better model for the early-stage and severe-stage, but for the all stages, the Poisson-gamma model is a better model which fits the data.
Borehole geophysical logs at Naval Weapons Industrial Reserve Plant, Dallas, Texas
Braun, Christopher L.; Anaya, Roberto; Kuniansky, Eve L.
2000-01-01
A shallow alluvial aquifer at the Naval Weapons Industrial Reserve Plant near Dallas, Texas, has been contaminated by organic solvents used in the fabrication and assembly of aircraft and aircraft parts. Natural gamma-ray and electromagnetic-induction borehole geophysical logs were obtained from 162 poly vinyl-chloride-cased wells at the plant and were integrated with existing lithologic data to improve site characterization of the subsurface alluvium. Software was developed for filtering and classifying the log data and for processing, analyzing, and creating graphical output of the digital data. The alluvium consists of mostly fine-grained low-permeability sediments; however for this study, the alluvium was classified into low, intermediate, and high clay-content sediments on the basis of the gamma-ray logs. The low clay-content sediments were interpreted as being relatively permeable, whereas the high clay-content sediments were interpreted as being relatively impermeable. Simple statistics were used to identify zones of potentially contaminated sediments on the basis of the gamma-ray log classifications and the electromagnetic-induction log conductivity data.
Modelling the structure of sludge aggregates
Smoczyński, Lech; Ratnaweera, Harsha; Kosobucka, Marta; Smoczyński, Michał; Kalinowski, Sławomir; Kvaal, Knut
2016-01-01
ABSTRACT The structure of sludge is closely associated with the process of wastewater treatment. Synthetic dyestuff wastewater and sewage were coagulated using the PAX and PIX methods, and electro-coagulated on aluminium electrodes. The processes of wastewater treatment were supported with an organic polymer. The images of surface structures of the investigated sludge were obtained using scanning electron microscopy (SEM). The software image analysis permitted obtaining plots log A vs. log P, wherein A is the surface area and P is the perimeter of the object, for individual objects comprised in the structure of the sludge. The resulting database confirmed the ‘self-similarity’ of the structural objects in the studied groups of sludge, which enabled calculating their fractal dimension and proposing models for these objects. A quantitative description of the sludge aggregates permitted proposing a mechanism of the processes responsible for their formation. In the paper, also, the impact of the structure of the investigated sludge on the process of sedimentation, and dehydration of the thickened sludge after sedimentation, was discussed. PMID:26549812
Dietsch, Benjamin J.; Wilson, Richard C.; Strauch, Kellan R.
2008-01-01
Repeated flooding of Omaha Creek has caused damage in the Village of Homer. Long-term degradation and bridge scouring have changed substantially the channel characteristics of Omaha Creek. Flood-plain managers, planners, homeowners, and others rely on maps to identify areas at risk of being inundated. To identify areas at risk for inundation by a flood having a 1-percent annual probability, maps were created using topographic data and water-surface elevations resulting from hydrologic and hydraulic analyses. The hydrologic analysis for the Omaha Creek study area was performed using historical peak flows obtained from the U.S. Geological Survey streamflow gage (station number 06601000). Flood frequency and magnitude were estimated using the PEAKFQ Log-Pearson Type III analysis software. The U.S. Army Corps of Engineers' Hydrologic Engineering Center River Analysis System, version 3.1.3, software was used to simulate the water-surface elevation for flood events. The calibrated model was used to compute streamflow-gage stages and inundation elevations for the discharges corresponding to floods of selected probabilities. Results of the hydrologic and hydraulic analyses indicated that flood inundation elevations are substantially lower than from a previous study.
Ganesan, Balasubramanian; Martini, Silvana; Solorio, Jonathan; Walsh, Marie K
2015-01-01
This study investigated the effects of high intensity ultrasound (temperature, amplitude, and time) on the inactivation of indigenous bacteria in pasteurized milk, Bacillus atrophaeus spores inoculated into sterile milk, and Saccharomyces cerevisiae inoculated into sterile orange juice using response surface methodology. The variables investigated were sonication temperature (range from 0 to 84°C), amplitude (range from 0 to 216 μm), and time (range from 0.17 to 5 min) on the response, log microbe reduction. Data were analyzed by statistical analysis system software and three models were developed, each for bacteria, spore, and yeast reduction. Regression analysis identified sonication temperature and amplitude to be significant variables on microbe reduction. Optimization of the inactivation of microbes was found to be at 84.8°C, 216 μm amplitude, and 5.8 min. In addition, the predicted log reductions of microbes at common processing conditions (72°C for 20 sec) using 216 μm amplitude were computed. The experimental responses for bacteria, spore, and yeast reductions fell within the predicted levels, confirming the accuracy of the models.
Impact detection and analysis/health monitoring system for composites
NASA Astrophysics Data System (ADS)
Child, James E.; Kumar, Amrita; Beard, Shawn; Qing, Peter; Paslay, Don G.
2006-05-01
This manuscript includes information from test evaluations and development of a smart event detection system for use in monitoring composite rocket motor cases for damaging impacts. The primary purpose of the system as a sentry for case impact event logging is accomplished through; implementation of a passive network of miniaturized piezoelectric sensors, logger with pre-determined force threshold levels, and analysis software. Empirical approaches to structural characterizations and network calibrations along with implementation techniques were successfully evaluated, testing was performed on both unloaded (less propellants) as well as loaded rocket motors with the cylindrical areas being of primary focus. The logged test impact data with known physical network parameters provided for impact location as well as force determination, typically within 3 inches of actual impact location using a 4 foot network grid and force accuracy within 25%of an actual impact force. The simplistic empirical characterization approach along with the robust / flexible sensor grids and battery operated portable logger show promise of a system that can increase confidence in composite integrity for both new assets progressing through manufacturing processes as well as existing assets that may be in storage or transportation.
Martini, Silvana; Solorio, Jonathan; Walsh, Marie K.
2015-01-01
This study investigated the effects of high intensity ultrasound (temperature, amplitude, and time) on the inactivation of indigenous bacteria in pasteurized milk, Bacillus atrophaeus spores inoculated into sterile milk, and Saccharomyces cerevisiae inoculated into sterile orange juice using response surface methodology. The variables investigated were sonication temperature (range from 0 to 84°C), amplitude (range from 0 to 216 μm), and time (range from 0.17 to 5 min) on the response, log microbe reduction. Data were analyzed by statistical analysis system software and three models were developed, each for bacteria, spore, and yeast reduction. Regression analysis identified sonication temperature and amplitude to be significant variables on microbe reduction. Optimization of the inactivation of microbes was found to be at 84.8°C, 216 μm amplitude, and 5.8 min. In addition, the predicted log reductions of microbes at common processing conditions (72°C for 20 sec) using 216 μm amplitude were computed. The experimental responses for bacteria, spore, and yeast reductions fell within the predicted levels, confirming the accuracy of the models. PMID:26904659
Ahmadi, Ali; Mobasheri, Mahmoud; Hashemi-Nazari, Seyed Saeed; Baradaran, Azar; Choobini, Zahra Molavi
2014-09-01
Type 2 diabetes mellitus (DM) and hypertension are worldwide epidemic. Association between DM and colon cancer was obtained in previous studies. Prevalence of DM and hypertension in the patients with colorectal cancer (CRC) has not been reported in Iran. The present study was aimed to investigate the prevalence of hypertension and type 2 DM and their effect on median survival time in patients with CRC. Overall, 2570 individual-year follow-ups were conducted for 1127 patients with CRC. For the diagnosis of type 2 DM, fasting blood sugar test and glycosylated hemoglobin test were used and for hypertension, blood pressure was measured in two turns. The descriptive indices were calculated, and the mean and median survival from CRC diagnosis time was calculated using survival analysis and a comparison among survival times was done through log-rank test. Stata software 12 (Stata Corp. 2011. Stata Statistical Software: Release 12. College Station, TX: Stata Corp LP) was used for data analysis. The prevalence of hypertension and type 2 DM in the patients with CRC was respectively 13.38% (95% confidence interval [CI]: 11.1-15.8) and 8.69% (95% CI: 7-10.7). Median survival time in patients with hypertension and DM were 8.52 and 4.9 years. According to log-rank test, no significant difference was observed between the survival time of CRC patients suffering from hypertension and diabetes type 2. The obtained findings in this study indicate that survival time in patients with type 2 DM less than hypertension but two metabolic diseases have the same effect on survival rate of the patients with CRC. Understanding the risk factors for CRC may guide the development of strategies targeted toward its prevention.
Test-bench system for a borehole azimuthal acoustic reflection imaging logging tool
NASA Astrophysics Data System (ADS)
Liu, Xianping; Ju, Xiaodong; Qiao, Wenxiao; Lu, Junqiang; Men, Baiyong; Liu, Dong
2016-06-01
The borehole azimuthal acoustic reflection imaging logging tool (BAAR) is a new generation of imaging logging tool, which is able to investigate stratums in a relatively larger range of space around the borehole. The BAAR is designed based on the idea of modularization with a very complex structure, so it has become urgent for us to develop a dedicated test-bench system to debug each module of the BAAR. With the help of a test-bench system introduced in this paper, test and calibration of BAAR can be easily achieved. The test-bench system is designed based on the client/server model. The hardware system mainly consists of a host computer, an embedded controlling board, a bus interface board, a data acquisition board and a telemetry communication board. The host computer serves as the human machine interface and processes the uploaded data. The software running on the host computer is designed based on VC++. The embedded controlling board uses Advanced Reduced Instruction Set Machines 7 (ARM7) as the micro controller and communicates with the host computer via Ethernet. The software for the embedded controlling board is developed based on the operating system uClinux. The bus interface board, data acquisition board and telemetry communication board are designed based on a field programmable gate array (FPGA) and provide test interfaces for the logging tool. To examine the feasibility of the test-bench system, it was set up to perform a test on BAAR. By analyzing the test results, an unqualified channel of the electronic receiving cabin was discovered. It is suggested that the test-bench system can be used to quickly determine the working condition of sub modules of BAAR and it is of great significance in improving production efficiency and accelerating industrial production of the logging tool.
Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse
NASA Technical Reports Server (NTRS)
Gannod, Gerald C.
2002-01-01
This document describes 3rd year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but that a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file based reverse engineering, service-based software reuse, and a source to XML generator.
1993-08-01
pricing and sales, order processing , and purchasing. The class of manufacturing planning functions include aggregate production planning, materials...level. I Depending on the application, each control level will have a number of functions associated with it. For instance, order processing , purchasing...include accounting, sales forecasting, product costing, pricing and sales, order processing , and purchasing. The class of manufacturing planning functions
Quantifying the clay content with borehole depth and impact on reservoir flow
NASA Astrophysics Data System (ADS)
Sarath Kumar, Aaraellu D.; Chattopadhyay, Pallavi B.
2017-04-01
This study focuses on the application of reservoir well log data and 3D transient numerical model for proper optimization of flow dynamics and hydrocarbon potential. Fluid flow through porous media depends on clay content that controls porosity, permeability and pore pressure. The pressure dependence of permeability is more pronounced in tight formations. Therefore, preliminary clay concentration analysis and geo-mechanical characterizations have been done by using wells logs. The assumption of a constant permeability for a reservoir is inappropriate and therefore the study deals with impact of permeability variation for pressure-sensitive formation. The study started with obtaining field data from available well logs. Then, the mathematical models are developed to understand the efficient extraction of oil in terms of reservoir architecture, porosity and permeability. The fluid flow simulations have been done using COMSOL Multiphysics Software by choosing time dependent subsurface flow module that is governed by Darcy's law. This study suggests that the reservoir should not be treated as a single homogeneous structure with unique porosity and permeability. The reservoir parameters change with varying clay content and it should be considered for effective planning and extraction of oil. There is an optimum drawdown for maximum production with varying permeability in a reservoir.
Tank Monitoring and Document control System (TMACS) As Built Software Design Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
GLASSCOCK, J.A.
This document describes the software design for the Tank Monitor and Control System (TMACS). This document captures the existing as-built design of TMACS as of November 1999. It will be used as a reference document to the system maintainers who will be maintaining and modifying the TMACS functions as necessary. The heart of the TMACS system is the ''point-processing'' functionality where a sample value is received from the field sensors and the value is analyzed, logged, or alarmed as required. This Software Design Document focuses on the point-processing functions.
An expert system for prediction of chemical toxicity
Hickey, James P.; Aldridge, Andrew J.; Passino-Reader, Dora R.; Frank, Anthony M.
1992-01-01
The National Fisheries Research Center- Great Lakes has developed an interactive computer program that uses the structure of an organic molecule to predict its acute toxicity to four aquatic species. The expert system software, written in the muLISP language, identifies the skeletal structures and substituent groups of an organic molecule from a user-supplied standard chemical notation known as a SMILES string, and then generates values for four solvatochromic parameters. Multiple regression equations relate these parameters to the toxicities (expressed as log10LC50s and log10EC50s, along with 95% confidence intervals) for four species. The system is demonstrated by prediction of toxicity for anilide-type pesticides to the fathead minnow (Pimephales promelas). This software is designed for use on an IBM-compatible personal computer by personnel with minimal toxicology background for rapid estimation of chemical toxicity. The system has numerous applications, with much potential for use in the pharmaceutical industry
Pallicer, Juan M; Pascual, Rosalia; Port, Adriana; Rosés, Martí; Ràfols, Clara; Bosch, Elisabeth
2013-02-14
The influence of the hydrogen bond acidity when the 1-octanol/water partition coefficient (log P(o/w)) of drugs is determined from chromatographic measurements was studied in this work. This influence was firstly evaluated by means of the comparison between the Abraham solvation parameter model when it is applied to express the 1-octanol/water partitioning and the chromatographic retention, expressed as the solute polarity p. Then, several hydrogen bond acidity descriptors were compared in order to determine properly the log P(o/w) of drugs. These descriptors were obtained from different software and comprise two-dimensional parameters such as the calculated Abraham hydrogen bond acidity A and three-dimensional descriptors like HDCA-2 from CODESSA program or WO1 and DRDODO descriptors calculated from Volsurf+software. The additional HOMO-LUMO polarizability descriptor should be added when the three-dimensional descriptors are used to complement the chromatographic retention. The models generated using these descriptors were compared studying the correlations between the determined log P(o/w) values and the reference ones. The comparison showed that there was no significant difference between the tested models and any of them was able to determine the log P(o/w) of drugs from a single chromatographic measurement and the correspondent molecular descriptors terms. However, the model that involved the calculated A descriptor was simpler and it is thus recommended for practical uses. Copyright © 2012 Elsevier B.V. All rights reserved.
Active Wireline Heave Compensation for Ocean Drilling
NASA Astrophysics Data System (ADS)
Goldberg, D.; Liu, T.; Swain, K.; Furman, C.; Iturrino, G. J.
2014-12-01
The up-and-down heave motion of a ship causes a similar motion on any instruments tethered on wireline cable below it. If the amplitude of this motion is greater than a few tens of cm, significant discrepancy in the depth below the ship is introduced, causing uncertainty in the acquired data. Large and irregular cabled motions also increase the risk of damaging tethered instruments, particularly those with relatively delicate sensors. In 2005, Schlumberger and Deep Down, Inc built an active wireline heave compensator (AHC) system for use onboard the JOIDES Resolution to compensate for heave motion on wireline logging tools deployed in scientific drill holes. The goals for the new AHC system were to (1) design a reliable heave compensation system; and (2) devise a robust and quantitative methodology for routine assessment of compensation efficiency (CE) during wireline operations. Software programs were developed to monitor CE and the dynamics of logging tools in real-time, including system performance under variable parameters such as water depth, sea state, cable length, logging speed and direction. We present the CE results from the AHC system on the JOIDES Resolution during a 5-year period of recent IODP operations and compare the results to those from previous compensation systems deployed during ODP and IODP. Based on new data under heave conditions of ±0.2-2.0 m and water depths of 300-4,800 m in open holes, the system reduces 65-80% of downhole tool displacement under stationary conditions and 50-60% during normal logging operations. Moreover, down/up tool motion at low speeds (300-600 m/h) reduces the system's CE values by 15-20%, and logging down at higher speeds (1,000-1,200 m/h) reduces CE values by 55-65%. Furthermore, the system yields slightly lower CE values of 40-50% without tension feedback of the downhole cable while logging. These results indicate that the new system's compensation efficiency is comparable to or better than previous systems, with additional advantages that include upgradable compensation control software and the capability for continued assessment under varying environmental conditions. Future integration of downhole cable dynamics as an input feedback could further improve CE during logging operations.
Predicting through-focus visual acuity with the eye's natural aberrations.
Kingston, Amanda C; Cox, Ian G
2013-10-01
To develop a predictive optical modeling process that utilizes individual computer eye models along with a novel through-focus image quality metric. Individual eye models were implemented in optical design software (Zemax, Bellevue, WA) based on evaluation of ocular aberrations, pupil diameter, visual acuity, and accommodative response of 90 subjects (180 eyes; 24-63 years of age). Monocular high-contrast minimum angle of resolution (logMAR) acuity was assessed at 6 m, 2 m, 1 m, 67 cm, 50 cm, 40 cm, 33 cm, 28 cm, and 25 cm. While the subject fixated on the lowest readable line of acuity, total ocular aberrations and pupil diameter were measured three times each using the Complete Ophthalmic Analysis System (COAS HD VR) at each distance. A subset of 64 mature presbyopic eyes was used to predict the clinical logMAR acuity performance of five novel multifocal contact lens designs. To validate predictability of the design process, designs were manufactured and tested clinically on a population of 24 mature presbyopes (having at least +1.50 D spectacle add at 40 cm). Seven object distances were used in the validation study (6 m, 2 m, 1 m, 67 cm, 50 cm, 40 cm, and 25 cm) to measure monocular high-contrast logMAR acuity. Baseline clinical through-focus logMAR was shown to correlate highly (R² = 0.85) with predicted logMAR from individual eye models. At all object distances, each of the five multifocal lenses showed less than one line difference, on average, between predicted and clinical normalized logMAR acuity. Correlation showed R² between 0.90 and 0.97 for all multifocal designs. Computer-based models that account for patient's aberrations, pupil diameter changes, and accommodative amplitude can be used to predict the performance of contact lens designs. With this high correlation (R² ≥ 0.90) and high level of predictability, more design options can be explored in the computer to optimize performance before a lens is manufactured and tested clinically.
Liang, Chao; Qiao, Jun-Qin; Lian, Hong-Zhen
2017-12-15
Reversed-phase liquid chromatography (RPLC) based octanol-water partition coefficient (logP) or distribution coefficient (logD) determination methods were revisited and assessed comprehensively. Classic isocratic and some gradient RPLC methods were conducted and evaluated for neutral, weak acid and basic compounds. Different lipophilicity indexes in logP or logD determination were discussed in detail, including the retention factor logk w corresponding to neat water as mobile phase extrapolated via linear solvent strength (LSS) model from isocratic runs and calculated with software from gradient runs, the chromatographic hydrophobicity index (CHI), apparent gradient capacity factor (k g ') and gradient retention time (t g ). Among the lipophilicity indexes discussed, logk w from whether isocratic or gradient elution methods best correlated with logP or logD. Therefore logk w is recommended as the preferred lipophilicity index for logP or logD determination. logk w easily calculated from methanol gradient runs might be the main candidate to replace logk w calculated from classic isocratic run as the ideal lipophilicity index. These revisited RPLC methods were not applicable for strongly ionized compounds that are hardly ion-suppressed. A previously reported imperfect ion-pair RPLC method was attempted and further explored for studying distribution coefficients (logD) of sulfonic acids that totally ionized in the mobile phase. Notably, experimental logD values of sulfonic acids were given for the first time. The IP-RPLC method provided a distinct way to explore logD values of ionized compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
Pre-selection and assessment of green organic solvents by clustering chemometric tools.
Tobiszewski, Marek; Nedyalkova, Miroslava; Madurga, Sergio; Pena-Pereira, Francisco; Namieśnik, Jacek; Simeonov, Vasil
2018-01-01
The study presents the result of the application of chemometric tools for selection of physicochemical parameters of solvents for predicting missing variables - bioconcentration factors, water-octanol and octanol-air partitioning constants. EPI Suite software was successfully applied to predict missing values for solvents commonly considered as "green". Values for logBCF, logK OW and logK OA were modelled for 43 rather nonpolar solvents and 69 polar ones. Application of multivariate statistics was also proved to be useful in the assessment of the obtained modelling results. The presented approach can be one of the first steps and support tools in the assessment of chemicals in terms of their greenness. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.
Precise analysis of both (S)TEM images and video are time and labor intensive processes. As an example, determining when crystal growth and shrinkage occurs during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the datamore » into a series of images which it compresses into an MPEG-2 video using the open source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video including the frame quality, intra-texture and predicted texture bits, forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on statistic(s) for each data type.« less
Kook, D; Bühren, J; Klaproth, O K; Bauch, A S; Derhartunian, V; Kohnen, T
2011-02-01
The purpose of this study was to evaluate a novel technique for the correction of postoperative astigmatism after penetrating keratoplasty with the use of the femtosecond laser creating astigmatic keratotomies (femto-AK) in the scope of a retrospective case series. Clinical data of ten eyes of nine patients with high residual astigmatism after penetrating keratoplasty undergoing paired femto-AK using a 60-kHz femtosecond laser (IntraLase™, AMO) were analyzed. A new software algorithm was used to create paired arcuate cuts deep into the donor corneal button with different cut angles. Target values were refraction, uncorrected visual acuity, best corrected visual acuity, topographic data (Orbscan®, Bausch & Lomb, Rochester, NY, USA), and corneal wavefront analysis using Visual Optics Lab (VOL)-Pro 7.14 Software (Sarver and Associates). Vector analysis was performed using the Holladay, Cravy and Koch formula. Statistical analysis was performed to detect significances between visits using Student's t test. All procedures were performed without any major complications. The mean follow-up was 13 months. The mean patient age was 48.7 years. The preoperative mean uncorrected visual acuity (logMAR) was 1.27, best corrected visual acuity 0.55, mean subjective cylinder -7.4 D, and mean topometric astigmatism 9.3 D. The postoperative mean uncorrected visual acuity (logMAR) was 1.12, best corrected visual acuity 0.47, mean subjective cylinder -4.1 D, and mean topometric astigmatism 6.5 D. Differences between corneal higher order aberrations showed a high standard deviation and were therefore not statistically significant. Astigmatic keratotomy using the femtosecond laser seems to be a safe and effective tool for the correction of higher corneal astigmatisms. Due to the biomechanical properties of the cornea and missing empirical data for the novel femto-AK technology, higher numbers of patients are necessary to develop optimal treatment nomograms.
Discrepancy Reporting Management System
NASA Technical Reports Server (NTRS)
Cooper, Tonja M.; Lin, James C.; Chatillon, Mark L.
2004-01-01
Discrepancy Reporting Management System (DRMS) is a computer program designed for use in the stations of NASA's Deep Space Network (DSN) to help establish the operational history of equipment items; acquire data on the quality of service provided to DSN customers; enable measurement of service performance; provide early insight into the need to improve processes, procedures, and interfaces; and enable the tracing of a data outage to a change in software or hardware. DRMS is a Web-based software system designed to include a distributed database and replication feature to achieve location-specific autonomy while maintaining a consistent high quality of data. DRMS incorporates commercial Web and database software. DRMS collects, processes, replicates, communicates, and manages information on spacecraft data discrepancies, equipment resets, and physical equipment status, and maintains an internal station log. All discrepancy reports (DRs), Master discrepancy reports (MDRs), and Reset data are replicated to a master server at NASA's Jet Propulsion Laboratory; Master DR data are replicated to all the DSN sites; and Station Logs are internal to each of the DSN sites and are not replicated. Data are validated according to several logical mathematical criteria. Queries can be performed on any combination of data.
Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model
NASA Astrophysics Data System (ADS)
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-05-01
Disease mapping contains a set of statistical techniques that detail maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measure the relative risk of the disease is called Standardized Morbidity Ratio (SMR). It is the ratio of an observed and expected number of accounts in an area, which has the greatest uncertainty if the disease is rare or if geographical area is small. Therefore, Bayesian models or statistical smoothing based on Log-normal model are introduced which might solve SMR problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and log-normal model, which were fitted to data using WinBUGS software. This study starts with a brief review of these models, starting with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates compared to the classical method. The log-normal model has can overcome the SMR problem when there is no observed bladder cancer in an area.
NASA Astrophysics Data System (ADS)
Fletcher, Stephen; Kirkpatrick, Iain; Dring, Roderick; Puttock, Robert; Thring, Rob; Howroyd, Simon
2017-03-01
Supercapacitors are an emerging technology with applications in pulse power, motive power, and energy storage. However, their carbon electrodes show a variety of non-ideal behaviours that have so far eluded explanation. These include Voltage Decay after charging, Voltage Rebound after discharging, and Dispersed Kinetics at long times. In the present work, we establish that a vertical ladder network of RC components can reproduce all these puzzling phenomena. Both software and hardware realizations of the network are described. In general, porous carbon electrodes contain random distributions of resistance R and capacitance C, with a wider spread of log R values than log C values. To understand what this implies, a simplified model is developed in which log R is treated as a Gaussian random variable while log C is treated as a constant. From this model, a new family of equivalent circuits is developed in which the continuous distribution of log R values is replaced by a discrete set of log R values drawn from a geometric series. We call these Pascal Equivalent Circuits. Their behaviour is shown to resemble closely that of real supercapacitors. The results confirm that distributions of RC time constants dominate the behaviour of real supercapacitors.
NASA Astrophysics Data System (ADS)
Toropov, Andrey A.; Toropova, Alla P.
2018-06-01
Predictive model of logP for Pt(II) and Pt(IV) complexes built up with the Monte Carlo method using the CORAL software has been validated with six different splits into the training and validation sets. The improving of the predictive potential of models for six different splits has been obtained using so-called index of ideality of correlation. The suggested models give possibility to extract molecular features, which cause the increase or vice versa decrease of the logP.
Development of software for geodynamic processes monitoring system
NASA Astrophysics Data System (ADS)
Kabanov, M. M.; Kapustin, S. N.; Gordeev, V. F.; Botygin, I. A.; Tartakovsky, V. A.
2017-11-01
This article justifies the usage of natural pulsed electromagnetic Earth's noises logging method for mapping anomalies of strain-stress state of Earth's crust. The methods and technologies for gathering, processing and systematization of data gathered by ground multi-channel geophysical loggers for monitoring geomagnetic situation have been experimentally tested, and software had been developed. The data was consolidated in a network storage and can be accessed without using any specialized client software. The article proposes ways to distinguish global and regional small-scale time-space variations of Earth's natural electromagnetic field. For research purposes, the software provides a way to export data for any given period of time for any loggers and displays measurement data charts for selected set of stations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, R; Kamima, T; Tachibana, H
2016-06-15
Purpose: To investigate the effect of the trajectory files from linear accelerator for Clarkson-based independent dose verification in IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarksonbased algorithm was modified to calculate dose using the trajectory log files. Eclipse with the three techniques of step and shoot (SS), sliding window (SW) and Rapid Arc (RA) was used as treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dosemore » deviation between DICOM-RT plan (PL) and trajectory log file (TJ). An additional analysis was performed to evaluate MLC error detection capability of SMU when the trajectory log files was modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to actual MLC position. Results: The dose deviations for prostate and HN in the two sites were 0.0% and 0.0% in SS, 0.1±0.0%, 0.1±0.1% in SW and 0.6±0.5%, 0.7±0.9% in RA, respectively. The MLC error detection capability shows the plans for HN IMRT were the most sensitive and 0.2 mm of systematic error affected 0.7% dose deviation on average. Effect of the MLC random error did not affect dose error. Conclusion: The use of trajectory log files including actual information of MLC location, gantry angle, etc should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using DICOM-RT plan file. From the view of the resolution of MLC positional error detection, the secondary check could detect the MLC position error corresponding to the treatment sites and techniques. This research is partially supported by Japan Agency for Medical Research and Development (AMED)« less
Treatment delivery software for a new clinical grade ultrasound system for thermoradiotherapy.
Novák, Petr; Moros, Eduardo G; Straube, William L; Myerson, Robert J
2005-11-01
A detailed description of a clinical grade Scanning Ultrasound Reflector Linear Array System (SURLAS) applicator was given in a previous paper [Med. Phys. 32, 230-240 (2005)]. In this paper we concentrate on the design, development, and testing of the personal computer (PC) based treatment delivery software that runs the therapy system. The SURLAS requires the coordinated interaction between the therapy applicator and several peripheral devices for its proper and safe operation. One of the most important tasks was the coordination of the input power sequences for the elements of two parallel opposed ultrasound arrays (eight 1.5 cm x 2 cm elements/array, array 1 and 2 operate at 1.9 and 4.9 MHz, respectively) in coordination with the position of a dual-face scanning acoustic reflector. To achieve this, the treatment delivery software can divide the applicator's treatment window in up to 64 sectors (minimum size of 2 cm x 2 cm), and control the power to each sector independently by adjusting the power output levels from the channels of a 16-channel radio-frequency generator. The software coordinates the generator outputs with the position of the reflector as it scans back and forth between the arrays. Individual sector control and dual frequency operation allows the SURLAS to adjust power deposition in three dimensions to superficial targets coupled to its treatment window. The treatment delivery software also monitors and logs several parameters such as temperatures acquired using a 16-channel thermocouple thermometry unit. Safety (in particular to patients) was the paramount concern and design criterion. Failure mode and effects analysis (FMEA) was applied to the applicator as well as to the entire therapy system in order to identify safety issues and rank their relative importance. This analysis led to the implementation of several safety mechanisms and a software structure where each device communicates with the controlling PC independently of the others. In case of a malfunction in any part of the system or a violation of a user-defined safety criterion based on temperature readings, the software terminates treatment immediately and the user is notified. The software development process consisting of problem analysis, design, implementation, and testing is presented in this paper. Once the software was finished and integrated with the hardware, the therapy system was extensively tested. Results demonstrated that the software operates the SURLAS as intended with minimum risk to future patients.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I
2015-11-03
We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on the standardized Z-statistic based on the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with computational software tool for comparative quantitative proteomics analysis. It features various extensions of QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments
NASA Technical Reports Server (NTRS)
Budney, T. J.; Stone, R. W.
1982-01-01
Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.
Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator
NASA Technical Reports Server (NTRS)
Bolen, Kenny; Greenlaw, Ronald
2010-01-01
A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
Performance Data, Analytics & Services Job Logs & Statistics Training & Tutorials Software Outages NERSC Training Spectrum Scale User Group Meeting Live Status Now Computing Queue Look MOTD » Deep Learning at 15 PFlops Enables Training for Extreme Weather Identification at Scale March 29, 2018
Precision measurements from very-large scale aerial digital imagery.
Booth, D Terrance; Cox, Samuel E; Berryman, Robert D
2006-01-01
Managers need measurements and resource managers need the length/width of a variety of items including that of animals, logs, streams, plant canopies, man-made objects, riparian habitat, vegetation patches and other things important in resource monitoring and land inspection. These types of measurements can now be easily and accurately obtained from very large scale aerial (VLSA) imagery having spatial resolutions as fine as 1 millimeter per pixel by using the three new software programs described here. VLSA images have small fields of view and are used for intermittent sampling across extensive landscapes. Pixel-coverage among images is influenced by small changes in airplane altitude above ground level (AGL) and orientation relative to the ground, as well as by changes in topography. These factors affect the object-to-camera distance used for image-resolution calculations. 'ImageMeasurement' offers a user-friendly interface for accounting for pixel-coverage variation among images by utilizing a database. 'LaserLOG' records and displays airplane altitude AGL measured from a high frequency laser rangefinder, and displays the vertical velocity. 'Merge' sorts through large amounts of data generated by LaserLOG and matches precise airplane altitudes with camera trigger times for input to the ImageMeasurement database. We discuss application of these tools, including error estimates. We found measurements from aerial images (collection resolution: 5-26 mm/pixel as projected on the ground) using ImageMeasurement, LaserLOG, and Merge, were accurate to centimeters with an error less than 10%. We recommend these software packages as a means for expanding the utility of aerial image data.
Tracking Clouds with low cost GNSS chips aided by the Arduino platform
NASA Astrophysics Data System (ADS)
Hameed, Saji; Realini, Eugenio; Ishida, Shinya
2016-04-01
The Global Navigation Satellite System (GNSS) is a constellation of satellites that is used to provide geo-positioning services. Besides this application, the GNSS system is important for a wide range of scientific and civilian applications. For example, GNSS systems are routinely used in civilian applications such as surveying and scientific applications such as the study of crustal deformation. Another important scientific application of GNSS system is in meteorological research. Here it is mainly used to determine the total water vapour content of the troposphere, hereafter Precipitable Water Vapor (PWV). However, both GNSS receivers and software have prohibitively high price due to a variety of reasons. To overcome this somewhat artificial barrier we are exploring the use of low-cost GNSS receivers along with open source GNSS software for scientific research, in particular for GNSS meteorology research. To achieve this aim, we have developed a custom Arduino compatible data logging board that is able to operate together with a specific low-cost single frequency GNSS receiver chip from NVS Technologies AG. We have also developed an open-source software bundle that includes a new Arduino core for the Atmel324p chip, which is the main processor used in our custom logger. We have also developed software code that enables data collection, logging and parsing of the GNSS data stream. Additionally we have comprehensively evaluated the low power characteristics of the GNSS receiver and logger boards. Currently we are exploring the use of several openly source or free to use for research software to map GNSS delays to PWV. These include the open source goGPS (http://www.gogps-project.org/) and gLAB (http://gage.upc.edu/gLAB) and the openly available GAMIT software from Massachusetts Institute of Technology (MIT). We note that all the firmware and software developed as part of this project is available on an open source license.
Activity Catalog Tool (ACT) user manual, version 2.0
NASA Technical Reports Server (NTRS)
Segal, Leon D.; Andre, Anthony D.
1994-01-01
This report comprises the user manual for version 2.0 of the Activity Catalog Tool (ACT) software program, developed by Leon D. Segal and Anthony D. Andre in cooperation with NASA Ames Aerospace Human Factors Research Division, FLR branch. ACT is a software tool for recording and analyzing sequences of activity over time that runs on the Macintosh platform. It was designed as an aid for professionals who are interested in observing and understanding human behavior in field settings, or from video or audio recordings of the same. Specifically, the program is aimed at two primary areas of interest: human-machine interactions and interactions between humans. The program provides a means by which an observer can record an observed sequence of events, logging such parameters as frequency and duration of particular events. The program goes further by providing the user with a quantified description of the observed sequence, through application of a basic set of statistical routines, and enables merging and appending of several files and more extensive analysis of the resultant data.
DIY soundcard based temperature logging system. Part I: design
NASA Astrophysics Data System (ADS)
Nunn, John
2016-11-01
This paper aims to enable schools to make their own low-cost temperature logging instrument and to learn a something about its calibration in the process. This paper describes how a thermistor can be integrated into a simple potential divider circuit which is powered with the sound output of a computer and monitored by the microphone input. The voltage across a fixed resistor is recorded and scaled to convert it into a temperature reading in the range 0-100 °C. The calibration process is described with reference to fixed points and the effects of non-linearity are highlighted. An optimised calibration procedure is described which enables sub degree resolution and a software program was written which makes it possible to log, display and save temperature changes over a user determined period of time.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
Educational interactive multimedia software: The impact of interactivity on learning
NASA Astrophysics Data System (ADS)
Reamon, Derek Trent
This dissertation discusses the design, development, deployment and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates about the fundamentals of direct-current (DC) motor physics and selection. The two versions of Motor Workshop software cover the same basic materials on motors, but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text, and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish---they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer is actively participating in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group,' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs and videotapes were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity effects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective in promoting learning about the subject of motors. The focus of learning varied between users of the two versions, however. The low-level version was more effective for teaching concepts and terminology, while the high-level version seemed to be more effective for teaching engineering applications.
The CCD/Transit Instrument (CTI) data-analysis system
NASA Technical Reports Server (NTRS)
Cawson, M. G. M.; Mcgraw, J. T.; Keane, M. J.
1995-01-01
The automated software system for archiving, analyzing, and interrogating data from the CCD/Transit Instrument (CTI) is described. The CTI collects up to 450 Mbytes of image-data each clear night in the form of a narrow strip of sky observed in two colors. The large data-volumes and the scientific aims of the project make it imperative that the data are analyzed within the 24-hour period following the observations. To this end a fully automatic and self evaluating software system has been developed. The data are collected from the telescope in real-time and then transported to Tucson for analysis. Verification is performed by visual inspection of random subsets of the data and obvious cosmic rays are detected and removed before permanent archival is made to the optical disc. The analysis phase is performed by a pair of linked algorithms, one operating on the absolute pixel-values and the other on the spatial derivative of the data. In this way both isolated and merged images are reliably detected in a single pass. In order to isolate the latter algorithm from the effects of noise spikes a 3x3 Hanning filter is applied to the raw data before the analysis is run. The algorithms reduce the input pixel-data to a database of measured parameters for each image which has been found. A contrast filter is applied in order to assign a detection-probability to each image and then x-y calibration and intensity calibration are performed using known reference stars in the strip. These are added to as necessary by secondary standards boot-strapped from the CTI data itself. The final stages involve merging the new data into the CTI Master-list and History-list and the automatic comparison of each new detection with a set of pre-defined templates in parameter-space to find interesting objects such as supernovae, quasars and variable stars. Each stage of the processing from verification to interesting image selection is performed under a data-logging system which both controls the pipe-lining of data through the system and records key performance monitor parameters which are built into the software. Furthermore, the data from each stage are stored in databases to facilitate evaluation, and all stages offer the facility to enter keyword-indexed free-format text into the data-logging system. In this way a large measure of certification is built into the system to provide the necessary confidence in the end results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.H. Frantz Jr; K.G. Brown; W.K. Sawyer
2006-03-01
This report summarizes the work performed under contract DE-FC26-03NT41743. The primary objective of this study was to develop tools that would allow Underground Gas Storage (UGS) operators to use wellhead electronic flow measurement (EFM) data to quickly and efficiently identify trends in well damage over time, thus aiding in the identification of potential causes of the damage. Secondary objectives of this work included: (1) To assist UGS operators in the evaluation of hardware and software requirements for implementing an EFM system similar to the one described in this report, and (2) To provide a cost-benefit analysis framework UGS operators canmore » use to evaluate economic benefits of installing wellhead EFM systems in their particular fields. Assessment of EFM data available for use, and selection of the specific study field are reviewed. The various EFM data processing tasks, including data collection, organization, extraction, processing, and interpretation are discussed. The process of damage assessment via pressure transient analysis of EFM data is outlined and demonstrated, including such tasks as quality control, semi-log analysis, and log-log analysis of pressure transient test data extracted from routinely collected EFM data. Output from pressure transient test analyses for 21 wells is presented, and the interpretation of these analyses to determine the timing of damage development is demonstrated using output from specific study wells. Development of processing and interpretation modules to handle EFM data interpretation in horizontal wells is also a presented and discussed. A spreadsheet application developed to aid underground gas storage operators in the selection of EFM equipment is presented, discussed, and used to determine the cost benefit of installing EFM equipment in a gas storage field. Recommendations for future work related to EFM in gas storage fields are presented and discussed.« less
Effect of cobalt doping on structural and dielectric properties of nanocrystalline LaCrO3
NASA Astrophysics Data System (ADS)
Zarrin, Naima; Husain, Shahid
2018-05-01
Pure and Co doped Lanthanum chromite (LaCrO3) nanoparticles, LaCr1-xCoxO3 (0≤x≤0.3), have been synthesized through sol-gel process and their structural, morphological and dielectric properties have been studied. X ray diffraction patterns reveal that the samples are in single phase having orthorhombic structure with Pnma space group. Structural parameters are refined by Rietveld refinement using Fullprof software. Lattice parameters and unit cell volume are found to decrease with increase in Co doping. Crystallite size is calculated using Scherrer equation and is also found to decrease with increase in Co concentration. Surface morphology is examined using SEM-EDX analysis, which confirms the formation of regular and homogeneous samples without any impurities. The value of dielectric constant (ɛ') decreases with the increase in frequency while it enhances with the increase in Co concentration. The log (ɛ'×f) versus log (f) graphs have been plotted to verify the universal dielectric response (UDR) model. All the samples follow UDR model in the low frequency range.
Ultrasonic cleaning of conveyor belt materials using Listeria monocytogenes as a model organism.
Tolvanén, Riina; Lunden, Janne; Korkeala, Hannu; Wirtanen, Gun
2007-03-01
Persistent Listeria monocytogenes contamination of food industry equipment is a difficult problem to solve. Ultrasonic cleaning offers new possibilities for cleaning conveyors and other equipment that are not easy to clean. Ultrasonic cleaning was tested on three conveyor belt materials: polypropylene, acetal, and stainless steel (cold-rolled, AISI 304). Cleaning efficiency was tested at two temperatures (30 and 45 degrees C) and two cleaning times (30 and 60 s) with two cleaning detergents (KOH, and NaOH combined with KOH). Conveyor belt materials were soiled with milk-based soil and L. monocytogenes strains V1, V3, and B9, and then incubated for 72 h to attach bacteria to surfaces. Ultrasonic cleaning treatments reduced L. monocytogenes counts on stainless steel 4.61 to 5.90 log units; on acetal, 3.37 to 5.55 log units; and on polypropylene, 2.31 to 4.40 log units. The logarithmic reduction differences were statistically analyzed by analysis of variance using Statistical Package for the Social Sciences software. The logarithmic reduction was significantly greater in stainless steel than in plastic materials (P < 0.001 for polypropylene, P = 0.023 for acetal). Higher temperatures enhanced the cleaning efficiency in tested materials. No significant difference occurred between cleaning times. The logarithmic reduction was significantly higher (P = 0.013) in cleaning treatments with potassium hydroxide detergent. In this study, ultrasonic cleaning was efficient for cleaning conveyor belt materials.
An artificial intelligence approach to lithostratigraphic correlation using geophysical well logs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olea, R.A.; Davis, J.C.
1986-01-01
Computer programs for lithostratigraphic correlation of well logs have achieved limited success. Their algorithms are based on an oversimplified view of the manual process used by analysts to establish geologically correct correlations. The programs experience difficulties if the correlated rocks deviate from an ideal geometry of perfectly homogeneous, parallel layers of infinite extent. Artificial intelligence provides a conceptual basis for formulating the task of lithostratigraphic correlation, leading to more realistic procedures. A prototype system using the ''production rule'' approach of expert systems successfully correlates well logs in areas of stratigraphic complexity. Two digitized logs are used per well, one formore » curve matching and the other for lithologic comparison. The software has been successfully used to correlate more than 100,000 ft (30 480 m) of section, through clastic sequences in Louisiana and through carbonate sequences in Kansas. Correlations have been achieved even in the presence of faults, unconformities, facies changes, and lateral variations in bed thickness.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vazquez Quino, L; Huerta Hernandez, C; Morrow, A
2016-06-15
Purpose: To evaluate the use of MobiusFX as a pre-treatment verification IMRT QA tool and compare it with a commercial 4D detector array for VMAT plan QA. Methods: 15 VMAT plan QA of different treatment sites were delivered and measured by traditional means with the 4D detector array ArcCheck (Sun Nuclear corporation) and at the same time measurement in linac treatment logs (Varian Dynalogs files) were analyzed from the same delivery with MobiusFX software (Mobius Medical Systems). VMAT plan QAs created in Eclipse treatment planning system (Varian) in a TrueBeam linac machine (Varian) were delivered and analyzed with the gammamore » analysis routine from SNPA software (Sun Nuclear corporation). Results: Comparable results in terms of the gamma analysis with 99.06% average gamma passing with 3%,3mm passing rate is observed in the comparison among MobiusFX, ArcCheck measurements, and the Treatment Planning System dose calculated. When going to a stricter criterion (1%,1mm) larger discrepancies are observed in different regions of the measurements with an average gamma of 66.24% between MobiusFX and ArcCheck. Conclusion: This work indicates the potential for using MobiusFX as a routine pre-treatment patient specific IMRT method for quality assurance purposes and its advantages as a phantom-less method which reduce the time for IMRT QA measurement. MobiusFX is capable of produce similar results of those by traditional methods used for patient specific pre-treatment verification VMAT QA. Even the gamma results comparing to the TPS are similar the analysis of both methods show that the errors being identified by each method are found in different regions. Traditional methods like ArcCheck are sensitive to setup errors and dose difference errors coming from the linac output. On the other hand linac log files analysis record different errors in the VMAT QA associated with the MLCs and gantry motion that by traditional methods cannot be detected.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiinoki, T; Hanazawa, H; Shibuya, K
Purpose: The respirato ry gating system combined the TrueBeam and a new real-time tumor-tracking radiotherapy system (RTRT) was installed. The RTRT system consists of two x-ray tubes and color image intensifiers. Using fluoroscopic images, the fiducial marker which was implanted near the tumor was tracked and was used as the internal surrogate for respiratory gating. The purposes of this study was to develop the verification technique of the respiratory gating with the new RTRT using cine electronic portal image device images (EPIDs) of TrueBeam and log files of the RTRT. Methods: A patient who underwent respiratory gated SBRT of themore » lung using the RTRT were enrolled in this study. For a patient, the log files of three-dimensional coordinate of fiducial marker used as an internal surrogate were acquired using the RTRT. Simultaneously, the cine EPIDs were acquired during respiratory gated radiotherapy. The data acquisition was performed for one field at five sessions during the course of SBRT. The residual motion errors were calculated using the log files (E{sub log}). The fiducial marker used as an internal surrogate into the cine EPIDs was automatically extracted by in-house software based on the template-matching algorithm. The differences between the the marker positions of cine EPIDs and digitally reconstructed radiograph were calculated (E{sub EPID}). Results: Marker detection on EPID using in-house software was influenced by low image contrast. For one field during the course of SBRT, the respiratory gating using the RTRT showed the mean ± S.D. of 95{sup th} percentile E{sub EPID} were 1.3 ± 0.3 mm,1.1 ± 0.5 mm,and those of E{sub log} were 1.5 ± 0.2 mm, 1.1 ± 0.2 mm in LR and SI directions, respectively. Conclusion: We have developed the verification method of respiratory gating combined TrueBeam and new real-time tumor-tracking radiotherapy system using EPIDs and log files.« less
Operational field evaluation of the PAC-MAG man-portable magnetometer array
NASA Astrophysics Data System (ADS)
Keranen, Joe; Topolosky, Zeke; Schultz, Gregory; Miller, Jonathan
2013-06-01
Detection and discrimination of unexploded ordnance (UXO) in areas of prior conflict is of high importance to the international community and the United States government. For humanitarian applications, sensors and processing methods need to be robust, reliable, and easy to train and implement using indigenous UXO removal personnel. This paper describes system characterization, system testing, and a continental United States (CONUS) Operational Field Evaluations (OFE) of the PAC-MAG man-portable UXO detection system. System testing occurred at a government test facility in June, 2010 and December, 2011 and the OFE occurred at the same location in June, 2012. NVESD and White River Technologies personnel were present for all testing and evaluation. The PAC-MAG system is a manportable magnetometer array for the detection and characterization of ferrous UXO. System hardware includes four Cesium vapor magnetometers for detection, a Real-time Kinematic Global Position System (RTK-GPS) for sensor positioning, an electronics module for merging array data and WiFi communications and a tablet computer for transmitting and logging data. An odometer, or "hipchain" encoder, provides position information in GPS-denied areas. System software elements include data logging software and post-processing software for detection and characterization of ferrous anomalies. The output of the post-processing software is a dig list containing locations of potential UXO(s), formatted for import into the system GPS equipment for reacquisition of anomalies. Results from system characterization and the OFE will be described.
An experiment in software reliability: Additional analyses using data from automated replications
NASA Technical Reports Server (NTRS)
Dunham, Janet R.; Lauterbach, Linda A.
1988-01-01
A study undertaken to collect software error data of laboratory quality for use in the development of credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications to accumulate a sufficient sample size for error rate estimation. The data collected is used to confirm the results of two Boeing studies reported in NASA-CR-165836 Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378 Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that the program's failure rate is a constant multiple of the number of residual bugs; an assumption which underlies some of the current models of software reliability. data raises new questions concerning the phenomenon of interacting faults.
Warfarin: history, tautomerism and activity
NASA Astrophysics Data System (ADS)
Porter, William R.
2010-06-01
The anticoagulant drug warfarin, normally administered as the racemate, can exist in solution in potentially as many as 40 topologically distinct tautomeric forms. Only 11 of these forms for each enantiomer can be distinguished by selected computational software commonly used to estimate octanol-water partition coefficients and/or ionization constants. The history of studies on warfarin tautomerism is reviewed, along with the implications of tautomerism to its biological properties (activity, protein binding and metabolism) and chemical properties (log P, log D, p K a). Experimental approaches to assessing warfarin tautomerism and computational results for different tautomeric forms are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Z; Vijayan, S; Rana, V
2015-06-15
Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D- mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to readmore » data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the dose from all exposure events are calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.« less
On the Automation of the MarkIII Data Analysis System.
NASA Astrophysics Data System (ADS)
Schwegmann, W.; Schuh, H.
1999-03-01
A faster and semiautomatic data analysis is an important contribution to the acceleration of the VLBI procedure. A concept for the automation of one of the most widely used VLBI software packages the MarkIII Data Analysis System was developed. Then, the program PWXCB, which extracts weather and cable calibration data from the station log-files, was automated supplementing the existing Fortran77 program-code. The new program XLOG and its results will be presented. Most of the tasks in the VLBI data analysis are very complex and their automation requires typical knowledge-based techniques. Thus, a knowledge-based system (KBS) for support and guidance of the analyst is being developed using the AI-workbench BABYLON, which is based on methods of artificial intelligence (AI). The advantages of a KBS for the MarkIII Data Analysis System and the required steps to build a KBS will be demonstrated. Examples about the current status of the project will be given, too.
Genes, age, and alcoholism: analysis of GAW14 data.
Apprey, Victor; Afful, Joseph; Harrell, Jules P; Taylor, Robert E; Bonney, George E
2005-12-30
A genetic analysis of age of onset of alcoholism was performed on the Collaborative Study on the Genetics of Alcoholism data released for Genetic Analysis Workshop 14. Our study illustrates an application of the log-normal age of onset model in our software Genetic Epidemiology Models (GEMs). The phenotype ALDX1 of alcoholism was studied. The analysis strategy was to first find the markers of the Affymetrix SNP dataset with significant association with age of onset, and then to perform linkage analysis on them. ALDX1 revealed strong evidence of linkage for marker tsc0041591 on chromosome 2 and suggestive linkage for marker tsc0894042 on chromosome 3. The largest separation in mean ages of onset of ALDX1 was 19.76 and 24.41 between male smokers who are carriers of the risk allele of tsc0041591 and the non-carriers, respectively. Hence, male smokers who are carriers of marker tsc0041591 on chromosome 2 have an average onset of ALDX1 almost 5 years earlier than non-carriers.
NASA Astrophysics Data System (ADS)
Ozkaya, Sait I.
2018-03-01
Fracture corridors are interconnected large fractures in a narrow sub vertical tabular array, which usually traverse entire reservoir vertically and extended for several hundreds of meters laterally. Fracture corridors with their huge conductivities constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitely with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on AutoCAD platform. The FRACOR toolbox is written in AutoLISPand contains several independent routines to import and integrate available fracture corridor data from an oil field, and export results as text files. The resulting fracture corridor maps consists mainly of fracture corridors with different confidence levels from combination of static and dynamic data and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling programs to generate fracture grid for dual porosity simulation or used for field development and well planning.
ERIC Educational Resources Information Center
Joseph, Linda C.
This training manual written for elementary and secondary school teachers in the Columbus, Ohio, Public Schools, explains how to use the Internet. An introduction explains how computers in different locations can be connected and provides a general description of telecommunications software. How to log in and how to use various Internet tools are…
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology predictions from geophysical data have been developed in the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural-gamma) and density (backscattered gamma-gamma). The prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to the lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans based on statistical analysis. Limited test suggests that PGNAA logging data can be used to predict the lithology. A success rate of 73% for lithology prediction was achieved from PGNAA logging data only. It can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
LIME: 3D visualisation and interpretation of virtual geoscience models
NASA Astrophysics Data System (ADS)
Buckley, Simon; Ringdal, Kari; Dolva, Benjamin; Naumann, Nicole; Kurz, Tobias
2017-04-01
Three-dimensional and photorealistic acquisition of surface topography, using methods such as laser scanning and photogrammetry, has become widespread across the geosciences over the last decade. With recent innovations in photogrammetric processing software, robust and automated data capture hardware, and novel sensor platforms, including unmanned aerial vehicles, obtaining 3D representations of exposed topography has never been easier. In addition to 3D datasets, fusion of surface geometry with imaging sensors, such as multi/hyperspectral, thermal and ground-based InSAR, and geophysical methods, create novel and highly visual datasets that provide a fundamental spatial framework to address open geoscience research questions. Although data capture and processing routines are becoming well-established and widely reported in the scientific literature, challenges remain related to the analysis, co-visualisation and presentation of 3D photorealistic models, especially for new users (e.g. students and scientists new to geomatics methods). Interpretation and measurement is essential for quantitative analysis of 3D datasets, and qualitative methods are valuable for presentation purposes, for planning and in education. Motivated by this background, the current contribution presents LIME, a lightweight and high performance 3D software for interpreting and co-visualising 3D models and related image data in geoscience applications. The software focuses on novel data integration and visualisation of 3D topography with image sources such as hyperspectral imagery, logs and interpretation panels, geophysical datasets and georeferenced maps and images. High quality visual output can be generated for dissemination purposes, to aid researchers with communication of their research results. The background of the software is described and case studies from outcrop geology, in hyperspectral mineral mapping and geophysical-geospatial data integration are used to showcase the novel methods developed.
User Centric Job Monitoring - a redesign and novel approach in the STAR experiment
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.; Zulkarneeva, Y.
2014-06-01
User Centric Monitoring (or UCM) has been a long awaited feature in STAR, whereas programs, workflows and system "events" could be logged, broadcast and later analyzed. UCM allows to collect and filter available job monitoring information from various resources and present it to users in a user-centric view rather than an administrative-centric point of view. The first attempt and implementation of "a" UCM approach was made in STAR 2004 using a log4cxx plug-in back-end and then further evolved with an attempt to push toward a scalable database back-end (2006) and finally using a Web-Service approach (2010, CSW4DB SBIR). The latest showed to be incomplete and not addressing the evolving needs of the experiment where streamlined messages for online (data acquisition) purposes as well as the continuous support for the data mining needs and event analysis need to coexists and unified in a seamless approach. The code also revealed to be hardly maintainable. This paper presents the next evolutionary step of the UCM toolkit, a redesign and redirection of our latest attempt acknowledging and integrating recent technologies and a simpler, maintainable and yet scalable manner. The extended version of the job logging package is built upon three-tier approach based on Task, Job and Event, and features a Web-Service based logging API, a responsive AJAX-powered user interface, and a database back-end relying on MongoDB, which is uniquely suited for STAR needs. In addition, we present details of integration of this logging package with the STAR offline and online software frameworks. Leveraging on the reported experience and work from the ATLAS and CMS experience on using the ESPER engine, we discuss and show how such approach has been implemented in STAR for meta-data event triggering stream processing and filtering. An ESPER based solution seems to fit well into the online data acquisition system where many systems are monitored.
Mørkrid, Lars; Rowe, Alexander D; Elgstoen, Katja B P; Olesen, Jess H; Ruijter, George; Hall, Patricia L; Tortorelli, Silvia; Schulze, Andreas; Kyriakopoulou, Lianna; Wamelink, Mirjam M C; van de Kamp, Jiddeke M; Salomons, Gajja S; Rinaldo, Piero
2015-05-01
Urinary concentrations of creatine and guanidinoacetic acid divided by creatinine are informative markers for cerebral creatine deficiency syndromes (CDSs). The renal excretion of these substances varies substantially with age and sex, challenging the sensitivity and specificity of postanalytical interpretation. Results from 155 patients with CDS and 12 507 reference individuals were contributed by 5 diagnostic laboratories. They were binned into 104 adjacent age intervals and renormalized with Box-Cox transforms (Ξ). Estimates for central tendency (μ) and dispersion (σ) of Ξ were obtained for each bin. Polynomial regression analysis was used to establish the age dependence of both μ[log(age)] and σ[log(age)]. The regression residuals were then calculated as z-scores = {Ξ - μ[log(age)]}/σ[log(age)]. The process was iterated until all z-scores outside Tukey fences ±3.372 were identified and removed. Continuous percentile charts were then calculated and plotted by retransformation. Statistically significant and biologically relevant subgroups of z-scores were identified. Significantly higher marker values were seen in females than males, necessitating separate reference intervals in both adolescents and adults. Comparison between our reconstructed reference percentiles and current standard age-matched reference intervals highlights an underlying risk of false-positive and false-negative events at certain ages. Disease markers depending strongly on covariates such as age and sex require large numbers of reference individuals to establish peripheral percentiles with sufficient precision. This is feasible only through collaborative data sharing and the use of appropriate statistical methods. Broad application of this approach can be implemented through freely available Web-based software. © 2015 American Association for Clinical Chemistry.
A LabVIEW based template for user created experiment automation.
Kim, D J; Fisk, Z
2012-12-01
We have developed an expandable software template to automate user created experiments. The LabVIEW based template is easily modifiable to add together user created measurements, controls, and data logging with virtually any type of laboratory equipment. We use reentrant sequential selection to implement sequence script making it possible to wrap a long series of the user created experiments and execute them in sequence. Details of software structure and application examples for scanning probe microscope and automated transport experiments using custom built laboratory electronics and a cryostat are described.
NASA Astrophysics Data System (ADS)
Mohsin, Muhammad; Mu, Yongtong; Memon, Aamir Mahmood; Kalhoro, Muhammad Talib; Shah, Syed Baber Hussain
2017-07-01
Pakistani marine waters are under an open access regime. Due to poor management and policy implications, blind fishing is continued which may result in ecological as well as economic losses. Thus, it is of utmost importance to estimate fishery resources before harvesting. In this study, catch and effort data, 1996-2009, of Kiddi shrimp Parapenaeopsis stylifera fishery from Pakistani marine waters was analyzed by using specialized fishery software in order to know fishery stock status of this commercially important shrimp. Maximum, minimum and average capture production of P. stylifera was observed as 15 912 metric tons (mt) (1997), 9 438 mt (2009) and 11 667 mt/a. Two stock assessment tools viz. CEDA (catch and effort data analysis) and ASPIC (a stock production model incorporating covariates) were used to compute MSY (maximum sustainable yield) of this organism. In CEDA, three surplus production models, Fox, Schaefer and Pella-Tomlinson, along with three error assumptions, log, log normal and gamma, were used. For initial proportion (IP) 0.8, the Fox model computed MSY as 6 858 mt (CV=0.204, R 2 =0.709) and 7 384 mt (CV=0.149, R 2 =0.72) for log and log normal error assumption respectively. Here, gamma error produced minimization failure. Estimated MSY by using Schaefer and Pella-Tomlinson models remained the same for log, log normal and gamma error assumptions i.e. 7 083 mt, 8 209 mt and 7 242 mt correspondingly. The Schafer results showed highest goodness of fit R 2 (0.712) values. ASPIC computed MSY, CV, R 2, F MSY and B MSY parameters for the Fox model as 7 219 mt, 0.142, 0.872, 0.111 and 65 280, while for the Logistic model the computed values remained 7 720 mt, 0.148, 0.868, 0.107 and 72 110 correspondingly. Results obtained have shown that P. stylifera has been overexploited. Immediate steps are needed to conserve this fishery resource for the future and research on other species of commercial importance is urgently needed.
SU-E-P-05: Electronic Brachytherapy: A Physics Perspective On Field Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pai, S; Ayyalasomayajula, S; Lee, S
2015-06-15
Purpose: We want to summarize our experience implementing a successful program of electronic brachytherapy at several dermatology clinics with the help of a cloud based software to help us define the key program parameters and capture physics QA aspects. Optimally developed software helps the physicist in peer review and qualify the physical parameters. Methods: Using the XOFT™ Axxent™ electronic brachytherapy system in conjunction with a cloud-based software, a process was setup to capture and record treatments. It was implemented initially at about 10 sites in California. For dosimetric purposes, the software facilitated storage of the physics parameters of surface applicatorsmore » used in treatment and other source calibration parameters. In addition, the patient prescription, pathology and other setup considerations were input by radiation oncologist and the therapist. This facilitated physics planning of the treatment parameters and also independent check of the dwell time. From 2013–2014, nearly1500 such calculation were completed by a group of physicists. A total of 800 patients with multiple lesions have been treated successfully during this period. The treatment log files have been uploaded and documented in the software which facilitated physics peer review of treatments per the standards in place by AAPM and ACR. Results: The program model was implemented successfully at multiple sites. The cloud based software allowed for proper peer review and compliance of the program at 10 clinical sites. Dosimtery was done on 800 patients and executed in a timely fashion to suit the clinical needs. Accumulated physics data in the software from the clinics allows for robust analysis and future development. Conclusion: Electronic brachytherapy implementation experience from a quality assurance perspective was greatly enhanced by using a cloud based software. The comprehensive database will pave the way for future developments to yield superior physics outcomes.« less
Computers and Cognitive Development at Work
ERIC Educational Resources Information Center
Roth, Wolff-Michael; Lee, Yew-Jin
2006-01-01
Data-logging exercises in science classrooms assume that with the proper scaffolding and provision of contexts by instructors, pupils are able to meaningfully comprehend the experimental variables under investigation. From a case study of knowing and learning in a fish hatchery using real-time computer statistical software, we show that…
On the Nature of SEM Estimates of ARMA Parameters.
ERIC Educational Resources Information Center
Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.
2002-01-01
Reexamined the nature of structural equation modeling (SEM) estimates of autoregressive moving average (ARMA) models, replicated the simulation experiments of P. Molenaar, and examined the behavior of the log-likelihood ratio test. Simulation studies indicate that estimates of ARMA parameters observed with SEM software are identical to those…
Techtalk: Telecommunications for Improving Developmental Education.
ERIC Educational Resources Information Center
Caverly, David C.; Broderick, Bill
1993-01-01
Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells. These tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File SystemJ software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance lookup of information about IP addresses for a network of computers.
NASA Astrophysics Data System (ADS)
Zhang, Y.; Hu, C.; Wang, M.
2017-12-01
The evaluation of total organic carbon (TOC) in shale using logging data is one of the most crucial steps in shale gas exploration. However, it didn't achieve the ideal effect for the application of `ΔlogR' method in the Longmaxi Formation shale of Sichuan Basin.The reason may be the organic matter carbonization in Longmaxi Formation. An improved evaluation method, using the classification by lithology and sedimentary structure: 1) silty mudstone (wellsite logging data show silty); 2) calcareous mudstone (calcareous content > 25%); 3) laminated mudstone (laminations are recognized by core and imaging logging technology); 4) massive mudstone (massive textures are recognized by core and imaging logging technology, was proposed. This study compares two logging evaluation methods for measuring TOC in shale: the △logR method and the new proposed method. The results showed that the correlation coefficient between the calculated TOC and the tested TOC, based on the △logR method, was only 0.17. The correlation coefficient obtained according to the new method reached 0.80. The calculation results illustrated that, because of the good correlation between lithologies and sedimentary structure zones and TOC of different types of shale, the shale reservoirs could be graded according to four shale types. The new proposed method is more efficient, faster, and has higher vertical resolution than the △logR method. In addition, a new software had been completed. It was found to be especially effective under conditions of insufficient data during the early stages of shale gas exploration in the Silurian Longmaxi Formation, Muai Syncline Belt, south of the Sichuan Basin.
ERIC Educational Resources Information Center
Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm
2016-01-01
Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intraspot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
Ebe, Kazuyu; Sugimoto, Satoru; Utsunomiya, Satoru; Kagamu, Hiroshi; Aoyama, Hidefumi; Court, Laurence; Tokuyama, Katsuichi; Baba, Ryuta; Ogihara, Yoshisada; Ichikawa, Kosuke; Toyama, Joji
2015-08-01
To develop and evaluate a new video image-based QA system, including in-house software, that can display a tracking state visually and quantify the positional accuracy of dynamic tumor tracking irradiation in the Vero4DRT system. Sixteen trajectories in six patients with pulmonary cancer were obtained with the ExacTrac in the Vero4DRT system. Motion data in the cranio-caudal direction (Y direction) were used as the input for a programmable motion table (Quasar). A target phantom was placed on the motion table, which was placed on the 2D ionization chamber array (MatriXX). Then, the 4D modeling procedure was performed on the target phantom during a reproduction of the patient's tumor motion. A substitute target with the patient's tumor motion was irradiated with 6-MV x-rays under the surrogate infrared system. The 2D dose images obtained from the MatriXX (33 frames/s; 40 s) were exported to in-house video-image analyzing software. The absolute differences in the Y direction between the center of the exposed target and the center of the exposed field were calculated. Positional errors were observed. The authors' QA results were compared to 4D modeling function errors and gimbal motion errors obtained from log analyses in the ExacTrac to verify the accuracy of their QA system. The patients' tumor motions were evaluated in the wave forms, and the peak-to-peak distances were also measured to verify their reproducibility. Thirteen of sixteen trajectories (81.3%) were successfully reproduced with Quasar. The peak-to-peak distances ranged from 2.7 to 29.0 mm. Three trajectories (18.7%) were not successfully reproduced due to the limited motions of the Quasar. Thus, 13 of 16 trajectories were summarized. The mean number of video images used for analysis was 1156. The positional errors (absolute mean difference + 2 standard deviation) ranged from 0.54 to 1.55 mm. The error values differed by less than 1 mm from 4D modeling function errors and gimbal motion errors in the ExacTrac log analyses (n = 13). The newly developed video image-based QA system, including in-house software, can analyze more than a thousand images (33 frames/s). Positional errors are approximately equivalent to those in ExacTrac log analyses. This system is useful for the visual illustration of the progress of the tracking state and for the quantification of positional accuracy during dynamic tumor tracking irradiation in the Vero4DRT system.
[Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].
Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L
2017-03-10
To evaluate the estimation of prevalence ratio ( PR ) by using bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence to caregivers' recognition of risk signs of diarrhea in their infants by using bayesian log-binomial regression model in Openbugs software. The results showed that caregivers' recognition of infant' s risk signs of diarrhea was associated significantly with a 13% increase of medical care-seeking. Meanwhile, we compared the differences in PR 's point estimation and its interval estimation of medical care-seeking prevalence to caregivers' recognition of risk signs of diarrhea and convergence of three models (model 1: not adjusting for the covariates; model 2: adjusting for duration of caregivers' education, model 3: adjusting for distance between village and township and child month-age based on model 2) between bayesian log-binomial regression model and conventional log-binomial regression model. The results showed that all three bayesian log-binomial regression models were convergence and the estimated PRs were 1.130(95 %CI : 1.005-1.265), 1.128(95 %CI : 1.001-1.264) and 1.132(95 %CI : 1.004-1.267), respectively. Conventional log-binomial regression model 1 and model 2 were convergence and their PRs were 1.130(95 % CI : 1.055-1.206) and 1.126(95 % CI : 1.051-1.203), respectively, but the model 3 was misconvergence, so COPY method was used to estimate PR , which was 1.125 (95 %CI : 1.051-1.200). In addition, the point estimation and interval estimation of PRs from three bayesian log-binomial regression models differed slightly from those of PRs from conventional log-binomial regression model, but they had a good consistency in estimating PR . Therefore, bayesian log-binomial regression model can effectively estimate PR with less misconvergence and have more advantages in application compared with conventional log-binomial regression model.
NASA Astrophysics Data System (ADS)
Breton, D. J.; Koffman, B. G.; Kreutz, K. J.; Hamilton, G. S.
2010-12-01
Paleoclimate data are often extracted from ice cores by careful geochemical analysis of meltwater samples. The analysis of the microparticles found in ice cores can also yield unique clues about atmospheric dust loading and transport, dust provenance and past environmental conditions. Determination of microparticle concentration, size distribution and chemical makeup as a function of depth is especially difficult because the particle size measurement either consumes or contaminates the meltwater, preventing further geochemical analysis. Here we describe a microcontroller-based ice core melting system which allows the collection of separate microparticle and chemistry samples from the same depth intervals in the ice core, while logging and accurately depth-tagging real-time electrical conductivity and particle size distribution data. This system was designed specifically to support microparticle analysis of the WAIS Divide WDC06A deep ice core, but many of the subsystems are applicable to more general ice core melting operations. Major system components include: a rotary encoder to measure ice core melt displacement with 0.1 millimeter accuracy, a meltwater tracking system to assign core depths to conductivity, particle and sample vial data, an optical debubbler level control system to protect the Abakus laser particle counter from damage due to air bubbles, a Rabbit 3700 microcontroller which communicates with a host PC, collects encoder and optical sensor data and autonomously operates Gilson peristaltic pumps and fraction collectors to provide automatic sample handling, melt monitor control software operating on a standard PC allowing the user to control and view the status of the system, data logging software operating on the same PC to collect data from the melting, electrical conductivity and microparticle measurement systems. Because microparticle samples can easily be contaminated, we use optical air bubble sensors and high resolution ice core density profiles to guide the melting process. The combination of these data allow us to analyze melt head performance, minimize outer-to-inner fraction contamination and avoid melt head flooding. The WAIS Melt Monitor system allows the collection of real-time, sub-annual microparticle and electrical conductivity data while producing and storing enough sample for traditional Coulter-Counter particle measurements as well long term acid leaching of bioactive metals (e.g., Fe, Co, Cd, Cu, Zn) prior to chemical analysis.
Twining, Brian V.; Hodges, Mary K.V.; Orr, Stephanie
2008-01-01
This report summarizes construction, geophysical, and lithologic data collected from ten U.S. Geological Survey (USGS) boreholes completed between 1999 nd 2006 at the Idaho National Laboratory (INL): USGS 126a, 126b, 127, 128, 129, 130, 131, 132, 133, and 134. Nine boreholes were continuously cored; USGS 126b had 5 ft of core. Completion depths range from 472 to 1,238 ft. Geophysical data were collected for each borehole, and those data are summarized in this report. Cores were photographed and digitally logged using commercially available software. Digital core logs are in appendixes A through J. Borehole descriptions summarize location, completion date, and amount and type of core recovered. This report was prepared by the USGS in cooperation with the U.S. Department of Energy (DOE).
The algorithm for automatic detection of the calibration object
NASA Astrophysics Data System (ADS)
Artem, Kruglov; Irina, Ugfeld
2017-06-01
The problem of the automatic image calibration is considered in this paper. The most challenging task of the automatic calibration is a proper detection of the calibration object. The solving of this problem required the appliance of the methods and algorithms of the digital image processing, such as morphology, filtering, edge detection, shape approximation. The step-by-step process of the development of the algorithm and its adopting to the specific conditions of the log cuts in the image's background is presented. Testing of the automatic calibration module was carrying out under the conditions of the production process of the logging enterprise. Through the tests the average possibility of the automatic isolating of the calibration object is 86.1% in the absence of the type 1 errors. The algorithm was implemented in the automatic calibration module within the mobile software for the log deck volume measurement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph
Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impactmore » of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.« less
Ground station software for receiving and handling Irecin telemetry data
NASA Astrophysics Data System (ADS)
Ferrante, M.; Petrozzi, M.; Di Ciolo, L.; Ortenzi, A.; Troso, G
2004-11-01
The on board resources, needed to perform the mission tasks, are very limited in nano-satellites. This paper proposes a software system to receive, manage and process in Real Time the Telemetry data coming from IRECIN nanosatellite and transmit operator manual commands and operative procedures. During the receiving phase, it shows the IRECIN subsystem physical values, visualizes the IRECIN attitude, and performs other suitable functions. The IRECIN Ground Station program is in charge to exchange information between IRECIN and the Ground segment. It carries out, in real time during IRECIN transmission phase, IRECIN attitude drawing, sun direction drawing, power supply received from Sun, visualization of the telemetry data, visualization of Earth magnetic field and more other functions. The received data are memorized and interpreted by a module, parser, and distribute to the suitable modules. Moreover it allows sending manual and automatic commands. Manual commands are delivered by an operator, on the other hand, automatic commands are provided by pre-configured operative procedures. Operative procedures development is realized in a previous phase called configuration phase. This program is also in charge to carry out a test session by mean the scheduler and commanding modules allowing execution of specific tasks without operator control. A log module to memorize received and transmitted data is realized. A phase to analyze, filter and visualize in off line the collected data, called post analysis, is based on the data extraction form the log module. At the same time, the Ground Station Software can work in network allowing managing, receiving and sending data/commands from different sites. The proposed system constitutes the software of IRECIN Ground Station. IRECIN is a modular nanosatellite weighting less than 2 kg, constituted by sixteen external sides with surface-mounted solar cells and three internal Al plates, kept together by four steel bars. Lithium-ions batteries are used. Attitude is determined by two three-axis magnetometers and the solar panels data. Control is provided by an active magnetic control system. The spacecraft will be spin- stabilized with the spin-axis normal to the orbit. All IRECIN electronic components are SMD technology in order to reduce weight and size. The realized Electronic board are completely developed, realized and tested at the Vitrociset S.P.A. under control of Research and Develop Group
Architectural Analysis of Complex Evolving Systems of Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Ackemann, Chris; Yonkwa, Lyly; Ganesan, Dharma
2009-01-01
The goal of this collaborative project between FC-MD, APL, and GSFC and supported by NASA IV&V Software Assurance Research Program (SARP), was to develop a tool, Dynamic SAVE, or Dyn-SAVE for short, for analyzing architectures of systems of systems. The project team was comprised of the principal investigator (PI) from FC-MD and four other FC-MD scientists (part time) and several FC-MD students (full time), as well as, two APL software architects (part time), and one NASA POC (part time). The PI and FC-MD scientists together with APL architects were responsible for requirements analysis, and for applying and evaluating the Dyn-SAVE tool and method. The PI and a group of FC-MD scientists were responsible for improving the method and conducting outreach activities, while another group of FC-MD scientists were responsible for development and improvement of the tool. Oversight and reporting was conducted by the PI and NASA POC. The project team produced many results including several prototypes of the Dyn-SAVE tool and method, several case studies documenting how the tool and method was applied to APL s software systems, and several published papers in highly respected conferences and journals. Dyn-SAVE as developed and enhanced throughout this research period, is a software tool intended for software developers and architects, software integration testers, and persons who need to analyze software systems from the point of view of how it communicates with other systems. Using the tool, the user specifies the planned communication behavior of the system modeled as a sequence diagram. The user then captures and imports the actual communication behavior of the system, which is then converted and visualized as a sequence diagram by Dyn-SAVE. After mapping the planned to the actual and specifying parameter and timing constraints, Dyn-SAVE detects and highlights deviations between the planned and the actual behavior. Requirements based on the need to analyze two inter-system communication protocols that are representative of protocols used in the Aerospace industry have been specified. The protocols are related: APL s Common Ground System (CGS) as used in the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) and the Radiation Belt Space Probes (RBSP) missions. The analyzed communications were implementations of the Telemetry protocol and the CCSDS File Delivery Protocol (CFDP) protocol. Based on these requirements, three prototypes of Dyn-SAVE were developed and applied to these protocols. The application of Dyn-SAVE to these protocols resulted in the detection of several issues. Dyn-SAVE was also applied to several Testbeds that have previously been used for experimentation earlier on this project, as well as, to other protocols and logs for testing its broader applicability. For example, Dyn-SAVE was used to analyze 1) the communication pattern between a web browser and a web server, 2) the system log of a computer in order to detect offnominal computer shut-down behavior, and 3) the actual test cases of NASA Goddard s Core Flight System (CFS) and automatically generated test cases in order to determine the overlap between the two sets of test cases. In all cases, Dyn-SAVE assisted in providing insightful conclusions about each of the cases identified above.
R.D. Ottmar; M.F. Burns; J.N. Hall; A.D. Hanson
1993-01-01
CONSUME is a user-friendly computer program designed for resource managers with some working knowledge of IBM-PC applications. The software predicts the amount of fuel consumption on logged units based on weather data, the amount and fuel moisture of fuels, and a number of other factors. Using these predictions, the resource manager can accurately determine when and...
Computer Simulations: An Integrating Tool.
ERIC Educational Resources Information Center
Bilan, Bohdan J.
This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…
Agentless Cloud-Wide Monitoring of Virtual Disk State
2015-10-01
packages include Apache, MySQL , PHP, Ruby on Rails, Java Application Servers, and many others. Figure 2.12 shows the results of a run of the Software...Linux, Apache, MySQL , PHP (LAMP) set of applications. Thus, many file-level update logs will contain the same versions of files repeated across many
Exploring Dynamical Assessments of Affect, Behavior, and Cognition and Math State Test Achievement
ERIC Educational Resources Information Center
San Pedro, Maria Ofelia Z.; Snow, Erica L.; Baker, Ryan S.; McNamara, Danielle S.; Heffernan, Neil T.
2015-01-01
There is increasing evidence that fine-grained aspects of student performance and interaction within educational software are predictive of long-term learning. Machine learning models have been used to provide assessments of affect, behavior, and cognition based on analyses of system log data, estimating the probability of a student's particular…
Gruber, T
1996-01-01
The author presents guidelines to help a security department select a computer system to track security activities--whether it's a commercial software product, an in-house developed program, or a do-it-yourself designed system. Computerized security activity reporting, he believes, is effective and beneficial.
NASA Astrophysics Data System (ADS)
Puskarczyk, Edyta
2018-03-01
The main goal of the study was to enhance and improve information about the Ordovician and Silurian gas-saturated shale formations. Author focused on: firstly, identification of the shale gas formations, especially the sweet spots horizons, secondly, classification and thirdly, the accurate characterization of divisional intervals. Data set comprised of standard well logs from the selected well. Shale formations are represented mainly by claystones, siltstones, and mudstones. The formations are also partially rich in organic matter. During the calculations, information about lithology of stratigraphy weren't taken into account. In the analysis, selforganizing neural network - Kohonen Algorithm (ANN) was used for sweet spots identification. Different networks and different software were tested and the best network was used for application and interpretation. As a results of Kohonen networks, groups corresponding to the gas-bearing intervals were found. The analysis showed diversification between gas-bearing formations and surrounding beds. It is also shown that internal diversification in sweet spots is present. Kohonen algorithm was also used for geological interpretation of well log data and electrofacies prediction. Reliable characteristic into groups shows that Ja Mb and Sa Fm which are usually treated as potential sweet spots only partially have good reservoir conditions. It is concluded that ANN appears to be useful and quick tool for preliminary classification of members and sweet spots identification.
Development of a case tool to support decision based software development
NASA Technical Reports Server (NTRS)
Wild, Christian J.
1993-01-01
A summary of the accomplishments of the research over the past year are presented. Achievements include: made demonstrations with DHC, a prototype supporting decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery and the porting process model; completed and submitted a paper describing DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, facility necessary for a better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the reengineering problem, and developed a primary chart with the problems involved by the reengineering process.
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages the use of commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third normal form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
Multichannel Networked Phasemeter Readout and Analysis
NASA Technical Reports Server (NTRS)
Edmonds, Karina
2008-01-01
Netmeter software reads a data stream from up to 250 networked phasemeters, synchronizes the data, saves the reduced data to disk (after applying a low-pass filter), and provides a Web server interface for remote control. Unlike older phasemeter software that requires a special, real-time operating system, this program can run on any general-purpose computer. It needs about five percent of the CPU (central processing unit) to process 20 channels because it adds built-in data logging and network-based GUIs (graphical user interfaces) that are implemented in Scalable Vector Graphics (SVG). Netmeter runs on Linux and Windows. It displays the instantaneous displacements measured by several phasemeters at a user-selectable rate, up to 1 kHz. The program monitors the measure and reference channel frequencies. For ease of use, levels of status in Netmeter are color coded: green for normal operation, yellow for network errors, and red for optical misalignment problems. Netmeter includes user-selectable filters up to 4 k samples, and user-selectable averaging windows (after filtering). Before filtering, the program saves raw data to disk using a burst-write technique.
Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool
NASA Astrophysics Data System (ADS)
Gazis, P. R.; Levit, C.; Way, M. J.
2010-12-01
Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.
Introduction to the Space Physics Analysis Network (SPAN)
NASA Technical Reports Server (NTRS)
Green, J. L. (Editor); Peters, D. J. (Editor)
1985-01-01
The Space Physics Analysis Network or SPAN is emerging as a viable method for solving an immediate communication problem for the space scientist. SPAN provides low-rate communication capability with co-investigators and colleagues, and access to space science data bases and computational facilities. The SPAN utilizes up-to-date hardware and software for computer-to-computer communications allowing binary file transfer and remote log-on capability to over 25 nationwide space science computer systems. SPAN is not discipline or mission dependent with participation from scientists in such fields as magnetospheric, ionospheric, planetary, and solar physics. Basic information on the network and its use are provided. It is anticipated that SPAN will grow rapidly over the next few years, not only from the standpoint of more network nodes, but as scientists become more proficient in the use of telescience, more capability will be needed to satisfy the demands.
NASA Astrophysics Data System (ADS)
2007-09-01
WE RECOMMEND Energy Foresight Valuable and original GCSE curriculum support on DVD Developing Scientific Literacy: Using News Media in the Classroom This book helpfully evaluates science stories in today's media Radioactivity Explained and Electricity Explained Interactive software ideal for classroom use TEP Generator Wind-up generator specially designed for schools SEP Energymeter A joule meter with more uses than its appearance suggests Into the Cool: Energy Flow, Thermodynamics and Life This book explores the physics behind biology CmapTools Handy software for mapping knowledge and resources LogIT Black Box This hub contains multiple sensors for endless experimental fun WEB WATCH Water Web 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shiinoki, T; Hanazawa, H; Park, S
2015-06-15
Purpose: We aim to achieve new four-dimensional radiotherapy (4DRT) using the next generation real-time tumor-tracking (RTRT) system and flattening-filter-free techniques. To achieve new 4DRT, it is necessary to understand the respiratory motion of tumor. The purposes of this study were: 1.To develop the respiratory motion analysis tool using log files. 2.To evaluate the reproducibility of tumor motion probability distribution function (PDF) during stereotactic body RT (SBRT) of lung tumor. Methods: Seven patients having fiducial markers closely implanted to the lung tumor were enrolled in this study. The positions of fiducial markers were measured using the RTRT system (Mitsubishi Electronics Co.,more » JP) and recorded as two types of log files during the course of SBRT. For each patients, tumor motion range and tumor motion PDFs in left-right (LR), anterior-posterior (AP) and superior-inferior (SI) directions were calculated using log files of all beams per fraction (PDFn). Fractional PDF reproducibility (Rn) was calculated as Kullback-Leibler (KL) divergence between PDF1 and PDFn of tumor motion. The mean of Rn (Rm) was calculated for each patient and correlated to the patient’s mean tumor motion range (Am). The change of Rm during the course of SBRT was also evluated. These analyses were performed using in-house developed software. Results: The Rm were 0.19 (0.07–0.30), 0.14 (0.07–0.32) and 0.16 (0.09–0.28) in LR, AP and SI directions, respectively. The Am were 5.11 mm (2.58–9.99 mm), 7.81 mm (2.87–15.57 mm) and 11.26 mm (3.80–21.27 mm) in LR, AP and SI directions, respectively. The PDF reproducibility decreased as the tumor motion range increased in AP and SI direction. That decreased slightly through the course of RT in SI direction. Conclusion: We developed the respiratory motion analysis tool for 4DRT using log files and quantified the range and reproducibility of respiratory motion for lung tumors.« less
Bengtsson, Henrik; Jönsson, Göran; Vallon-Christersson, Johan
2004-11-12
Non-linearities in observed log-ratios of gene expressions, also known as intensity dependent log-ratios, can often be accounted for by global biases in the two channels being compared. Any step in a microarray process may introduce such offsets and in this article we study the biases introduced by the microarray scanner and the image analysis software. By scanning the same spotted oligonucleotide microarray at different photomultiplier tube (PMT) gains, we have identified a channel-specific bias present in two-channel microarray data. For the scanners analyzed it was in the range of 15-25 (out of 65,535). The observed bias was very stable between subsequent scans of the same array although the PMT gain was greatly adjusted. This indicates that the bias does not originate from a step preceding the scanner detector parts. The bias varies slightly between arrays. When comparing estimates based on data from the same array, but from different scanners, we have found that different scanners introduce different amounts of bias. So do various image analysis methods. We propose a scanning protocol and a constrained affine model that allows us to identify and estimate the bias in each channel. Backward transformation removes the bias and brings the channels to the same scale. The result is that systematic effects such as intensity dependent log-ratios are removed, but also that signal densities become much more similar. The average scan, which has a larger dynamical range and greater signal-to-noise ratio than individual scans, can then be obtained. The study shows that microarray scanners may introduce a significant bias in each channel. Such biases have to be calibrated for, otherwise systematic effects such as intensity dependent log-ratios will be observed. The proposed scanning protocol and calibration method is simple to use and is useful for evaluating scanner biases or for obtaining calibrated measurements with extended dynamical range and better precision. The cross-platform R package aroma, which implements all described methods, is available for free from http://www.maths.lth.se/bioinformatics/.
Thresholds of logging intensity to maintain tropical forest biodiversity.
Burivalova, Zuzana; Sekercioğlu, Cağan Hakkı; Koh, Lian Pin
2014-08-18
Primary tropical forests are lost at an alarming rate, and much of the remaining forest is being degraded by selective logging. Yet, the impacts of logging on biodiversity remain poorly understood, in part due to the seemingly conflicting findings of case studies: about as many studies have reported increases in biodiversity after selective logging as have reported decreases. Consequently, meta-analytical studies that treat selective logging as a uniform land use tend to conclude that logging has negligible effects on biodiversity. However, selectively logged forests might not all be the same. Through a pantropical meta-analysis and using an information-theoretic approach, we compared and tested alternative hypotheses for key predictors of the richness of tropical forest fauna in logged forest. We found that the species richness of invertebrates, amphibians, and mammals decreases as logging intensity increases and that this effect varies with taxonomic group and continental location. In particular, mammals and amphibians would suffer a halving of species richness at logging intensities of 38 m(3) ha(-1) and 63 m(3) ha(-1), respectively. Birds exhibit an opposing trend as their total species richness increases with logging intensity. An analysis of forest bird species, however, suggests that this pattern is largely due to an influx of habitat generalists into heavily logged areas while forest specialist species decline. Our study provides a quantitative analysis of the nuanced responses of species along a gradient of logging intensity, which could help inform evidence-based sustainable logging practices from the perspective of biodiversity conservation. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Penasa, Luca; Franceschi, Marco; Preto, Nereo; Girardeau-Montaut, Daniel
2015-04-01
Three-dimensional Virtual Outcrop Models (VOMs), often produced using terrestrial laser scanning or photogrammetry, have become popular in the Geosciences. The main feature of a VOM is that it allows for a quantification of the 3D geometry and/or distribution of geologic features that range from rock properties to structural elements. This actually generated much of the interest in VOMs by the oil and gas industry. The potential importance of a VOM in stratigraphy, however, does not seems completely disclosed yet. Indeed outcrops are the primary sources of data for a number of stratigraphic studies (e.g. palaeontology, sedimentology, cyclostratigraphy, geochemistry...). All the observations are typically reported on stratigraphic logs which constitute an idealized representation of the stratigraphic series, drawn by the researcher on the basis of the features that has to be highlighted. The observations are localized by means of manual measurements and a certain amount of subjectivity in log drawing is involved. These facts can prevent the log from being properly pinned to the real outcrop. Moreover, the integration of stratigraphic logs made by different researchers studying the same outcrop may be difficult. The exposure conditions of outcrops can change through time, to the point that they can become unaccessible or even be destroyed. In such a case, linking the stratigraphic log to its physical counterpart becomes impossible. This can be particularly relevant when a classical outcrop or even a GSSP is considered. A VOM may prove useful to tackle these issues, by providing a more objective stratigraphic reference for measurements and by preserving an outcrop through time as a visual representation, thus permitting reference and accurate comparison between observations made through time. Finally, a VOM itself may contain relevant stratigraphic information (e.g. scalar fields associated with the point cloud as intensity, rgb data or hyperspectral information from passive remote sensing devices). This information requires to be merged with geological data collected in the field, in a consistent and reproducible way. We present Vombat, a proof-of-concept of open-source software to illustrate some of the possibilities in terms of information storage, visualization and exploitation of outcrop stratigraphic information. Our solution integrates with CloudCompare, a software that permits to visualize and edit point clouds. A dedicated algorithm estimates stratigraphic attitudes from point cloud data, without the need of exposed planar bedding surfaces. These attitudes can be used to define a virtual stratigraphic section. Composite sections can then be realized defining stratigraphic constraints between different reference frames. Any observation can be displayed in a stratigraphic framework that is directly generated from a VOM. The virtual outcrop, the samples and the stratigraphic reference frames can be saved into an XML file. In the future, the adoption of a standard format (e.g. GeoSciML) will permit easier exchange of stratigraphic data among researchers. The software constitutes a first step towards the full exploitation of VOMs in stratigraphy, is stored at http://github.com/luca-penasa/vombat and is open source. Comments and suggestions are most welcome and will help focusing and refining the software and its tools.
Siochi, R
2012-06-01
To develop a quality initiative discovery framework using process improvement techniques, software tools and operating principles. Process deviations are entered into a radiotherapy incident reporting database. Supervisors use an in-house Event Analysis System (EASy) to discuss incidents with staff. Major incidents are analyzed with an in-house Fault Tree Analysis (FTA). A meta-Analysis is performed using association, text mining, key word clustering, and differential frequency analysis. A key operating principle encourages the creation of forcing functions via rapid application development. 504 events have been logged this past year. The results for the key word analysis indicate that the root cause for the top ranked key words was miscommunication. This was also the root cause found from association analysis, where 24% of the time that an event involved a physician it also involved a nurse. Differential frequency analysis revealed that sharp peaks at week 27 were followed by 3 major incidents, two of which were dose related. The peak was largely due to the front desk which caused distractions in other areas. The analysis led to many PI projects but there is still a major systematic issue with the use of forms. The solution we identified is to implement Smart Forms to perform error checking and interlocking. Our first initiative replaced our daily QA checklist with a form that uses custom validation routines, preventing therapists from proceeding with treatments until out of tolerance conditions are corrected. PITSTOP has increased the number of quality initiatives in our department, and we have discovered or confirmed common underlying causes of a variety of seemingly unrelated errors. It has motivated the replacement of all forms with smart forms. © 2012 American Association of Physicists in Medicine.
Petrophysical evaluation of subterranean formations
Klein, James D; Schoderbek, David A; Mailloux, Jason M
2013-05-28
Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NRM logs, or any combination or subset thereof.
Open source software to control Bioflo bioreactors.
Burdge, David A; Libourel, Igor G L
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
Open Source Software to Control Bioflo Bioreactors
Burdge, David A.; Libourel, Igor G. L.
2014-01-01
Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32 and 64 bit windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828
nStudy: A System for Researching Information Problem Solving
ERIC Educational Resources Information Center
Winne, Philip H.; Nesbit, John C.; Popowich, Fred
2017-01-01
A bottleneck in gathering big data about learning is instrumentation designed to record data about processes students use to learn and information on which those processes operate. The software system nStudy fills this gap. nStudy is an extension to the Chrome web browser plus a server side database for logged trace data plus peripheral modules…
USE OF ROUGH SETS AND SPECTRAL DATA FOR BUILDING PREDICTIVE MODELS OF REACTION RATE CONSTANTS
A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase min-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters ...
Holistic Framework For Establishing Interoperability of Heterogeneous Software Development Tools
2003-06-01
cannot be obtained or is zero while in autocontrol , CARA will terminate auto-control 1.637 16 5 FEAT12 EMF Present, Log Manual Mode If back EMF...auto- control mode a ’Terminate Autocontrol ’ button should be made available. 16.37 9 5 300 FEAT Tag Name Requirement Text AHP Priority Rqts
NASA Astrophysics Data System (ADS)
Thieman, J.; Higgins, C.; Lauffer, G.; Ulivastro, R.; Flagg, R.; Sky, J.
2003-04-01
The Radio JOVE project (http://radiojove.gsfc.nasa.gov) began over four years ago as an education-centered program to inspire secondary school students' interest in space science through hands-on radio astronomy. Students build a radio receiver and antenna kit capable of receiving Jovian, solar, and galactic emissions at a frequency of 20.1 MHz. More than 500 of these kits have been distributed to students and interested observers (ages 10 through adult) in 24 countries. Many students and teachers do not have the time or feel comfortable building a kit of their own. The Radio JOVE project has made it possible to monitor data and streaming audio from professional radio telescopes in Florida (16 element 10-40 MHz log spiral array - http://jupiter.kochi-ct.jp) and Hawaii (17-30 MHz log periodic antenna - http://jupiter.wcc.hawaii.edu/newradiojove/main.html) using standard web browsers and/or freely downloadable software. Radio-Skypipe software (http://radiosky.com) emulates a chart recorder for ones own radio telescope. It will also display the signals being received by other observers worldwide who send out their data over the Internet using the same software package. A built-in chat feature allows the users to discuss their observations and results in real time. New software is being developed to allow network users to interactively view a multi-frequency spectroscopic display of the Hawaii radio telescope. This software may also be useful for research applications. Observers in the U.S. and Europe have been contributing data to a central archive of Jupiter and Solar observations (http://jovearchive.gsfc.nasa.gov/). We believe these data to be of value to the research community and would like to have students more directly connected to ongoing research projects to enhance their interest in participating. We welcome ideas for expanding the application of these data.
NASA Technical Reports Server (NTRS)
Scott, David W.; Underwood, Debrah (Technical Monitor)
2002-01-01
At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the other reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console position, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.
NASA Astrophysics Data System (ADS)
Neighbour, Gordon
2013-04-01
In 2012 Computing and Information Technology was disapplied from the English National Curriculum and therefore no longer has a compulsory programme of study. Data logging and data modelling are still essential components of the curriculum in the Computing and Information Technology classroom. Once the students have mastered the basics of both spreadsheet and information handling software they need to be further challenged. All too often the data used with relation to data-logging and data-handling is not realistic enough to really challenge very able students. However, using data from seismology allows students to manipulate "real" data and enhances their experience of geo-science, developing their skills and then allowing them to build on this work in both the science and geography classroom. This new scheme of work "Seismology at School" has allowed the students to work and develop skills beyond those normally expected for their age group and has allowed them to better appreciate their learning experience of "Natural Hazards" in the science and geography classroom in later years. The students undertake research to help them develop their understanding of earthquakes. This includes using materials from other nations within the European Economic Area, to also develop and challenge their use of Modern Foreign Languages. They are then challenged to create their own seismometers using simple kits and 'free' software - this "problem-solving" approach to their work is designed to enhance team-work and to extend the challenge they experience in the classroom. The students are then are asked to manipulate a "real" set of data using international earthquake data from the most recent whole year. This allows the students to make use of many of the analytical and statistical functions of both spreadsheet software and information handling software in a meaningful way. The students will need to have developed a hypothesis which their work should have provided either validation for or against. They are required to document their progress throughout the project and submit their work as an electronic portfolio for marking and this thus challenges their organisational abilities. Finally through the project it is hoped to develop and extend partnerships with other schools in the European Economic Area so that the students are able to work with students from these areas to further appreciate the teaching of "Natural Hazards" in other cultures within the EEA.
Hierarchy Software Development Framework (h-dp-fwk) project
NASA Astrophysics Data System (ADS)
Zaytsev, A.
2010-04-01
Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in a batch mode. The history of design and development activities devoted to the project has begun in March 2005 and from the very beginning it was targeting the case of building experimental data processing applications for the CMD-3 experiment which is being commissioned at Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of modular data processing application operating within the well defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, both dynamic and static module chains support, internal DSO version and consistency checking, well defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and planned to be ported to other platforms as well. The project is provided with the comprehensive set of technical documentation and users' guides. The licensing schema for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are to be implemented yet the project is considered ready for public use and creating applications in various fields including development of events reconstruction software for small and moderate scale HEP experiments.
New Mexico Play Fairway Analysis: Gamma Ray Logs and Heat Generation Calculations for SW New Mexico
Shari Kelley
2015-10-23
For the New Mexico Play fairway Analysis project, gamma ray geophysical well logs from oil wells penetrating the Proterozoic basement in southwestern New Mexico were digitized. Only the portion of the log in the basement was digitized. The gamma ray logs are converted to heat production using the equation (Bucker and Rybach, 1996) : A[µW/m3] = 0.0158 (Gamma Ray [API] – 0.8).
Guidance and Control Software Project Data - Volume 3: Verification Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.
TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, P; Patankar, A; Etmektzoglou, A
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas to new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory log file based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam readable XML file either from scratch or from pre-existing DICOM-RT plans. DICOM-RT plan is first converted to XML format and then researcher can interactively modify or add control points to them. Delivered beam can be verifiedmore » via reading generated images and analyzing trajectory log files. Image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes. The analyzer gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available and its advantages versus standalone software are i) No software installation or maintenance needed, ii) easy accessibility across all devices iii) seamless upgrades and iv) OS independence. Veritas is written using open-source tools like twitter bootstrap, jQuery, flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am full time employee at Varian Medical systems, Palo Alto.« less
Redelinghuys, Michelle; Norton, Gavin R; Maseko, Muzi J; Majane, Olebogeng H I; Woodiwiss, Angela J
2012-06-01
To compare the intra-familial aggregation and heritability of central (aortic) (PPc) versus peripheral (brachial) (PPp) pulse pressure after imputing pretreatment blood pressures (BPs) in treated participants in a community of black African ancestry. Central PPc [generalized transfer function (GTF) and radial P2-derived] was determined with applanation tonometry at the radial artery (SphygmoCor software) in 946 participants from 258 families with 23 families including three generations from an urban developing community of black Africans. In the 24.1% of participants receiving antihypertensive treatment, pretreatment brachial BP was imputed from published overall averaged effects of therapy grouped by class and dose, specific for groups of black African descent. From these data PPc was estimated from proportionate differences in central aortic and brachial PP. Heritability estimates were determined from SAGE software. Echocardiography was evaluated in 507 participants in order to determine stroke volume. With adjustments for confounders, parent-child (P < 0.05) and sibling-sibling (P < 0.0005) correlations were noted for log PPc, whilst for log PPp only sibling-sibling correlations were noted. No mother-father correlations were noted for either PPc or PPp. Independent of confounders the heritability for log GTF-derived (h = 0.33 ± 0.07, P < 0.0001) and P2-derived (h = 0.30 ± 0.07, P < 0.0001) PPc was greater than the heritability for log PPp (h = 0.11 ± 0.06, P < 0.05) (P < 0.05 for comparison of heritability estimates). After imputing pretreatment BP values, central aortic PP is significantly more inherited than brachial PP. These data suggest that in groups of African descent the genetic determinants of PP may be underestimated when employing brachial rather than central aortic PP measurements.
NASA Astrophysics Data System (ADS)
Fussi, F. Fabio; Fumagalli, Letizia; Fava, Francesco; Di Mauro, Biagio; Kane, Cheik Hamidou; Niang, Magatte; Wade, Souleye; Hamidou, Barry; Colombo, Roberto; Bonomi, Tullia
2017-12-01
A method is proposed that uses analysis of borehole stratigraphic logs for the characterization of shallow aquifers and for the assessment of areas suitable for manual drilling. The model is based on available borehole-log parameters: depth to hard rock, depth to water, thickness of laterite and hydraulic transmissivity of the shallow aquifer. The model is applied to a study area in northwestern Senegal. A dataset of boreholes logs has been processed using a software package (TANGAFRIC) developed during the research. After a manual procedure to assign a standard category describing the lithological characteristics, the next step is the automated extraction of different textural parameters and the estimation of hydraulic conductivity using reference values available in the literature. The hydraulic conductivity values estimated from stratigraphic data have been partially validated, by comparing them with measured values from a series of pumping tests carried out in large-diameter wells. The results show that this method is able to produce a reliable interpretation of the shallow hydrogeological context using information generally available in the region. The research contributes to improving the identification of areas where conditions are suitable for manual drilling. This is achieved by applying the described method, based on a structured and semi-quantitative approach, to classify the zones of suitability for given manual drilling techniques using data available in most African countries. Ultimately, this work will support proposed international programs aimed at promoting low-cost water supply in Africa and enhancing access to safe drinking water for the population.
Log-normal frailty models fitted as Poisson generalized linear mixed models.
Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver
2016-12-01
The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known since decades. As shown in recent studies, this equivalence carries over to clustered survival data: A frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model. Our new %PCFrailty macro provides proper estimates, especially in case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data with respect to more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gordeev, V. F.; Malyshkov, S. Yu.; Botygin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.
2017-11-01
The general trend of modern ecological geophysics is changing priorities towards rapid assessment, management and prediction of ecological and engineering soil stability as well as developing brand new geophysical technologies. The article describes researches conducted by using multi-canal geophysical logger MGR-01 (developed by IMCES SB RAS), which allows to measure flux density of very low-frequency electromagnetic radiation. It is shown that natural pulsed electromagnetic fields of the earthen lithosphere can be a source of new information on Earth's crust and processes in it, including earthquakes. The device is intended for logging electromagnetic processes in Earth's crust, geophysical exploration, finding structural and lithological inhomogeneities, monitoring the geodynamic movement of Earth's crust, express assessment of seismic hazards. The data is gathered automatically from observation point network in Siberia
Bolaños, Federico; LeDue, Jeff M; Murphy, Timothy H
2017-01-30
Automation of animal experimentation improves consistency, reduces potential for error while decreasing animal stress and increasing well-being. Radio frequency identification (RFID) tagging can identify individual mice in group housing environments enabling animal-specific tracking of physiological parameters. We describe a simple protocol to radio frequency identification (RFID) tag and detect mice. RFID tags were injected sub-cutaneously after brief isoflurane anesthesia and do not require surgical steps such as suturing or incisions. We employ glass-encapsulated 125kHz tags that can be read within 30.2±2.4mm of the antenna. A raspberry pi single board computer and tag reader enable automated logging and cross platform support is possible through Python. We provide sample software written in Python to provide a flexible and cost effective system for logging the weights of multiple mice in relation to pre-defined targets. The sample software can serve as the basis of any behavioral or physiological task where users will need to identify and track specific animals. Recently, we have applied this system of tagging to automated mouse brain imaging within home-cages. We provide a cost effective solution employing open source software to facilitate adoption in applications such as automated imaging or tracking individual animal weights during tasks where food or water restriction is employed as motivation for a specific behavior. Copyright © 2016 Elsevier B.V. All rights reserved.
Data Management Applications for the Service Preparation Subsystem
NASA Technical Reports Server (NTRS)
Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.;
2009-01-01
These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.
NASA Astrophysics Data System (ADS)
Blume, F.; Berglund, H.; Estey, L.
2012-12-01
In December 2005, the L2C signal was introduced to improve the accuracy, tracking and redundancy of the GPS system for civilian users. The L2C signal also provides improved SNR data when compared with the L2P(Y) legacy signal. However, GNSS network operators have been hesitant to use the new signal as it is not well determined how positions derived from L2 carrier phase measurements are affected. L2C carrier phase is in quadrature with L2P(Y); some manufacturers correct for this when logging L2C phase while others do not. In cases where both L2C and L2P(Y) are logged simultaneously, translation software must be used carefully in order to select which phase is used in positioning. Modifications were made to UNAVCO's teqc pre-processing software to eliminate confusion, however GNSS networks such as the IGS still suffer occasional data loss due to improperly configured GPS receivers or data flow routines. To date L2C analyses have been restricted to special applications such as snow depth and soil moisture using SNR data, as some high-precision data analysis packages are not compatible with L2C. We use several different methods to determine the effect that tracking and logging L2C has on carrier phase measurements and positioning for various receiver models and configurations. Twenty-four hour zero-length baseline solutions using L2 show sub- millimeter differences in mean positions for both horizontal and vertical components. Direct comparisons of the L2 phase observable from RINEX files with and without the L2C observable show sub-millicycle differences. The magnitude of the variations increased at low elevations. The behavior of the L2P(Y) phase observations or positions from a given receiver were not affected by the enabling of L2C tracking. We find that the use of the L2C-derived carrier phase in real-time applications can be disastrous in cases where receiver brands are mixed between those that correct for quadrature and those that do not (Figure 1). Until standards are implemented for universal phase corrections in either receivers or software the use of L2C should be avoided by real-time network operators. The complexity involved in the adoption of a single new signal on an existing GPS frequency over a period of 7 years has implications for the use of multi-GNSS systems and modernized GPS in geodetic networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebe, Kazuyu, E-mail: nrr24490@nifty.com; Tokuyama, Katsuichi; Baba, Ryuta
Purpose: To develop and evaluate a new video image-based QA system, including in-house software, that can display a tracking state visually and quantify the positional accuracy of dynamic tumor tracking irradiation in the Vero4DRT system. Methods: Sixteen trajectories in six patients with pulmonary cancer were obtained with the ExacTrac in the Vero4DRT system. Motion data in the cranio–caudal direction (Y direction) were used as the input for a programmable motion table (Quasar). A target phantom was placed on the motion table, which was placed on the 2D ionization chamber array (MatriXX). Then, the 4D modeling procedure was performed on themore » target phantom during a reproduction of the patient’s tumor motion. A substitute target with the patient’s tumor motion was irradiated with 6-MV x-rays under the surrogate infrared system. The 2D dose images obtained from the MatriXX (33 frames/s; 40 s) were exported to in-house video-image analyzing software. The absolute differences in the Y direction between the center of the exposed target and the center of the exposed field were calculated. Positional errors were observed. The authors’ QA results were compared to 4D modeling function errors and gimbal motion errors obtained from log analyses in the ExacTrac to verify the accuracy of their QA system. The patients’ tumor motions were evaluated in the wave forms, and the peak-to-peak distances were also measured to verify their reproducibility. Results: Thirteen of sixteen trajectories (81.3%) were successfully reproduced with Quasar. The peak-to-peak distances ranged from 2.7 to 29.0 mm. Three trajectories (18.7%) were not successfully reproduced due to the limited motions of the Quasar. Thus, 13 of 16 trajectories were summarized. The mean number of video images used for analysis was 1156. The positional errors (absolute mean difference + 2 standard deviation) ranged from 0.54 to 1.55 mm. The error values differed by less than 1 mm from 4D modeling function errors and gimbal motion errors in the ExacTrac log analyses (n = 13). Conclusions: The newly developed video image-based QA system, including in-house software, can analyze more than a thousand images (33 frames/s). Positional errors are approximately equivalent to those in ExacTrac log analyses. This system is useful for the visual illustration of the progress of the tracking state and for the quantification of positional accuracy during dynamic tumor tracking irradiation in the Vero4DRT system.« less
NASA Technical Reports Server (NTRS)
Lala, J. H.; Smith, T. B., III
1983-01-01
The software developed for the Fault-Tolerant Multiprocessor (FTMP) is described. The FTMP executive is a timer-interrupt driven dispatcher that schedules iterative tasks which run at 3.125, 12.5, and 25 Hz. Major tasks which run under the executive include system configuration control, flight control, and display. The flight control task includes autopilot and autoland functions for a jet transport aircraft. System Displays include status displays of all hardware elements (processors, memories, I/O ports, buses), failure log displays showing transient and hard faults, and an autopilot display. All software is in a higher order language (AED, an ALGOL derivative). The executive is a fully distributed general purpose executive which automatically balances the load among available processor triads. Provisions for graceful performance degradation under processing overload are an integral part of the scheduling algorithms.
Using Web Server Logs in Evaluating Instructional Web Sites.
ERIC Educational Resources Information Center
Ingram, Albert L.
2000-01-01
Web server logs contain a great deal of information about who uses a Web site and how they use it. This article discusses the analysis of Web logs for instructional Web sites; reviews the data stored in most Web server logs; demonstrates what further information can be gleaned from the logs; and discusses analyzing that information for the…
Browser-Based Online Applications: Something for Everyone!
ERIC Educational Resources Information Center
Descy, Don E.
2007-01-01
Just as many people log onto a Web mail site (Gmail, Yahoo, MSN, etc.) to read, write and store their email, there are Web sites out there with word processing, database, and a myriad of other software applications that are not downloadable but used on the site through a Web browser. The user does not have to download the applications to a…
Logistics Force Planner Assistant (Log Planner)
1989-09-01
elements. The system is implemented on a MS-DOS based microcomputer, using the "Knowledge Pro’ software tool., 20 DISTRIBUTION/AVAILABILITY OF... service support structure. 3. A microcomputer-based knowledge system was developed and successfully demonstrated. Four modules of information are...combat service support (CSS) units planning process to Army Staff logistics planners. Personnel newly assigned to logistics planning need an
Electrofacies analysis for coal lithotype profiling based on high-resolution wireline log data
NASA Astrophysics Data System (ADS)
Roslin, A.; Esterle, J. S.
2016-06-01
The traditional approach to coal lithotype analysis is based on a visual characterisation of coal in core, mine or outcrop exposures. As not all wells are fully cored, the petroleum and coal mining industries increasingly use geophysical wireline logs for lithology interpretation.This study demonstrates a method for interpreting coal lithotypes from geophysical wireline logs, and in particular discriminating between bright or banded, and dull coal at similar densities to a decimetre level. The study explores the optimum combination of geophysical log suites for training the coal electrofacies interpretation, using neural network conception, and then propagating the results to wells with fewer wireline data. This approach is objective and has a recordable reproducibility and rule set.In addition to conventional gamma ray and density logs, laterolog resistivity, microresistivity and PEF data were used in the study. Array resistivity data from a compact micro imager (CMI tool) were processed into a single microresistivity curve and integrated with the conventional resistivity data in the cluster analysis. Microresistivity data were tested in the analysis to test the hypothesis that the improved vertical resolution of microresistivity curve can enhance the accuracy of the clustering analysis. The addition of PEF log allowed discrimination between low density bright to banded coal electrofacies and low density inertinite-rich dull electrofacies.The results of clustering analysis were validated statistically and the results of the electrofacies results were compared to manually derived coal lithotype logs.
SU-E-T-261: Plan Quality Assurance of VMAT Using Fluence Images Reconstituted From Log-Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katsuta, Y; Shimizu, E; Matsunaga, K
2014-06-01
Purpose: A successful VMAT plan delivery includes precise modulations of dose rate, gantry rotational and multi-leaf collimator (MLC) shapes. One of the main problem in the plan quality assurance is dosimetric errors associated with leaf-positional errors are difficult to analyze because they vary with MU delivered and leaf number. In this study, we calculated integrated fluence error image (IFEI) from log-files and evaluated plan quality in the area of all and individual MLC leaves scanned. Methods: The log-file reported the expected and actual position for inner 20 MLC leaves and the dose fraction every 0.25 seconds during prostate VMAT onmore » Elekta Synergy. These data were imported to in-house software that developed to calculate expected and actual fluence images from the difference of opposing leaf trajectories and dose fraction at each time. The IFEI was obtained by adding all of the absolute value of the difference between expected and actual fluence images corresponding. Results: In the area all MLC leaves scanned in the IFEI, the average and root mean square (rms) were 2.5 and 3.6 MU, the area of errors below 10, 5 and 3 MU were 98.5, 86.7 and 68.1 %, the 95 % of area was covered with less than error of 7.1 MU. In the area individual MLC leaves scanned in the IFEI, the average and rms value were 2.1 – 3.0 and 3.1 – 4.0 MU, the area of errors below 10, 5 and 3 MU were 97.6 – 99.5, 81.7 – 89.5 and 51.2 – 72.8 %, the 95 % of area was covered with less than error of 6.6 – 8.2 MU. Conclusion: The analysis of the IFEI reconstituted from log-file was provided detailed information about the delivery in the area of all and individual MLC leaves scanned.« less
Maran, E; Novic, M; Barbieri, P; Zupan, J
2004-01-01
The present study focuses on fish antibiotics which are an important group of pharmaceuticals used in fish farming to treat infections and, until recently, most of them have been exposed to the environment with very little attention. Information about the environmental behaviour and the description of the environmental fate of medical substances are difficult or expensive to obtain. The experimental information in terms of properties is reported when available, in other cases, it is estimated by standard tools as those provided by the United States Environmental Protection Agency EPISuite software and by custom quantitative structure-activity relationship (QSAR) applications. In this study, a QSAR screening of 15 fish antibiotics and 132 xenobiotic molecules was performed with two aims: (i) to develop a model for the estimation of octanol--water partition coefficient (logP) and (ii) to estimate the relative binding affinity to oestrogen receptor (log RBA) using a model constructed on the activities of 132 xenobiotic compounds. The custom models are based on constitutional, topological, electrostatic and quantum chemical descriptors computed by the CODESSA software. Kohonen neural networks (self organising maps) were used to study similarity between the considered chemicals while counter-propagation artificial neural networks were used to estimate the properties.
ERIC Educational Resources Information Center
Kerr, Deirdre; Chung, Gregory K. W. K.; Iseli, Markus R.
2011-01-01
Analyzing log data from educational video games has proven to be a challenging endeavor. In this paper, we examine the feasibility of using cluster analysis to extract information from the log files that is interpretable in both the context of the game and the context of the subject area. If cluster analysis can be used to identify patterns of…
Konaté, Ahmed Amara; Ma, Huolin; Pan, Heping; Qin, Zhen; Ahmed, Hafizullah Abba; Dembele, N'dji Dit Jacques
2017-10-01
The availability of a deep well that penetrates deep into the Ultra High Pressure (UHP) metamorphic rocks is unusual and consequently offers a unique chance to study the metamorphic rocks. One such borehole is located in the southern part of Donghai County in the Sulu UHP metamorphic belt of Eastern China, from the Chinese Continental Scientific Drilling Main hole. This study reports the results obtained from the analysis of oxide log data. A geochemical logging tool provides in situ, gamma ray spectroscopy measurements of major and trace elements in the borehole. Dry weight percent oxide concentration logs obtained for this study were SiO 2 , K 2 O, TiO 2 , H 2 O, CO 2 , Na 2 O, Fe 2 O 3 , FeO, CaO, MnO, MgO, P 2 O 5 and Al 2 O 3 . Cross plot and Principal Component Analysis methods were applied for lithology characterization and mineralogy description respectively. Cross plot analysis allows lithological variations to be characterized. Principal Component Analysis shows that the oxide logs can be summarized by two components related to the feldspar and hydrous minerals. This study has shown that geochemical logging tool data is accurate and adequate to be tremendously useful in UHP metamorphic rocks analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
Computer work duration and its dependence on the used pause definition.
Richter, Janneke M; Slijper, Harm P; Over, Eelco A B; Frens, Maarten A
2008-11-01
Several ergonomic studies have estimated computer work duration using registration software. In these studies, an arbitrary pause definition (Pd; the minimal time between two computer events to constitute a pause) is chosen and the resulting duration of computer work is estimated. In order to uncover the relationship between the used pause definition and the computer work duration (PWT), we used registration software to record usage patterns of 571 computer users across almost 60,000 working days. For a large range of Pds (1-120 s), we found a shallow, log-linear relationship between PWT and Pds. For keyboard and mouse use, a second-order function fitted the data best. We found that these relationships were dependent on the amount of computer work and subject characteristics. Comparison of exposure duration from studies using different pause definitions should take this into account, since it could lead to misclassification. Software manufacturers and ergonomists assessing computer work duration could use the found relationships for software design and study comparison.
NASA Technical Reports Server (NTRS)
Herskovits, E. H.; Itoh, R.; Melhem, E. R.
2001-01-01
OBJECTIVE: The objective of our study was to determine the effects of MR sequence (fluid-attenuated inversion-recovery [FLAIR], proton density--weighted, and T2-weighted) and of lesion location on sensitivity and specificity of lesion detection. MATERIALS AND METHODS: We generated FLAIR, proton density-weighted, and T2-weighted brain images with 3-mm lesions using published parameters for acute multiple sclerosis plaques. Each image contained from zero to five lesions that were distributed among cortical-subcortical, periventricular, and deep white matter regions; on either side; and anterior or posterior in position. We presented images of 540 lesions, distributed among 2592 image regions, to six neuroradiologists. We constructed a contingency table for image regions with lesions and another for image regions without lesions (normal). Each table included the following: the reviewer's number (1--6); the MR sequence; the side, position, and region of the lesion; and the reviewer's response (lesion present or absent [normal]). We performed chi-square and log-linear analyses. RESULTS: The FLAIR sequence yielded the highest true-positive rates (p < 0.001) and the highest true-negative rates (p < 0.001). Regions also differed in reviewers' true-positive rates (p < 0.001) and true-negative rates (p = 0.002). The true-positive rate model generated by log-linear analysis contained an additional sequence-location interaction. The true-negative rate model generated by log-linear analysis confirmed these associations, but no higher order interactions were added. CONCLUSION: We developed software with which we can generate brain images of a wide range of pulse sequences and that allows us to specify the location, size, shape, and intrinsic characteristics of simulated lesions. We found that the use of FLAIR sequences increases detection accuracy for cortical-subcortical and periventricular lesions over that associated with proton density- and T2-weighted sequences.
NASA Astrophysics Data System (ADS)
Tian, Xiang-Dong
The purpose of this research is to simulate induction and measuring-while-drilling (MWD) logs. In simulation of logs, there are two tasks. The first task, the forward modeling procedure, is to compute the logs from known formation. The second task, the inversion procedure, is to determine the unknown properties of the formation from the measured field logs. In general, the inversion procedure requires the solution of a forward model. In this study, a stable numerical method to simulate induction and MWD logs is presented. The proposed algorithm is based on a horizontal eigenmode expansion method. Vertical propagation of modes is modeled by a three-layer module. The multilayer cases are treated as a cascade of these modules. The mode tracing algorithm possesses stable characteristics that are superior to other methods. This method is applied to simulate the logs in the formations with both vertical and horizontal layers, and also used to study the groove effects of the MWD tool. The results are very good. Two-dimensional inversion of induction logs is an nonlinear problem. Nonlinear functions of the apparent conductivity are expanded into a Taylor series. After truncating the high order terms in this Taylor series, the nonlinear functions are linearized. An iterative procedure is then devised to solve the inversion problem. In each iteration, the Jacobian matrix is calculated, and a small variation computed using the least-squares method is used to modify the background medium. Finally, the inverted medium is obtained. The horizontal eigenstate method is used to solve the forward problem. It is found that a good inverted formation can be obtained by using measurements. In order to help the user simulate the induction logs conveniently, a Wellog Simulator, based on the X-window system, is developed. The application software (FORTRAN codes) embedded in the Simulator is designed to simulate the responses of the induction tools in the layered formation with dipping beds. The graphic user-interface part of the Wellog Simulator is implemented with C and Motif. Through the user interface, the user can prepare the simulation data, select the tools, simulate the logs and plot the results.
Pouillot, Régis; Van Doren, Jane M; Woods, Jacquelina; Plante, Daniel; Smith, Mark; Goblick, Gregory; Roberts, Christopher; Locas, Annie; Hajen, Walter; Stobo, Jeffrey; White, John; Holtzman, Jennifer; Buenaventura, Enrico; Burkhardt, William; Catford, Angela; Edwards, Robyn; DePaola, Angelo; Calci, Kevin R
2015-07-01
Human norovirus (NoV) is the leading cause of foodborne illness in the United States and Canada. Wastewater treatment plant (WWTP) effluents impacting bivalve mollusk-growing areas are potential sources of NoV contamination. We have developed a meta-analysis that evaluates WWTP influent concentrations and log10 reductions of NoV genotype I (NoV GI; in numbers of genome copies per liter [gc/liter]), NoV genotype II (NoV GII; in gc/liter), and male-specific coliphage (MSC; in number of PFU per liter), a proposed viral surrogate for NoV. The meta-analysis included relevant data (2,943 measurements) reported in the scientific literature through September 2013 and previously unpublished surveillance data from the United States and Canada. Model results indicated that the mean WWTP influent concentration of NoV GII (3.9 log10 gc/liter; 95% credible interval [CI], 3.5, 4.3 log10 gc/liter) is larger than the value for NoV GI (1.5 log10 gc/liter; 95% CI, 0.4, 2.4 log10 gc/liter), with large variations occurring from one WWTP to another. For WWTPs with mechanical systems and chlorine disinfection, mean log10 reductions were -2.4 log10 gc/liter (95% CI, -3.9, -1.1 log10 gc/liter) for NoV GI, -2.7 log10 gc/liter (95% CI, -3.6, -1.9 log10 gc/liter) for NoV GII, and -2.9 log10 PFU per liter (95% CI, -3.4, -2.4 log10 PFU per liter) for MSCs. Comparable values for WWTPs with lagoon systems and chlorine disinfection were -1.4 log10 gc/liter (95% CI, -3.3, 0.5 log10 gc/liter) for NoV GI, -1.7 log10 gc/liter (95% CI, -3.1, -0.3 log10 gc/liter) for NoV GII, and -3.6 log10 PFU per liter (95% CI, -4.8, -2.4 PFU per liter) for MSCs. Within WWTPs, correlations exist between mean NoV GI and NoV GII influent concentrations and between the mean log10 reduction in NoV GII and the mean log10 reduction in MSCs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
HS.Register - An Audit-Trail Tool to Respond to the General Data Protection Regulation (GDPR).
Gonçalves-Ferreira, Duarte; Leite, Mariana; Santos-Pereira, Cátia; Correia, Manuel E; Antunes, Luis; Cruz-Correia, Ricardo
2018-01-01
Introduction The new General Data Protection Regulation (GDPR) compels health care institutions and their software providers to properly document all personal data processing and provide clear evidence that their systems are inline with the GDPR. All applications involved in personal data processing should therefore produce meaningful event logs that can later be used for the effective auditing of complex processes. Aim This paper aims to describe and evaluate HS.Register, a system created to collect and securely manage at scale audit logs and data produced by a large number of systems. Methods HS.Register creates a single audit log by collecting and aggregating all kinds of meaningful event logs and data (e.g. ActiveDirectory, syslog, log4j, web server logs, REST, SOAP and HL7 messages). It also includes specially built dashboards for easy auditing and monitoring of complex processes, crossing different systems in an integrated way, as well as providing tools for helping on the auditing and on the diagnostics of difficult problems, using a simple web application. HS.Register is currently installed at five large Portuguese Hospitals and is composed of the following open-source components: HAproxy, RabbitMQ, Elasticsearch, Logstash and Kibana. Results HS.Register currently collects and analyses an average of 93 million events per week and it is being used to document and audit HL7 communications. Discussion Auditing tools like HS.Register are likely to become mandatory in the near future to allow for traceability and detailed auditing for GDPR compliance.
Interpretation of well logs in a carbonate aquifer
MacCary, L.M.
1978-01-01
This report describes the log analysis of the Randolph and Sabial core holes in the Edwards aquifer in Texas, with particular attention to the principles that can be applied generally to any carbonate system. The geologic and hydrologic data were obtained during the drilling of the two holes, from extensive laboratory analysis of the cores, and from numerous geophysical logs run in the two holes. Some logging methods are inherently superiors to others for the analysis of limestone and dolomite aquifers. Three such systems are the dentistry, neutron, and acoustic-velocity (sonic) logs. Most of the log analysis described here is based on the interpretation of suites of logs from these three systems. In certain instances, deeply focused resistivity logs can be used to good advantage in carbonate rock studies; this technique is used to computer the water resistivity in the Randolph core hole. The rocks penetrated by the Randolph core hole are typical of those carbonates that have undergone very little solution by recent ground-water circulation. There are few large solutional openings; the water is saline; and the rocks are dark, dolomitic, have pore space that is interparticle or intercrystalline, and contain unoxidized organic material. The total porosity of rocks in the saline zone is higher than that of rocks in the fresh-water aquifer; however, the intrinsic permeability is much less in the saline zone because there are fewer large solutional openings. The Sabinal core hole penetrates a carbonate environment that has experienced much solution by ground water during recent geologic time. The rocks have high secondary porosities controlled by sedimentary structures within the rock; the water is fresh; and the dominant rock composition is limestone. The relative percentages of limestone and dolomite, the average matrix (grain) densities of the rock mixtures , and the porosity of the rock mass can be calculated from density, neutron, and acoustic logs. With supporting data from resistivity logs, the formation water quality can be estimated, as well as the relative cementation or tortuosity of the rock. Many of these properties calculated from logs can be verified by analysis of the core available from test holes drilled in the saline and fresh water zones.
Thomas Harless; Francis G. Wagner; Phillip Steele; Fred Taylor; Vikram Yadama; Charles W. McMillin
1991-01-01
A precise research methodology is described by which internal log-defect locations may help select hardwood log ortentation and sawing procedure to improve lumber value. Procedures for data collection, data handling, simulated sawing, and data analysis are described. A single test log verified the methodology. Results from this log showed significant differences in...
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiCostanzo, D; Ayan, A; Woollard, J
Purpose: To automate the daily verification of each patient’s treatment by utilizing the trajectory log files (TLs) written by the Varian TrueBeam linear accelerator while reducing the number of false positives including jaw and gantry positioning errors, that are displayed in the Treatment History tab of Varian’s Chart QA module. Methods: Small deviations in treatment parameters are difficult to detect in weekly chart checks, but may be significant in reducing delivery errors, and would be critical if detected daily. Software was developed in house to read TLs. Multiple functions were implemented within the software that allow it to operate viamore » a GUI to analyze TLs, or as a script to run on a regular basis. In order to determine tolerance levels for the scripted analysis, 15,241 TLs from seven TrueBeams were analyzed. The maximum error of each axis for each TL was written to a CSV file and statistically analyzed to determine the tolerance for each axis accessible in the TLs to flag for manual review. The software/scripts developed were tested by varying the tolerance values to ensure veracity. After tolerances were determined, multiple weeks of manual chart checks were performed simultaneously with the automated analysis to ensure validity. Results: The tolerance values for the major axis were determined to be, 0.025 degrees for the collimator, 1.0 degree for the gantry, 0.002cm for the y-jaws, 0.01cm for the x-jaws, and 0.5MU for the MU. The automated verification of treatment parameters has been in clinical use for 4 months. During that time, no errors in machine delivery of the patient treatments were found. Conclusion: The process detailed here is a viable and effective alternative to manually checking treatment parameters during weekly chart checks.« less
Web Log Analysis: A Study of Instructor Evaluations Done Online
ERIC Educational Resources Information Center
Klassen, Kenneth J.; Smith, Wayne
2004-01-01
This paper focuses on developing a relatively simple method for analyzing web-logs. It also explores the challenges and benefits of web-log analysis. The study of student behavior on this site provides insights into website design and the effectiveness of this site in particular. Another benefit realized from the paper is the ease with which these…
NASA Astrophysics Data System (ADS)
Chu, A.
2016-12-01
Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). With a model's log-likelihood function, my software finds the Maximum-Likelihood Estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. EM-algorithm is employed for its advantages of stability and robustness (Veen and Schoenberg, 2008). My work also presents comparisons among the three models in robustness, convergence speed, and implementations from theory to computing practice. Up-to-date regional seismic data of seismic active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done using computer languages Java and R. Java has the advantages of being strong-typed and easiness of controlling memory resources, while R has the advantages of having numerous available functions in statistical computing. Comparisons are also made between the two programming languages in convergence and stability, computational speed, and easiness of implementation. Issues that may affect convergence such as spatial shapes are discussed.
Development of a Multi-frequency Interferometer Telescope for Radio Astronomy (MITRA)
NASA Astrophysics Data System (ADS)
Ingala, Dominique Guelord Kumamputu
2015-03-01
This dissertation describes the development and construction of the Multi-frequency Interferometer Telescope for Radio Astronomy (MITRA) at the Durban University of Technology. The MITRA station consists of 2 antenna arrays separated by a baseline distance of 8 m. Each array consists of 8 Log-Periodic Dipole Antennas (LPDAs) operating from 200 MHz to 800 MHz. The design and construction of the LPDA antenna and receiver system is described. The receiver topology provides an equivalent noise temperature of 113.1 K and 55.1 dB of gain. The Intermediate Frequency (IF) stage was designed to produce a fixed IF frequency of 800 MHz. The digital Back-End and correlator were implemented using a low cost Software Defined Radio (SDR) platform and Gnu-Radio software. Gnu-Octave was used for data analysis to generate the relevant received signal parameters including total power, real, and imaginary, magnitude and phase components. Measured results show that interference fringes were successfully detected within the bandwidth of the receiver using a Radio Frequency (RF) generator as a simulated source. This research was presented at the IEEE Africon 2013 / URSI Session Mauritius, and published in the proceedings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, Jason
WLS gathers all known relevant contextual data along with standard event log information, processes it into an easily consumable format for analysis by 3rd party tools, and forwards the logs to any compatible log server.
Modeling and Evaluation of Geophysical Methods for Monitoring and Tracking CO2 Migration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniels, Jeff
2012-11-30
Geological sequestration has been proposed as a viable option for mitigating the vast amount of CO{sub 2} being released into the atmosphere daily. Test sites for CO{sub 2} injection have been appearing across the world to ascertain the feasibility of capturing and sequestering carbon dioxide. A major concern with full scale implementation is monitoring and verifying the permanence of injected CO{sub 2}. Geophysical methods, an exploration industry standard, are non-invasive imaging techniques that can be implemented to address that concern. Geophysical methods, seismic and electromagnetic, play a crucial role in monitoring the subsurface pre- and post-injection. Seismic techniques have beenmore » the most popular but electromagnetic methods are gaining interest. The primary goal of this project was to develop a new geophysical tool, a software program called GphyzCO2, to investigate the implementation of geophysical monitoring for detecting injected CO{sub 2} at test sites. The GphyzCO2 software consists of interconnected programs that encompass well logging, seismic, and electromagnetic methods. The software enables users to design and execute 3D surface-to-surface (conventional surface seismic) and borehole-to-borehole (cross-hole seismic and electromagnetic methods) numerical modeling surveys. The generalized flow of the program begins with building a complex 3D subsurface geological model, assigning properties to the models that mimic a potential CO{sub 2} injection site, numerically forward model a geophysical survey, and analyze the results. A test site located in Warren County, Ohio was selected as the test site for the full implementation of GphyzCO2. Specific interest was placed on a potential reservoir target, the Mount Simon Sandstone, and cap rock, the Eau Claire Formation. Analysis of the test site included well log data, physical property measurements (porosity), core sample resistivity measurements, calculating electrical permittivity values, seismic data collection, and seismic interpretation. The data was input into GphyzCO2 to demonstrate a full implementation of the software capabilities. Part of the implementation investigated the limits of using geophysical methods to monitor CO{sub 2} injection sites. The results show that cross-hole EM numerical surveys are limited to under 100 meter borehole separation. Those results were utilized in executing numerical EM surveys that contain hypothetical CO{sub 2} injections. The outcome of the forward modeling shows that EM methods can detect the presence of CO{sub 2}.« less
Multicriteria evaluation of simulated logging scenarios in a tropical rain forest.
Huth, Andreas; Drechsler, Martin; Köhler, Peter
2004-07-01
Forest growth models are useful tools for investigating the long-term impacts of logging. In this paper, the results of the rain forest growth model FORMIND were assessed by a multicriteria decision analysis. The main processes covered by FORMIND include tree growth, mortality, regeneration and competition. Tree growth is calculated based on a carbon balance approach. Trees compete for light and space; dying large trees fall down and create gaps in the forest. Sixty-four different logging scenarios for an initially undisturbed forest stand at Deramakot (Malaysia) were simulated. The scenarios differ regarding the logging cycle, logging method, cutting limit and logging intensity. We characterise the impacts with four criteria describing the yield, canopy opening and changes in species composition. Multicriteria decision analysis was used for the first time to evaluate the scenarios and identify the efficient ones. Our results plainly show that reduced-impact logging scenarios are more 'efficient' than the others, since in these scenarios forest damage is minimised without significantly reducing yield. Nevertheless, there is a trade-off between yield and achieving a desired ecological state of logged forest; the ecological state of the logged forests can only be improved by reducing yields and enlarging the logging cycles. Our study also demonstrates that high cutting limits or low logging intensities cannot compensate for the high level of damage caused by conventional logging techniques.
Financial and Economic Analysis of Reduced Impact Logging
Tom Holmes
2016-01-01
Concern regarding extensive damage to tropical forests resulting from logging increased dramatically after World War II when mechanized logging systems developed in industrialized countries were deployed in the tropics. As a consequence, tropical foresters began developing logging procedures that were more environmentally benign, and by the 1990s, these practices began...
An Analysis of the Differences among Log Scaling Methods and Actual Log Volume
R. Edward Thomas; Neal D. Bennett
2017-01-01
Log rules estimate the volume of green lumber that can be expected to result from the sawing of a log. As such, this ability to reliably predict lumber recovery forms the foundation of log sales and purchase. The more efficient a sawmill, the less the scaling methods reflect the actual volume recovery and the greater the overrun factor. Using high-resolution scanned...
Camminatiello, Ida; D'Ambra, Antonello; Sarnacchiaro, Pasquale
2014-01-01
In this paper we are proposing a general framework for the analysis of the complete set of log Odds Ratios (ORs) generated by a two-way contingency table. Starting from the RC (M) association model and hypothesizing a Poisson distribution for the counts of the two-way contingency table we are obtaining the weighted Log Ratio Analysis that we are extending to the study of log ORs. Particularly we are obtaining an indirect representation of the log ORs and some synthesis measures. Then for studying the matrix of log ORs we are performing a generalized Singular Value Decomposition that allows us to obtain a direct representation of log ORs. We also expect to get summary measures of association too. We have considered the matrix of complete set of ORs, because, it is linked to the two-way contingency table in terms of variance and it allows us to represent all the ORs on a factorial plan. Finally, a two-way contingency table, which crosses pollution of the Sarno river and sampling points, is to be analyzed to illustrate the proposed framework.
Software Architecture Evolution
2013-12-01
system’s major components occurring via a Java Message Service message bus [69]. This architecture was designed to promote loose coupling of soft- ware...play reconfiguration of the system. The components were Java -based and platform-independent; the interfaces by which they communicated were based on...The MPCS database, a MySQL database used for storing telemetry as well as some other information, such as logs and commanding data [68]. This
NASA Astrophysics Data System (ADS)
Vazquez-Quino, L. A.; Huerta-Hernandez, C. I.; Rangaraj, D.
2017-05-01
MobiusFX, an add-on software module from Mobius Medical Systems for IMRT and VMAT QA, uses measurements in linac treatment logs to calculate and verify the 3D dose delivered to patients. In this study, 10 volumetric-modulated arc therapy (VMAT) prostate plans were planned and delivered in a Varian TrueBeam linac. The plans consisted of beams with 6 and 10 MV energy and 2 to 3 arcs per plan. The average gamma value with criterion of 3% and 3mm MobiusFX and TPS: 99.96%, 2% and 2mm MobiusFX and TPS: 98.70 %. Further comparison with ArcCheck measurements was conducted.
Software development kit for a compact cryo-refrigerator
NASA Astrophysics Data System (ADS)
Gardiner, J.; Hamilton, J.; Lawton, J.; Knight, K.; Wilson, A.; Spagna, S.
2017-12-01
This paper introduces a Software Development Kit (SDK) that enables the creation of custom software applications that automate the control of a cryo-refrigerator (Quantum Design model GA-1) in third party instruments. A remote interface allows real time tracking and logging of critical system diagnostics such as pressures, temperatures, valve states and run modes. The helium compressor scroll capsule speed and Gifford-McMahon (G-M) cold head speed can be manually adjusted over a serial communication line via a CAN interface. This configuration optimizes cooling power, while reducing wear on moving components thus extending service life. Additionally, a proportional speed control mode allows for automated throttling of speeds based on temperature or pressure feedback from a 3rd party device. Warm up and cool down modes allow 1st and 2nd stage temperatures to be adjusted without the use of external heaters.
Sample Analysis at Mars Instrument Simulator
NASA Technical Reports Server (NTRS)
Benna, Mehdi; Nolan, Tom
2013-01-01
The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to plan and validate operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of a multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even eliminating entirely the need for a hardware engineer model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that takes many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses. This module executes, and thus validates, complex command scripts prior to their up-linking to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate that is similar to the actual instrument.
Jorgensen, Donald G.; Petricola, Mario
1994-01-01
A program of borehole-geophysical logging was implemented to supply geologic and geohydrologic information for a regional ground-water investigation of Abu Dhabi Emirate. Analysis of geophysical logs was essential to provide information on geohydrologic properties because drill cuttings were not always adequate to define lithologic boundaries. The standard suite of logs obtained at most project test holes consisted of caliper, spontaneous potential, gamma ray, dual induction, microresistivity, compensated neutron, compensated density, and compensated sonic. Ophiolitic detritus from the nearby Oman Mountains has unusual petrophysical properties that complicated the interpretation of geophysical logs. The density of coarse ophiolitic detritus is typically greater than 3.0 grams per cubic centimeter, porosity values are large, often exceeding 45 percent, and the clay fraction included unusual clays, such as lizardite. Neither the spontaneous-potential log nor the natural gamma-ray log were useable clay indicators. Because intrinsic permeability is a function of clay content, additional research in determining clay content was critical. A research program of geophysical logging was conducted to determine the petrophysical properties of the shallow subsurface formations. The logging included spectral-gamma and thermal-decay-time logs. These logs, along with the standard geophysical logs, were correlated to mineralogy and whole-rock chemistry as determined from sidewall cores. Thus, interpretation of lithology and fluids was accomplished. Permeability and specific yield were calculated from geophysical-log data and correlated to results from an aquifer test. On the basis of results from the research logging, a method of lithologic and water-resistivity interpretation was developed for the test holes at which the standard suite of logs were obtained. In addition, a computer program was developed to assist in the analysis of log data. Geohydrologic properties were estimated, including volume of clay matrix, volume of matrix other than clay, density of matrix other than clay, density of matrix, intrinsic permeability, specific yield, and specific storage. Geophysical logs were used to (1) determine lithology, (2) correlate lithologic and permeable zones, (3) calibrate seismic reprocessing, (4) calibrate transient-electromagnetic surveys, and (5) calibrate uphole-survey interpretations. Logs were used at the drill site to (1) determine permeability zones, (2) determine dissolved-solids content, which is a function of water resistivity, and (3) design wells accordingly. Data and properties derived from logs were used to determine transmissivity and specific yield of aquifer materials.
An open-source data storage and visualization back end for experimental data.
Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert; Nielsen, Jane H; Chorkendorff, Ib
2014-04-01
In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high resilience to equipment failure, whereas the central storage of data dramatically eases backup and data exchange. The visualization front end allows direct monitoring of acquired data to see live progress of long-duration experiments. This enables the user to alter experimental conditions based on these data and to interfere with the experiment if needed. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status of long-duration experiments, and implementation of instant alarms in the event of failure.
Keystroke Analysis: Reflections on Procedures and Measures
ERIC Educational Resources Information Center
Baaijen, Veerle M.; Galbraith, David; de Glopper, Kees
2012-01-01
Although keystroke logging promises to provide a valuable tool for writing research, it can often be difficult to relate logs to underlying processes. This article describes the procedures and measures that the authors developed to analyze a sample of 80 keystroke logs, with a view to achieving a better alignment between keystroke-logging measures…
Electronic Warfare M-on-N Digital Simulation Logging Requirements and HDF5: A Preliminary Analysis
2017-04-12
LOGGING STREAM The goal of this report is to investigate logging of EW simulations not at the level of implementation in a database management ...differences of the logging stream and relational models. A hierarchical navigation query style appears very natural for our application. Yet the
Selective logging in the Brazilian Amazon.
G. P. Asner; D. E. Knapp; E. N. Broadbent; P. J. C. Oliveira; M Keller; J. N. Silva
2005-01-01
Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square...
Technoeconomic analysis of conventional logging systems operating from stump to landing
Raymond L. Sarles; William G. Luppold; William G. Luppold
1986-01-01
Analyzes technical and economic factors for six conventional logging systems suitable for operation in eastern forests. Discusses financial risks and business implications for loggers investing in high-production, state-of-the-art logging systems. Provides logging contractors with information useful as a preliminary guide for selection of equipment and systems....
Network monitoring in the Tier2 site in Prague
NASA Astrophysics Data System (ADS)
Eliáš, Marek; Fiala, Lukáš; Horký, Jiří; Chudoba, Jiří; Kouba, Tomáš; Kundrát, Jan; Švec, Jan
2011-12-01
Network monitoring provides different types of view on the network traffic. It's output enables computing centre staff to make qualified decisions about changes in the organization of computing centre network and to spot possible problems. In this paper we present network monitoring framework used at Tier-2 in Prague in Institute of Physics (FZU). The framework consists of standard software and custom tools. We discuss our system for hardware failures detection using syslog logging and Nagios active checks, bandwidth monitoring of physical links and analysis of NetFlow exports from Cisco routers. We present tool for automatic detection of network layout based on SNMP. This tool also records topology changes into SVN repository. Adapted weathermap4rrd is used to visualize recorded data to get fast overview showing current bandwidth usage of links in network.
Software systems for operation, control, and monitoring of the EBEX instrument
NASA Astrophysics Data System (ADS)
Milligan, Michael; Ade, Peter; Aubin, François; Baccigalupi, Carlo; Bao, Chaoyun; Borrill, Julian; Cantalupo, Christopher; Chapman, Daniel; Didier, Joy; Dobbs, Matt; Grainger, Will; Hanany, Shaul; Hillbrand, Seth; Hubmayr, Johannes; Hyland, Peter; Jaffe, Andrew; Johnson, Bradley; Kisner, Theodore; Klein, Jeff; Korotkov, Andrei; Leach, Sam; Lee, Adrian; Levinson, Lorne; Limon, Michele; MacDermid, Kevin; Matsumura, Tomotake; Miller, Amber; Pascale, Enzo; Polsgrove, Daniel; Ponthieu, Nicolas; Raach, Kate; Reichborn-Kjennerud, Britt; Sagiv, Ilan; Tran, Huan; Tucker, Gregory S.; Vinokurov, Yury; Yadav, Amit; Zaldarriaga, Matias; Zilic, Kyle
2010-07-01
We present the hardware and software systems implementing autonomous operation, distributed real-time monitoring, and control for the EBEX instrument. EBEX is a NASA-funded balloon-borne microwave polarimeter designed for a 14 day Antarctic flight that circumnavigates the pole. To meet its science goals the EBEX instrument autonomously executes several tasks in parallel: it collects attitude data and maintains pointing control in order to adhere to an observing schedule; tunes and operates up to 1920 TES bolometers and 120 SQUID amplifiers controlled by as many as 30 embedded computers; coordinates and dispatches jobs across an onboard computer network to manage this detector readout system; logs over 3 GiB/hour of science and housekeeping data to an onboard disk storage array; responds to a variety of commands and exogenous events; and downlinks multiple heterogeneous data streams representing a selected subset of the total logged data. Most of the systems implementing these functions have been tested during a recent engineering flight of the payload, and have proven to meet the target requirements. The EBEX ground segment couples uplink and downlink hardware to a client-server software stack, enabling real-time monitoring and command responsibility to be distributed across the public internet or other standard computer networks. Using the emerging dirfile standard as a uniform intermediate data format, a variety of front end programs provide access to different components and views of the downlinked data products. This distributed architecture was demonstrated operating across multiple widely dispersed sites prior to and during the EBEX engineering flight.
Computer-based learning of spelling skills in children with and without dyslexia.
Kast, Monika; Baschera, Gian-Marco; Gross, Markus; Jäncke, Lutz; Meyer, Martin
2011-12-01
Our spelling training software recodes words into multisensory representations comprising visual and auditory codes. These codes represent information about letters and syllables of a word. An enhanced version, developed for this study, contains an additional phonological code and an improved word selection controller relying on a phoneme-based student model. We investigated the spelling behavior of children by means of learning curves based on log-file data of the previous and the enhanced software version. First, we compared the learning progress of children with dyslexia working either with the previous software (n = 28) or the adapted version (n = 37). Second, we investigated the spelling behavior of children with dyslexia (n = 37) and matched children without dyslexia (n = 25). To gain deeper insight into which factors are relevant for acquiring spelling skills, we analyzed the influence of cognitive abilities, such as attention functions and verbal memory skills, on the learning behavior. All investigations of the learning process are based on learning curve analyses of the collected log-file data. The results evidenced that those children with dyslexia benefit significantly from the additional phonological cue and the corresponding phoneme-based student model. Actually, children with dyslexia improve their spelling skills to the same extent as children without dyslexia and were able to memorize phoneme to grapheme correspondence when given the correct support and adequate training. In addition, children with low attention functions benefit from the structured learning environment. Generally, our data showed that memory sources are supportive cognitive functions for acquiring spelling skills and for using the information cues of a multi-modal learning environment.
Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj
2017-01-01
Cox proportional hazard model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates to analyze survival data than Cox. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy between 2006 to March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in univariate model ( P < 0.20) were entered into the multivariate Cox and parametric models ( P < 0.05). In addition, Akaike information criterion (AIC) and area under ROC curves were used to evaluate the relative goodness of fitted model and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using Kaplan-Meier, survival time of neuropathy was computed 76.6 ± 5 months after initial diagnosis of diabetes. After multivariate analysis of Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of event time of neuropathy ( P < 0.05). According to AIC, "log-normal" model with the lowest Akaike's was the best-fitted model among Cox and parametric models. According to the results of comparison of survival receiver operating characteristics curves, log-normal model was considered as the most efficient and fitted model.
Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus; Morris, Robert A.
2011-01-01
Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to analysis of for example log files or for monitoring executing systems online.
Rodrigues, Dario B; Maccarini, Paolo F; Salahi, Sara; Oliveira, Tiago R; Pereira, Pedro J S; Limao-Vieira, Paulo; Snow, Brent W; Reudink, Doug; Stauffer, Paul R
2014-07-01
We present the modeling efforts on antenna design and frequency selection to monitor brain temperature during prolonged surgery using noninvasive microwave radiometry. A tapered log-spiral antenna design is chosen for its wideband characteristics that allow higher power collection from deep brain. Parametric analysis with the software HFSS is used to optimize antenna performance for deep brain temperature sensing. Radiometric antenna efficiency (η) is evaluated in terms of the ratio of power collected from brain to total power received by the antenna. Anatomical information extracted from several adult computed tomography scans is used to establish design parameters for constructing an accurate layered 3-D tissue phantom. This head phantom includes separate brain and scalp regions, with tissue equivalent liquids circulating at independent temperatures on either side of an intact skull. The optimized frequency band is 1.1-1.6 GHz producing an average antenna efficiency of 50.3% from a two turn log-spiral antenna. The entire sensor package is contained in a lightweight and low-profile 2.8 cm diameter by 1.5 cm high assembly that can be held in place over the skin with an electromagnetic interference shielding adhesive patch. The calculated radiometric equivalent brain temperature tracks within 0.4 °C of the measured brain phantom temperature when the brain phantom is lowered 10 °C and then returned to the original temperature (37 °C) over a 4.6-h experiment. The numerical and experimental results demonstrate that the optimized 2.5-cm log-spiral antenna is well suited for the noninvasive radiometric sensing of deep brain temperature.
Quantitative Measures for Software Independent Verification and Validation
NASA Technical Reports Server (NTRS)
Lee, Alice
1996-01-01
As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. These experts' project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effects relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.
Wavelet extractor: A Bayesian well-tie and wavelet extraction program
NASA Astrophysics Data System (ADS)
Gunning, James; Glinsky, Michael E.
2006-06-01
We introduce a new open-source toolkit for the well-tie or wavelet extraction problem of estimating seismic wavelets from seismic data, time-to-depth information, and well-log suites. The wavelet extraction model is formulated as a Bayesian inverse problem, and the software will simultaneously estimate wavelet coefficients, other parameters associated with uncertainty in the time-to-depth mapping, positioning errors in the seismic imaging, and useful amplitude-variation-with-offset (AVO) related parameters in multi-stack extractions. It is capable of multi-well, multi-stack extractions, and uses continuous seismic data-cube interpolation to cope with the problem of arbitrary well paths. Velocity constraints in the form of checkshot data, interpreted markers, and sonic logs are integrated in a natural way. The Bayesian formulation allows computation of full posterior uncertainties of the model parameters, and the important problem of the uncertain wavelet span is addressed uses a multi-model posterior developed from Bayesian model selection theory. The wavelet extraction tool is distributed as part of the Delivery seismic inversion toolkit. A simple log and seismic viewing tool is included in the distribution. The code is written in Java, and thus platform independent, but the Seismic Unix (SU) data model makes the inversion particularly suited to Unix/Linux environments. It is a natural companion piece of software to Delivery, having the capacity to produce maximum likelihood wavelet and noise estimates, but will also be of significant utility to practitioners wanting to produce wavelet estimates for other inversion codes or purposes. The generation of full parameter uncertainties is a crucial function for workers wishing to investigate questions of wavelet stability before proceeding to more advanced inversion studies.
Reliable data storage system design and implementation for acoustic logging while drilling
NASA Astrophysics Data System (ADS)
Hao, Xiaolong; Ju, Xiaodong; Wu, Xiling; Lu, Junqiang; Men, Baiyong; Yao, Yongchao; Liu, Dong
2016-12-01
Owing to the limitations of real-time transmission, reliable downhole data storage and fast ground reading have become key technologies in developing tools for acoustic logging while drilling (LWD). In order to improve the reliability of the downhole storage system in conditions of high temperature, intensive shake and periodic power supply, improvements were made in terms of hardware and software. In hardware, we integrated the storage system and data acquisition control module into one circuit board, to reduce the complexity of the storage process, by adopting the controller combination of digital signal processor and field programmable gate array. In software, we developed a systematic management strategy for reliable storage. Multiple-backup independent storage was employed to increase the data redundancy. A traditional error checking and correction (ECC) algorithm was improved and we embedded the calculated ECC code into all management data and waveform data. A real-time storage algorithm for arbitrary length data was designed to actively preserve the storage scene and ensure the independence of the stored data. The recovery procedure of management data was optimized to realize reliable self-recovery. A new bad block management idea of static block replacement and dynamic page mark was proposed to make the period of data acquisition and storage more balanced. In addition, we developed a portable ground data reading module based on a new reliable high speed bus to Ethernet interface to achieve fast reading of the logging data. Experiments have shown that this system can work stably below 155 °C with a periodic power supply. The effective ground data reading rate reaches 1.375 Mbps with 99.7% one-time success rate at room temperature. This work has high practical application significance in improving the reliability and field efficiency of acoustic LWD tools.
Using metrics to describe the participative stances of members within discussion forums.
Jones, Ray; Sharkey, Siobhan; Smithson, Janet; Ford, Tamsin; Emmens, Tobit; Hewis, Elaine; Sheaves, Bryony; Owens, Christabel
2011-01-10
Researchers using forums and online focus groups need to ensure they are safe and need tools to make best use of the data. We explored the use of metrics that would allow better forum management and more effective analysis of participant contributions. To report retrospectively calculated metrics from self-harm discussion forums and to assess whether metrics add to other methods such as discourse analysis. We asked (1) which metrics are most useful to compare and manage forums, and (2) how metrics can be used to identify the participative stances of members to help manage discussion forums. We studied the use of metrics in discussion forums on self-harm. SharpTalk comprised five discussion forums, all using the same software but with different forum compositions. SharpTalk forums were similar to most moderated forums but combined support and general social chat with online focus groups discussing issues on self-harm. Routinely recorded time-stamp data were used to derive metrics of episodes, time online, pages read, and postings. We compared metrics from the forums with views from discussion threads and from moderators. We identified patterns of participants' online behavior by plotting scattergrams and identifying outliers and clusters within different metrics. In comparing forums, important metrics seem to be number of participants, number of active participants, total time of all participants logged on in each 24 hours, and total number of postings by all participants in 24 hours. In examining participative stances, the important metrics were individuals' time logged per 24 hours, number of episodes, mean length of episodes, number of postings per 24 hours, and location within the forum of those postings. Metric scattergrams identified several participative stances: (1) the "caretaker," who was "always around," logged on for a much greater time than most other participants, posting but mainly in response to others and rarely initiating threads, (2) the "butterfly," who "flitted in and out," had a large number of short episodes, (3) two "discussants," who initiated many more discussion threads than anybody else and posted proportionately less in the support room, (4) "here for you," who posted frequently in the support room in response to other participants' threads, and (5) seven "people in distress," who posted many comments in the support room in comparison with their total postings and tended to post on their own threads. Real-time metrics may be useful: (1) by offering additional ways of comparing different discussion forums helping with their management, and (2) by identifying participative stances of individuals so allowing better moderation and support of forums, and more effective use of the data collected. For this to happen, researchers need to publish metrics for their discussion forums and software developers need to offer more real-time metrics facilities.
NASA Technical Reports Server (NTRS)
Wilson, Larry
1991-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
Rapid Diagnostics of Onboard Sequences
NASA Technical Reports Server (NTRS)
Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.
2012-01-01
Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF. The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslides hazard assessment. The web processing service was build with Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation was build and published as WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from DEM, the effective precipitations, runoff, lithology and land use. All these parameters can be served by the client from other WFS services or by uploading and processing the data on the server. The user can select the option of creating the first and second derivatives from the DEM automatically on the server or to upload the data already calculated. One of the main dynamic factors from the landslide analysis model is leaf area index. The LAI offers the advantage of modelling not just the changes from different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI index can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using a NetCDF file format. The model is run in a monthly time step and for each time step all the parameters values, a-priory, conditional and posterior probability are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the records of the probabilities and parameters values for those times steps with the values of the active time step. Each time a landslide has been positive identified new a-priory probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis and a NETCDF file is created and it can be downloaded from the server with the log file
Lehmann, Eldon D
2003-01-01
AIDA is a diabetes-computing program freely available at www.2aida.org on the Web. The software is intended to serve as an educational support tool and can be used by anyone who has an interest in diabetes, whether they be patients, relatives, health-care professionals, or students. In previous "Diabetes Information Technology & WebWatch" columns various indicators of usage of the AIDA program have been reviewed, and various comments from users of the software have been documented. The purpose of this column is to overview a proof-of-concept semi-automated analysis about why people are downloading the latest version of the AIDA educational diabetes program. AIDA permits the interactive simulation of plasma insulin and blood glucose profiles for teaching, demonstration, self-learning, and research purposes. It has been made freely available, without charge, on the Internet as a noncommercial contribution to continuing diabetes education. Since its launch in 1996 over 300,000 visits have been logged at the main AIDA Website-www.2aida.org-and over 60,000 copies of the AIDA program have been downloaded free-of-charge. This column documents the results of a semi-automated analysis of comments left by Website visitors while they were downloading the AIDA software, before they had a chance to use the program. The Internet-based survey methodology and semi-automated analysis were both found to be robust and reliable. Over a 5-month period (from October 3, 2001 to February 28, 2002) 400 responses were received. During the corresponding period 1,770 actual visits were made to the Website survey page-giving a response rate to this proof-of-concept study of 22.6%. Responses were received from participants in over 54 countries-with nearly half of these (n = 194; 48.5%) originating from the United States, United Kingdom, and Canada; 208 responses (52.0%) were received from patients with diabetes, 50 (12.5%) from doctors, 49 (12.3%) from relatives of patients, with fewer responses from students, diabetes educators, nurses, pharmacists, and other end users. The semi-automated analysis adopted for this study has re-affirmed the feasibility of using the Internet to obtain free-text comments, at no real cost, from a substantial number of medical software downloaders/users. The survey has also offered some insight into why members of the public continue to turn to the Internet for medical information. Furthermore it has provided useful information about why people are actually downloading the AIDA v4.3a interactive educational "virtual diabetes patient" simulator.
Quantification of residual dose estimation error on log file-based patient dose calculation.
Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi
2016-05-01
The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected on the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5mm in opposite directions and systematic leaf shifts: ±1.0mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the definition of the planning target volume (PTV) and organs at risks. For MLC leaves calibrated within ±0.5mm, the quantified residual dose estimation errors that obtained from the slope of the linear regression of gEUD changes between non- and modified log file doses per leaf gap are in head-and-neck plans 1.32±0.27% and 0.82±0.17Gy for PTV and spinal cord, respectively, and in prostate plans 1.22±0.36%, 0.95±0.14Gy, and 0.45±0.08Gy for PTV, rectum, and bladder, respectively. In this work, we determine the residual dose estimation errors for VMAT delivery using the log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Locating knots by industrial tomography- A feasibility study
Fred W. Taylor; Francis G. Wagner; Charles W. McMillin; Ira L. Morgan; Forrest F. Hopkins
1984-01-01
Industrial photon tomography was used to scan four southern pine logs and one red oak log. The logs were scanned at 16 cross-sectional slice planes located 1 centimeter apart along their longitudinal axes. Tomographic reconstructions were made from the scan data collected at these slice planes, and a cursory image analysis technique was developed to locate the log...
Incidence of Russian log export tax: A vertical log-lumber model
Ying Lin; Daowei Zhang
2017-01-01
In 2007, Russia imposed an ad valorem tax on its log exports that lasted until 2012. In this paper, weuse a Muth-type equilibrium displacement model to investigate the market and welfare impacts of this tax, utilizing a vertical linkage between log and lumber markets and considering factor substitution. Our theoretical analysis indicates...
Reynolds, Richard J.; Anderson, J. Alton; Williams, John H.
2015-01-01
The geophysical logs and their analyses are available for display and download from the U.S. Geological Survey, New York Water Science Center, online geophysical log archive (http://ny.water.usgs.gov/maps/geologs/) in LAS (Log ASCII Standard), PDF, and WellCad formats.
Interactive machine learning for postprocessing CT images of hardwood logs
Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt
2003-01-01
This paper concerns the nondestructive evaluation of hardwood logs through the analysis of computed tomography (CT) images. Several studies have shown that the commercial value of resulting boards can be increased substantially if log sawing strategies are chosen using prior knowledge of internal log defects. Although CT imaging offers a potential means of obtaining...
Evaluation of residual oil saturation after waterflood in a carbonate reservoir
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verma, M.K.; Boucherit, M.; Bouvier, L.
Four different approaches, including special core analysis (SCAL), log-inject-log, thermal-decay-time (TDT) logs, and material balance, were used to narrow the range of residual oil saturation (ROS) after waterflood, S[sub orw], in a carbonate reservoir in Qatar to between 23% and 27%. An equation was developed that relates S[sub orw] with connate-water saturation, S[sub wi], and porosity. This paper presents the results of S[sub orw] determinations with four different techniques: core waterflood followed by centrifuging, log-inject-log, TDT logging, and material balance.
A virtual screening method for inhibitory peptides of Angiotensin I-converting enzyme.
Wu, Hongxi; Liu, Yalan; Guo, Mingrong; Xie, Jingli; Jiang, XiaMin
2014-09-01
Natural small peptides from foods have been proven to be efficient inhibitors of Angiotensin I-converting enzyme (ACE) for the regulation of blood pressure. The traditional ACE inhibitory peptides screening method is both time consuming and money costing, to the contrary, virtual screening method by computation can break these limitations. We establish a virtual screening method to obtain ACE inhibitory peptides with the help of Libdock module of Discovery Studio 3.5 software. A significant relationship between Libdock score and experimental IC(50) was found, Libdock score = 10.063 log(1/IC(50)) + 68.08 (R(2) = 0.62). The credibility of the relationship was confirmed by testing the coincidence of the estimated log(1/IC(50)) and measured log(1/IC(50)) (IC(50) is 50% inhibitory concentration toward ACE, in μmol/L) of 5 synthetic ACE inhibitory peptides, which was virtual hydrolyzed and screened from a kind of seafood, Phascolosoma esculenta. Accordingly, Libdock method is a valid IC(50) estimation tool and virtual screening method for small ACE inhibitory peptides. © 2014 Institute of Food Technologists®
CILogon: An Integrated Identity and Access Management Platform for Science
NASA Astrophysics Data System (ADS)
Basney, J.
2016-12-01
When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.
Integration of an Autopilot for a Micro Air Vehicle
NASA Technical Reports Server (NTRS)
Platanitis, George; Shkarayev, Sergey
2005-01-01
Two autopilots providing autonomous flight capabilities are presented herein. The first is the Pico-Pilot, demonstrated for the 12-inch size class of micro air vehicles. The second is the MicroPilot MP2028(sup g), where its integration into a 36-inch Zagi airframe (tailless, elevons only configuration) is investigated and is the main focus of the report. Analytical methods, which include the use of the Advanced Aircraft Analysis software from DARCorp, were used to determine the stability and control derivatives, which were then validated through wind tunnel experiments. From the aerodynamic data, the linear, perturbed equations of motion from steady-state flight conditions may be cast in terms of these derivatives. Using these linear equations, transfer functions for the control and navigation systems were developed and feedback control laws based on Proportional, Integral, and Derivative (PID) control design were developed to control the aircraft. The PID gains may then be programmed into the autopilot software and uploaded to the microprocessor of the autopilot. The Pico-Pilot system was flight tested and shown to be successful in navigating a 12-inch MAV through a course defined by a number of waypoints with a high degree of accuracy, and in 20 mph winds. The system, though, showed problems with control authority in the roll and pitch motion of the aircraft: causing oscillations in these directions, but the aircraft maintained its heading while following the prescribed course. Flight tests were performed in remote control mode to evaluate handling, adjust trim, and test data logging for the Zagi with integrated MP2028(sup g). Ground testing was performed to test GPS acquisition, data logging, and control response in autonomous mode. Technical difficulties and integration limitations with the autopilot prevented fully autonomous flight from taking place, but the integration methodologies developed for this autopilot are, in general, applicable for unmanned air vehicles within the 36-inch size class or larger that use a PID control based autopilot.
A new method for correlation analysis of compositional (environmental) data - a worked example.
Reimann, C; Filzmoser, P; Hron, K; Kynčlová, P; Garrett, R G
2017-12-31
Most data in environmental sciences and geochemistry are compositional. Already the unit used to report the data (e.g., μg/l, mg/kg, wt%) implies that the analytical results for each element are not free to vary independently of the other measured variables. This is often neglected in statistical analysis, where a simple log-transformation of the single variables is insufficient to put the data into an acceptable geometry. This is also important for bivariate data analysis and for correlation analysis, for which the data need to be appropriately log-ratio transformed. A new approach based on the isometric log-ratio (ilr) transformation, leading to so-called symmetric coordinates, is presented here. Summarizing the correlations in a heat-map gives a powerful tool for bivariate data analysis. Here an application of the new method using a data set from a regional geochemical mapping project based on soil O and C horizon samples is demonstrated. Differences to 'classical' correlation analysis based on log-transformed data are highlighted. The fact that some expected strong positive correlations appear and remain unchanged even following a log-ratio transformation has probably led to the misconception that the special nature of compositional data can be ignored when working with trace elements. The example dataset is employed to demonstrate that using 'classical' correlation analysis and plotting XY diagrams, scatterplots, based on the original or simply log-transformed data can easily lead to severe misinterpretations of the relationships between elements. Copyright © 2017 Elsevier B.V. All rights reserved.
SP_Ace: Stellar Parameters And Chemical abundances Estimator
NASA Astrophysics Data System (ADS)
Boeche, C.; Grebel, E. K.
2018-05-01
SP_Ace (Stellar Parameters And Chemical abundances Estimator) estimates the stellar parameters Teff, log g, [M/H], and elemental abundances. It employs 1D stellar atmosphere models in Local Thermodynamic Equilibrium (LTE). The code is highly automated and suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20 000). A web service for calculating these values with the software is also available.
Multilingual Speech and Language Processing
2003-04-01
client software handles the user end of the transaction. Historically, four clients were provided: e-mail, web, FrameMaker , and command line. By...command-line client and an API. The API allows integration of CyberTrans into a number of processes including word processing packages ( FrameMaker ...preservation and logging, and others. The available clients remain e-mail, Web and FrameMaker . Platforms include both Unix and PC for clients, with
Optical functional performance of the osteo-odonto-keratoprosthesis.
Lee, Richard M H; Ong, Gek L; Lam, Fook Chang; White, Joy; Crook, David; Liu, Christopher S C; Hull, Chris C
2014-10-01
The aim of this study was to evaluate optical and visual functional performance of the osteo-odonto-keratoprosthesis (OOKP). Optical design and analysis was performed with customized optical design software. Nine patients with implanted OOKP devices and 9 age-matched control patients were assessed. Contrast sensitivity was assessed and glare effect was measured with a brightness acuity test. All OOKP patients underwent kinetic Goldmann perimetry and wavefront aberrometry and completed the National Eye Institute Visual Function Questionnaire-25 (NEI VFQ-25). Optical analysis showed that the optical cylinder is near diffraction-limited. A reduction in median visual acuity (VA) with increasing glare settings was observed from 0.04 logMAR (without glare) to 0.20 logMAR (with glare at "high" setting) and significantly reduced statistically when compared with the control group at all levels of glare (P < 0.05). Contrast sensitivity was significantly reduced when compared with age-matched controls at medium and high spatial frequencies (P < 0.05). Median Goldmann perimetry was 65 degrees (interquartile range, 64-74 degrees; V-4e isopters) and 69 degrees excluding 2 glaucomatous subjects. Several vision-related NEI VFQ-25 subscales correlated significantly with VA at various brightness acuity test levels and contrast sensitivity at medium spatial frequencies, including dependency, general vision, near activities and distance activities. The OOKP optical cylinder provides patients with a good level of VA that is significantly reduced by glare. We have shown in vivo that updates to the optical cylinder design have improved the patient's field of view. Reduction of glare and refinement of cylinder alignment methods may further improve visual function and patient satisfaction.
LaManna, Joseph A.; Martin, Thomas E.
2017-01-01
Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes.
LaManna, Joseph A; Martin, Thomas E
2017-08-01
Understanding the causes underlying changes in species diversity is a fundamental pursuit of ecology. Animal species richness and composition often change with decreased forest structural complexity associated with logging. Yet differences in latitude and forest type may strongly influence how species diversity responds to logging. We performed a meta-analysis of logging effects on local species richness and composition of birds across the world and assessed responses by different guilds (nesting strata, foraging strata, diet, and body size). This approach allowed identification of species attributes that might underlie responses to this anthropogenic disturbance. We only examined studies that allowed forests to regrow naturally following logging, and accounted for logging intensity, spatial extent, successional regrowth after logging, and the change in species composition expected due to random assembly from regional species pools. Selective logging in the tropics and clearcut logging in temperate latitudes caused loss of species from nearly all forest strata (ground to canopy), leading to substantial declines in species richness (up to 27% of species). Few species were lost or gained following any intensity of logging in lower-latitude temperate forests, but the relative abundances of these species changed substantially. Selective logging at higher-temperate latitudes generally replaced late-successional specialists with early-successional specialists, leading to no net changes in species richness but large changes in species composition. Removing less basal area during logging mitigated the loss of avian species from all forests and, in some cases, increased diversity in temperate forests. This meta-analysis provides insights into the important role of habitat specialization in determining differential responses of animal communities to logging across tropical and temperate latitudes. © 2016 Cambridge Philosophical Society.
Using low-grade hardwoods for CLT production: a yield analysis
R. Edward Thomas; Urs Buehlmann
2017-01-01
Low-grade hardwood logs are the by-product of logging operations and, more frequently today, urban tree removals. The market prices for these logs is low, as is the value recovered from their logs when producing traditional forest products such as pallet parts, railroad ties, landscaping mulch, or chips for pulp. However, the emergence of cross-laminated timber (CLT)...
Predicting internal yellow-poplar log defect features using surface indicators
R. Edward Thomas
2008-01-01
Determining the defects that are located within the log is crucial to understanding the tree/log resource for efficient processing. However, existing means of doing this non-destructively requires the use of expensive X-ray/CT, MRI, or microwave technology. These methods do not lend themselves to fast, efficient, and cost-effective analysis of logs and tree stems in...
NASA Astrophysics Data System (ADS)
Fussi, Fabio; Bonomi, Tullia; Fava, Francesco; Hamidou, Barry; Hamidou Khane, Cheikh; Faye, Gayane; Wade, Souleye; Colombo, Roberto
2014-05-01
Background In order to increase access to drinking water in Africa there is more and more interest in the promotion of manual drilling techniques, without need of expensive drilling equipment, but they can be applied only in those areas with suitable hydrogeological conditions: thick layers of unconsolidated sediments and shallow groundwater level. Mapping of suitable zones for manual drilling at national level in Africa is a crucial activity and local institutions and UNICEF are implementing specific programs for its promotion, but the limitation in available data concerning shallow hydrogeological aquifers are limited. The research has been developed in the project "Use of remote sensing and terrain modeling to identify suitable zones for manual drilling in Africa and support low cost water supply", within the scientific cooperation between the University of Milano-Bicocca, Universite' Cheick Anta Diop (Dakar Senegal) , SNAPE - Service Nationale de Points d'Eau (Conakry Guinea), UNICEF Senegal and UNICEF Guinea. The project is funded by NERC (National Environmental Research Council, UK). Objective of the research: The presented work is only the starting point of the project aiming to elaborate an automatic procedures to manage and improve the existing database of borehole logs in Senegal and Guinea for the interpretation of shallow hydrogeological conditions and identification of suitable zones for manual drilling, in two pilot areas: Louga (Northwestern Senegal) and Faranah/Kankan (Eastern Guinea). Within the objective of the project is also considered the integration of Remote Sensing to support hydrogeological interpretation, especially where borehole logs are not present. Methodology Focus is to create a hydrogeological database, TANGAFRIC, to organize, codify and elaborate hydrogeological data. The metodology derives from the software TANGRAM (www.tangram.samit.unimib.it) produced by the University of Milano Bicocca, with innovative aspect of stratigraphic data codification, quantification and processing, connected to a hydraulic conductivity value associated to each primary lithology. Results Starting from the database of borehole logs available at national level in Senegal and Guinea (about 1400 borehole logs in Senegal and 800 in Guinea, with 20000 definitions), their structure and information have been compared and a new common database has been set up; it has a consistent structure with the structure of existing national database and data can be easily imported and exported. From this joint, the new software TANGAFRIC has been created with different purposes: -to organize in the same way wells data, since the two countries have different administrative divisions (ID code, name of village, district, regions, coordinates); -to add new wells data, not existing in the previous databases; -to codify the stratigraphic layer of each well logs with a 5-digit alphanumeric codes, using a list of categories describing texture, status and color or each layers, identified from the most recurrent lithological classes and attributes; -to attribute a specific value of hydraulic conductivity to each texture, from well data, field pumping test, bibliographic review. TANGAFRIC includes one module for data input and a second module to process the data, and extract specific parameters concerning mean texture, hydraulic conductivity and transmissivity in selected depth ranges. This is made possible by attributing a weight to the digits of the code for textures. The program calculates the percentage of the chosen lithology, as related to each individual layer, and also a weighted average of hydraulic conductivity. It has been possible to produce maps showing the distribution of main texture classes, thickness of saturated unconsolidated sediments and expected transmissivity. Furthermore, these parameters have been used to estimate the suitability for manual drilling under the hydrogeological coniditions described in each borehole logs.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on Feburary 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO/sub 2/ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agree fairly well from 3692 to 3712more » feet, poorly from 3712 to 3820 feet and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log versus core derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations and production data confirmed the validity of oil saturation determinations. Residual oil saturation, for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.« less
Opening the Black Box of Electronic Health: Collecting, Analyzing, and Interpreting Log Data
Kelders, Saskia; Poel, Mannes; van Gemert-Pijnen, Lisette
2017-01-01
In electronic health (eHealth) research, limited insight has been obtained on process outcomes or how the use of technology has contributed to the users’ ability to have a healthier life, improved well-being, or activate new attitudes in their daily tasks. As a result, eHealth is often perceived as a black box. To open this black box of eHealth, methodologies must extend beyond the classic effect evaluations. The analyses of log data (anonymous records of real-time actions performed by each user) can provide continuous and objective insights into the actual usage of the technology. However, the possibilities of log data in eHealth research have not been exploited to their fullest extent. The aim of this paper is to describe how log data can be used to improve the evaluation and understand the use of eHealth technology with a broader approach than only descriptive statistics. This paper serves as a starting point for using log data analysis in eHealth research. Here, we describe what log data is and provide an overview of research questions to evaluate the system, the context, the users of a technology, as well as the underpinning theoretical constructs. We also explain the requirements for log data, the starting points for the data preparation, and methods for data collection. Finally, we describe methods for data analysis and draw a conclusion regarding the importance of the results for both scientific and practical applications. The analysis of log data can be of great value for opening the black box of eHealth. A deliberate log data analysis can give new insights into how the usage of the technology contributes to found effects and can thereby help to improve the persuasiveness and effectiveness of eHealth technology and the underpinning behavioral models. PMID:28784592
Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.
2007-01-01
In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.
2007-01-01
In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Remote Control and Monitoring of VLBI Experiments by Smartphones
NASA Astrophysics Data System (ADS)
Ruztort, C. H.; Hase, H.; Zapata, O.; Pedreros, F.
2012-12-01
For the remote control and monitoring of VLBI operations, we developed a software optimized for smartphones. This is a new tool based on a client-server architecture with a Web interface optimized for smartphone screens and cellphone networks. The server uses variables of the Field System and its station specific parameters stored in the shared memory. The client running on the smartphone by a Web interface analyzes and visualizes the current status of the radio telescope, receiver, schedule, and recorder. In addition, it allows commands to be sent remotely to the Field System computer and displays the log entries. The user has full access to the entire operation process, which is important in emergency cases. The software also integrates a webcam interface.
Biermann, Martin
2014-04-01
Clinical trials aiming for regulatory approval of a therapeutic agent must be conducted according to Good Clinical Practice (GCP). Clinical Data Management Systems (CDMS) are specialized software solutions geared toward GCP-trials. They are however less suited for data management in small non-GCP research projects. For use in researcher-initiated non-GCP studies, we developed a client-server database application based on the public domain CakePHP framework. The underlying MySQL database uses a simple data model based on only five data tables. The graphical user interface can be run in any web browser inside the hospital network. Data are validated upon entry. Data contained in external database systems can be imported interactively. Data are automatically anonymized on import, and the key lists identifying the subjects being logged to a restricted part of the database. Data analysis is performed by separate statistics and analysis software connecting to the database via a generic Open Database Connectivity (ODBC) interface. Since its first pilot implementation in 2011, the solution has been applied to seven different clinical research projects covering different clinical problems in different organ systems such as cancer of the thyroid and the prostate glands. This paper shows how the adoption of a generic web application framework is a feasible, flexible, low-cost, and user-friendly way of managing multidimensional research data in researcher-initiated non-GCP clinical projects. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Huiqin; Wang, Xue; Lynette, Kibe; Cao, Minghua
2018-06-01
The performance of multiple-input multiple-output wireless optical communication systems that adopt Q-ary pulse position modulation over spatial correlated log-normal fading channel is analyzed in terms of its un-coded bit error rate and ergodic channel capacity. The analysis is based on the Wilkinson's method which approximates the distribution of a sum of correlated log-normal random variables to a log-normal random variable. The analytical and simulation results corroborate the increment of correlation coefficients among sub-channels lead to system performance degradation. Moreover, the receiver diversity has better performance in resistance of spatial correlation caused channel fading.
Comparison of logging residue from lump sum and log scale timber sales.
James O Howard; Donald J. DeMars
1985-01-01
Data from 1973 and 1980 logging residues studies were used to compare the volume of residue from lump sum and log scale timber sales. Covariance analysis was used to adjust the mean volume for each data set for potential variation resulting from differences in stand conditions. Mean residue volumes from the two sale types were significantly different at the 5-percent...
Tractor-logging costs and production in old-growth redwood forests
Kenneth N. Boe
1963-01-01
A cost accounting analysis of full-scale logging operations in old-growth redwood during 2 years revealed that it cost $12.24 per M bd. ft. (gross Scribner log scale) to get logs on trucks. Road development costs averaged another $5.19 per M bd. ft. Felling-bucking production was calculated by average tree d.b.h. Both skidding and loading outputs per hour were...
Financial feasibility of a log sort yard handling small-diameter logs: A preliminary study
Han-Sup Han; E. M. (Ted) Bilek; John (Rusty) Dramm; Dan Loeffler; Dave Calkin
2011-01-01
The value and use of the trees removed in fuel reduction thinning and restoration treatments could be enhanced if the wood were effectively evaluated and sorted for quality and highest value before delivery to the next manufacturing destination. This article summarizes a preliminary financial feasibility analysis of a log sort yard that would serve as a log market to...
Ruhong Li; J. Buongiorno; J.A. Turner; S. Zhu; J. Prestemon
2008-01-01
We assessed the impact on the world forest sector of a progressive elimination of illegal logging. The analysis compared predictions from 2007 to 2020, with and without a gradual reduction of illegally logged industrial roundwood from 2007 to 2011. A large part of the curtailment of timber supply due to the stoppage of illegal logging would be compensated by increased...
Larson, Elaine L; Cohen, Bevin; Baxter, Kathleen A
2012-11-01
Minimal research has been published evaluating the effectiveness of hand hygiene delivery systems (ie, rubs, foams, or wipes) at removing viruses from hands. The purposes of this study were to determine the effect of several alcohol-based hand sanitizers in removing influenza A (H1N1) virus, and to compare the effectiveness of foam, gel, and hand wipe products. Hands of 30 volunteers were inoculated with H1N1 and randomized to treatment with foam, gel, or hand wipe applied to half of each volunteer's finger pads. The log(10) count of each subject's treated and untreated finger pads were averaged. Log(10) reductions were calculated from these differences and averaged within treatment group. Between-treatment analysis compared changes from the untreated finger pads using analysis of covariance with treatment as a factor and the average log(10) untreated finger pads as the covariate. Log(10) counts on control finger pads were 2.7-5.3 log(10) of the 50% infectious dose for tissue culture (TCID(50)/0.1 mL) (mean, 3.8 ± 0.5 log(10) TCID(50)/0.1 mL), and treated finger pad counts for all test products were 0.5-1.9 log(10) TCID(50)/0.1 mL (mean, 0.53 ± 0.17 log(10) TCID(50)/0.1 mL). Treatments with all products resulted in a significant reduction in viral titers (>3 logs) at their respective exposure times that were statistically comparable. All 3 delivery systems (foam, gel, and wipe) produced significantly reduced viral counts on hands. Copyright © 2012 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Lyon, R. J. P.; Lanz, K.
1985-01-01
Geologists in exploration need to be able to determine the mineral composition of a given outcrop, and then proceed to another in order to carry out the process of geologic mapping. Since April 1984 researchers have been developing a portable microcomputer-based imaging system (with a grey-scale of 16 shades of amber), which were demonstrated during the November 1984 GSA field trip in the field at Yerington, NV. A color-version of the same technology was recently demonstrated. The portable computer selected is a COLBY 10-Megabyte, hard disk-equipped repackaged-IBM/XT, which operates on either 110/220 VAC or on 12VDC from the cigarette lighter in a field vehicle. A COMPAQ PLUS or an IBM Portable will also work on modified software. The underlying concept is that the atmospheric transmission and surface albedo/slope terms are multiplicative, relating the spectral irradiance to the spectral color of the surface materials. Thus, the spectral color of a pixel remains after averaged log-albedo and log-irradiance have been estimated. All these steps can be carried out on the COLBY microcomputer, using 80 image lines of the 128-channel, 12-bit imagery. Results are shown for such an 80-line segment, showing the identification of an O-H bearing mineral group (of slightly varying specific characters) on the flight line.
QaaS (quality as a service) model for web services using big data technologies
NASA Astrophysics Data System (ADS)
Ahmad, Faisal; Sarkar, Anirban
2017-10-01
Quality of service (QoS) determines the service usability and utility and both of which influence the service selection process. The QoS varies from one service provider to other. Each web service has its own methodology for evaluating QoS. The lack of transparent QoS evaluation model makes the service selection challenging. Moreover, most QoS evaluation processes do not consider their historical data which not only helps in getting more accurate QoS but also helps for future prediction, recommendation and knowledge discovery. QoS driven service selection demands a model where QoS can be provided as a service to end users. This paper proposes a layered QaaS (quality as a service) model in the same line as PaaS and software as a service, where users can provide QoS attributes as inputs and the model returns services satisfying the user's QoS expectation. This paper covers all the key aspects in this context, like selection of data sources, its transformation, evaluation, classification and storage of QoS. The paper uses server log as the source for evaluating QoS values, common methodology for its evaluation and big data technologies for its transformation and analysis. This paper also establishes the fact that Spark outperforms the Pig with respect to evaluation of QoS from logs.
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem
2003-01-01
To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability, another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deepspace platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to access the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code, second a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured a s follows. In section 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology; model checking (section 5), static analysis (section 6), runtime analysis (section 7) and testing (section 8). The find section gives some preliminary quantitative results on how the tools compared.
Borehole geophysics applied to ground-water investigations
Keys, W.S.
1990-01-01
The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary background in hydrogeology with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, as well as on changes in the character of these factors over time. The response of well logs is caused by petrophysical factors, by the quality, temperature, and pressure of interstitial fluids, and by ground-water flow. Qualitative and quantitative analysis of analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs. The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids, and wells, as well as the principles of measurement, must be understood if geophysical logs are to be interpreted correctly. Plating a logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology is needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and the log analyst and requires both calibration and well-site standardization of equipment. Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.
Borehole geophysics applied to ground-water investigations
Keys, W.S.
1988-01-01
The purpose of this manual is to provide hydrologists, geologists, and others who have the necessary training with the basic information needed to apply the most useful borehole-geophysical-logging techniques to the solution of problems in ground-water hydrology. Geophysical logs can provide information on the construction of wells and on the character of the rocks and fluids penetrated by those wells, in addition to changes in the character of these factors with time. The response of well logs is caused by: petrophysical factors; the quality; temperature, and pressure of interstitial fluids; and ground-water flow. Qualitative and quantitative analysis of the analog records and computer analysis of digitized logs are used to derive geohydrologic information. This information can then be extrapolated vertically within a well and laterally to other wells using logs.The physical principles by which the mechanical and electronic components of a logging system measure properties of rocks, fluids and wells, and the principles of measurement need to be understood to correctly interpret geophysical logs. Planning the logging operation involves selecting the equipment and the logs most likely to provide the needed information. Information on well construction and geohydrology are needed to guide this selection. Quality control of logs is an important responsibility of both the equipment operator and log analyst and requires both calibration and well-site standardization of equipment.Logging techniques that are widely used in ground-water hydrology or that have significant potential for application to this field include: spontaneous potential, resistance, resistivity, gamma, gamma spectrometry, gamma-gamma, neutron, acoustic velocity, acoustic televiewer, caliper, and fluid temperature, conductivity, and flow. The following topics are discussed for each of these techniques: principles and instrumentation, calibration and standardization, volume of investigation, extraneous effects, and interpretation and applications.
Measurement, Modeling, and Analysis of a Large-scale Blog Sever Workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeon, Myeongjae; Hwang, Jeaho; Kim, Youngjae
2010-01-01
Despite the growing popularity of Online Social Networks (OSNs), the workload characteristics of OSN servers, such as those hosting blog services, are not well understood. Understanding workload characteristics is important for opti- mizing and improving the performance of current systems and software based on observed trends. Thus, in this paper, we characterize the system workload of the largest blog hosting servers in South Korea, Tistory1. In addition to understanding the system workload of the blog hosting server, we have developed synthesized workloads and obtained the following major findings: (i) the transfer size of non-multimedia files and blog articles can bemore » modeled by a truncated Pareto distribution and a log-normal distribution respectively, and (ii) users accesses to blog articles do not show temporal locality, but they are strongly biased toward those posted along with images or audio.« less
Practical Pocket PC Application w/Biometric Security
NASA Technical Reports Server (NTRS)
Logan, Julian
2004-01-01
I work in the Flight Software Engineering Branch, where we provide design and development of embedded real-time software applications for flight and supporting ground systems to support the NASA Aeronautics and Space Programs. In addition, this branch evaluates, develops and implements new technologies for embedded real-time systems, and maintains a laboratory for applications of embedded technology. The majority of microchips that are used in modern society have been programmed using embedded technology. These small chips can be found in microwaves, calculators, home security systems, cell phones and more. My assignment this summer entails working with an iPAQ HP 5500 Pocket PC. This top-of-the-line hand-held device is one of the first mobile PC's to introduce biometric security capabilities. Biometric security, in this case a fingerprint authentication system, is on the edge of technology as far as securing information. The benefits of fingerprint authentication are enormous. The most significant of them are that it is extremely difficult to reproduce someone else's fingerprint, and it is equally difficult to lose or forget your own fingerprint as opposed to a password or pin number. One of my goals for this summer is to integrate this technology with another Pocket PC application. The second task for the summer is to develop a simple application that provides an Astronaut EVA (Extravehicular Activity) Log Book capability. The Astronaut EVA Log Book is what an astronaut would use to report the status of field missions, crew physical health, successes, future plans, etc. My goal is to develop a user interface into which these data fields can be entered and stored. The applications that I am developing are created using eMbedded Visual C++ 4.0 with the Pocket PC 2003 Software Development Kit provided by Microsoft.
BOREHOLE FLOWMETERS: FIELD APPLICATION AND DATA ANALYSIS
This paper reviews application of borehole flowmeters in granular and fractured rocks. Basic data obtained in the field are the ambient flow log and the pumping-induced flow log. These basic logs may then be used to calculate other quantities of interest. The paper describes the ...
Study on fracture identification of shale reservoir based on electrical imaging logging
NASA Astrophysics Data System (ADS)
Yu, Zhou; Lai, Fuqiang; Xu, Lei; Liu, Lin; Yu, Tong; Chen, Junyu; Zhu, Yuantong
2017-05-01
In recent years, shale gas exploration has made important development, access to a major breakthrough, in which the study of mud shale fractures is extremely important. The development of fractures has an important role in the development of gas reservoirs. Based on the core observation and the analysis of laboratory flakes and laboratory materials, this paper divides the lithology of the shale reservoirs of the XX well in Zhanhua Depression. Based on the response of the mudstone fractures in the logging curve, the fracture development and logging Response to the relationship between the conventional logging and electrical imaging logging to identify the fractures in the work, the final completion of the type of fractures in the area to determine and quantify the calculation of fractures. It is concluded that the fracture type of the study area is high and the microstructures are developed from the analysis of the XX wells in Zhanhua Depression. The shape of the fractures can be clearly seen by imaging logging technology to determine its type.
Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T
2010-05-01
We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24h after onset of symptom from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow and cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, arrival time independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with estimated BAT by a single observer. In addition, gamma-variate curve-fitting errors of AIF and inter-subject variability of AIFs were analyzed. Lastly, two observes independently assessed the quality and area of hypoperfusion mismatched with restricted diffusion area from motion corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9+/-2.5s (mean+/-s.d.) without motion correction and 267+/-80s (mean+/-s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image quality assessment by two observers revealed that the MTT maps exhibited superior quality over the TTP maps (88% good rating of MTT as compared to 68% of TTP). Our software allowed fully automated deconvolution analysis of DSC PWI using proven efficient algorithms that can be applied to acute stroke treatment decisions. Our streamlined method also offers promise for further development of automated quantitative analysis of the ischemic penumbra. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Massiot, Cécile; Townend, John; Nicol, Andrew; McNamara, David D.
2017-08-01
Acoustic borehole televiewer (BHTV) logs provide measurements of fracture attributes (orientations, thickness, and spacing) at depth. Orientation, censoring, and truncation sampling biases similar to those described for one-dimensional outcrop scanlines, and other logging or drilling artifacts specific to BHTV logs, can affect the interpretation of fracture attributes from BHTV logs. K-means, fuzzy K-means, and agglomerative clustering methods provide transparent means of separating fracture groups on the basis of their orientation. Fracture spacing is calculated for each of these fracture sets. Maximum likelihood estimation using truncated distributions permits the fitting of several probability distributions to the fracture attribute data sets within truncation limits, which can then be extrapolated over the entire range where they naturally occur. Akaike Information Criterion (AIC) and Schwartz Bayesian Criterion (SBC) statistical information criteria rank the distributions by how well they fit the data. We demonstrate these attribute analysis methods with a data set derived from three BHTV logs acquired from the high-temperature Rotokawa geothermal field, New Zealand. Varying BHTV log quality reduces the number of input data points, but careful selection of the quality levels where fractures are deemed fully sampled increases the reliability of the analysis. Spacing data analysis comprising up to 300 data points and spanning three orders of magnitude can be approximated similarly well (similar AIC rankings) with several distributions. Several clustering configurations and probability distributions can often characterize the data at similar levels of statistical criteria. Thus, several scenarios should be considered when using BHTV log data to constrain numerical fracture models.
Wu, Danny T Y; Smart, Nikolas; Ciemins, Elizabeth L; Lanham, Holly J; Lindberg, Curt; Zheng, Kai
2017-01-01
To develop a workflow-supported clinical documentation system, it is a critical first step to understand clinical workflow. While Time and Motion studies has been regarded as the gold standard of workflow analysis, this method can be resource consuming and its data may be biased due to the cognitive limitation of human observers. In this study, we aimed to evaluate the feasibility and validity of using EHR audit trail logs to analyze clinical workflow. Specifically, we compared three known workflow changes from our previous study with the corresponding EHR audit trail logs of the study participants. The results showed that EHR audit trail logs can be a valid source for clinical workflow analysis, and can provide an objective view of clinicians' behaviors, multi-dimensional comparisons, and a highly extensible analysis framework.
2006-06-30
ratios (Lg/Pn, Lg/Pg, Lg/Sn, and Lg/ Pcod ,), a constant window length was used to measure the phase amplitudes. The phases were identified by correlating...log(ýo-) = (Yp + yp,,o-dLg)(log( 2) -log ))+Q( + p.o-71Lg)(x 2 - (2.9) ’ 4 pcod ,2 APCoda Therefore, the difference of log(ALg/Apcoda) between
BOREHOLE FLOWMETERS: FIELD APPLICATION AND DATA ANALYSIS
This paper reviews application of borehole flowmeters in granular and fractured rocks. asic data obtained in the field are the ambient flow log and the pumping-induced flow log. hese basic logs may then be used to calculate other quantities of interest. he paper describes the app...
DOT National Transportation Integrated Search
2012-02-01
Minimizing transportation cost is essential in the forest products industry. Logs and wood chips are relatively low value. Logs are a dense heavy weight product to transport, while chips are light and bulky. These handling characteristics along with ...
A Testbed for Evaluating Lunar Habitat Autonomy Architectures
NASA Technical Reports Server (NTRS)
Lawler, Dennis G.
2008-01-01
A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
Cooperative Work and Sustainable Scientific Software Practices in R
NASA Astrophysics Data System (ADS)
Weber, N.
2013-12-01
Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable, and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastrcture development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
Avian responses to selective logging shaped by species traits and logging practices
Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S.; Koh, Lian Pin
2015-01-01
Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673
NASA Tech Briefs, December 2005
NASA Technical Reports Server (NTRS)
2005-01-01
Topics covered include: Video Mosaicking for Inspection of Gas Pipelines; Shuttle-Data-Tape XML Translator; Highly Reliable, High-Speed, Unidirectional Serial Data Links; Data-Analysis System for Entry, Descent, and Landing; Hybrid UV Imager Containing Face-Up AlGaN/GaN Photodiodes; Multiple Embedded Processors for Fault-Tolerant Computing; Hybrid Power Management; Magnetometer Based on Optoelectronic Microwave Oscillator; Program Predicts Time Courses of Human/ Computer Interactions; Chimera Grid Tools; Astronomer's Proposal Tool; Conservative Patch Algorithm and Mesh Sequencing for PAB3D; Fitting Nonlinear Curves by Use of Optimization Techniques; Tool for Viewing Faults Under Terrain; Automated Synthesis of Long Communication Delays for Testing; Solving Nonlinear Euler Equations With Arbitrary Accuracy; Self-Organizing-Map Program for Analyzing Multivariate Data; Tool for Sizing Analysis of the Advanced Life Support System; Control Software for a High-Performance Telerobot; Java Radar Analysis Tool; Architecture for Verifiable Software; Tool for Ranking Research Options; Enhanced, Partially Redundant Emergency Notification System; Close-Call Action Log Form; Task Description Language; Improved Small-Particle Powders for Plasma Spraying; Bonding-Compatible Corrosion Inhibitor for Rinsing Metals; Wipes, Coatings, and Patches for Detecting Hydrazines; Rotating Vessels for Growing Protein Crystals; Oscillating-Linear-Drive Vacuum Compressor for CO2; Mechanically Biased, Hinged Pairs of Piezoelectric Benders; Apparatus for Precise Indium-Bump Bonding of Microchips; Radiation Dosimetry via Automated Fluorescence Microscopy; Multistage Magnetic Separator of Cells and Proteins; Elastic-Tether Suits for Artificial Gravity and Exercise; Multichannel Brain-Signal-Amplifying and Digitizing System; Ester-Based Electrolytes for Low-Temperature Li-Ion Cells; Hygrometer for Detecting Water in Partially Enclosed Volumes; Radio-Frequency Plasma Cleaning of a Penning Malmberg Trap; Reduction of Flap Side Edge Noise - the Blowing Flap; and Preventing Accidental Ignition of Upper-Stage Rocket Motors.
Chakraborty, Chiranjib; Mallick, Bidyut; Sharma, Ashish Ranjan; Sharma, Garima; Jagga, Supriya; Doss, C George Priya; Nam, Ju-Suk; Lee, Sang-Soo
2017-01-01
Druggability of a target protein depends on the interacting micro-environment between the target protein and drugs. Therefore, a precise knowledge of the interacting micro-environment between the target protein and drugs is requisite for drug discovery process. To understand such micro-environment, we performed in silico interaction analysis between a human target protein, Dipeptidyl Peptidase-IV (DPP-4), and three anti-diabetic drugs (saxagliptin, linagliptin and vildagliptin). During the theoretical and bioinformatics analysis of micro-environmental properties, we performed drug-likeness study, protein active site predictions, docking analysis and residual interactions with the protein-drug interface. Micro-environmental landscape properties were evaluated through various parameters such as binding energy, intermolecular energy, electrostatic energy, van der Waals'+H-bond+desolvo energy (E VHD ) and ligand efficiency (LE) using different in silico methods. For this study, we have used several servers and software, such as Molsoft prediction server, CASTp server, AutoDock software and LIGPLOT server. Through micro-environmental study, highest log P value was observed for linagliptin (1.07). Lowest binding energy was also observed for linagliptin with DPP-4 in the binding plot. We also identified the number of H-bonds and residues involved in the hydrophobic interactions between the DPP-4 and the anti-diabetic drugs. During interaction, two H-bonds and nine residues, two H-bonds and eleven residues as well as four H-bonds and nine residues were found between the saxagliptin, linagliptin as well as vildagliptin cases and DPP-4, respectively. Our in silico data obtained for drug-target interactions and micro-environmental signature demonstrates linagliptin as the most stable interacting drug among the tested anti-diabetic medicines.
Paillet, Frederick L.; Crowder, R.E.
1996-01-01
Quantitative analysis of geophysical logs in ground-water studies often involves at least as broad a range of applications and variation in lithology as is typically encountered in petroleum exploration, making such logs difficult to calibrate and complicating inversion problem formulation. At the same time, data inversion and analysis depend on inversion model formulation and refinement, so that log interpretation cannot be deferred to a geophysical log specialist unless active involvement with interpretation can be maintained by such an expert over the lifetime of the project. We propose a generalized log-interpretation procedure designed to guide hydrogeologists in the interpretation of geophysical logs, and in the integration of log data into ground-water models that may be systematically refined and improved in an iterative way. The procedure is designed to maximize the effective use of three primary contributions from geophysical logs: (1) The continuous depth scale of the measurements along the well bore; (2) The in situ measurement of lithologic properties and the correlation with hydraulic properties of the formations over a finite sample volume; and (3) Multiple independent measurements that can potentially be inverted for multiple physical or hydraulic properties of interest. The approach is formulated in the context of geophysical inversion theory, and is designed to be interfaced with surface geophysical soundings and conventional hydraulic testing. The step-by-step procedures given in our generalized interpretation and inversion technique are based on both qualitative analysis designed to assist formulation of the interpretation model, and quantitative analysis used to assign numerical values to model parameters. The approach bases a decision as to whether quantitative inversion is statistically warranted by formulating an over-determined inversion. If no such inversion is consistent with the inversion model, quantitative inversion is judged not possible with the given data set. Additional statistical criteria such as the statistical significance of regressions are used to guide the subsequent calibration of geophysical data in terms of hydraulic variables in those situations where quantitative data inversion is considered appropriate.
Cuffney, Thomas F.; Brightbill, Robin A.
2011-01-01
The Invertebrate Data Analysis System (IDAS) software was developed to provide an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the U.S. Geological Survey National Water-Quality Assessment (NAWQA) Program. The IDAS software is a stand-alone program for personal computers that run Microsoft Windows(Registered). It allows users to read data downloaded from the NAWQA Program Biological Transactional Database (Bio-TDB) or to import data from other sources either as Microsoft Excel(Registered) or Microsoft Access(Registered) files. The program consists of five modules: Edit Data, Data Preparation, Calculate Community Metrics, Calculate Diversities and Similarities, and Data Export. The Edit Data module allows the user to subset data on the basis of taxonomy or sample type, extract a random subsample of data, combine or delete data, summarize distributions, resolve ambiguous taxa (see glossary) and conditional/provisional taxa, import non-NAWQA data, and maintain and create files of invertebrate attributes that are used in the calculation of invertebrate metrics. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa on the basis of laboratory processing notes, delete pupae or terrestrial adults, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa on the basis of the number of sites where a taxon occurs and (or) the abundance of a taxon in a sample, and resolve taxonomic ambiguities by one of four methods. The Calculate Community Metrics module allows the user to calculate 184 community metrics, including metrics based on organism tolerances, functional feeding groups, and behavior. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages (CANOCO, Primer, PC-ORD, MVSP) and produce tables of community data that can be imported into spreadsheet, database, graphics, statistics, and word-processing programs. The IDAS program facilitates the documentation of analyses by keeping a log of the data that are processed, the files that are generated, and the program settings used to process the data. Though the IDAS program was developed to process NAWQA Program invertebrate data downloaded from Bio-TDB, the Edit Data module includes tools that can be used to convert non-NAWQA data into Bio-TDB format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used to process data generated outside of the NAWQA Program.
pFlogger: The Parallel Fortran Logging Utility
NASA Technical Reports Server (NTRS)
Clune, Tom; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger)' similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
Autonomous Real Time Requirements Tracing
NASA Technical Reports Server (NTRS)
Plattsmier, George I.; Stetson, Howard K.
2014-01-01
One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Insuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(sup TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena and as such the operational requirements and testing will be more structured and rigorous. The autoprocedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASASTD- 8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(sup TM) Language for development of autonomous command and control software. The Timeliner- TLX(sup TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(sup TM) sequence as the line number reporting is embedded inside the Timeliner-TLX(sup TM) execution engine. This negates I/O processing of this type data as the line number status of executing sequences is built-in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real time requirements tracing will greatly assist the movement of autoprocedures to flight software enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders
Autonomous Real Time Requirements Tracing
NASA Technical Reports Server (NTRS)
Plattsmier, George; Stetson, Howard
2014-01-01
One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Insuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(sup TM) Auto- Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena and as such the operational requirements and testing will be more structured and rigorous. The autoprocedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASASTD- 8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(sup TM) Language for development of autonomous command and control software. The Timeliner-TLX(sup TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(sup TM) sequence as the line number reporting is embedded inside the Timeliner-TLX(sup TM) execution engine. This negates I/O processing of this type data as the line number status of executing sequences is built-in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real time requirements tracing will greatly assist the movement of autoprocedures to flight software enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
Greenville Bridge Reach, Bendway Weirs
2006-09-01
However, these receivers are more expensive and heavier due to the radio and batteries. For this study, two Magellan GPS ProMARK X-CP receivers were...used to collect float data. The Magellan GPS ProMARK X-CP is a small, robust, light receiver that can log 9 hr of both pseudorange and car- rier phase...require a high degree of accu- racy. Using post-processing software, pseudorange GPS data recorded by the ProMARK X-CP can be post-processed
2015-01-13
applying formal methods to systems software, e.g., IronClad [16] and seL4 [19], promise that this vision is not a fool’s er- rand after all. In this...kernel seL4 [19] is fully verified for functional correct- ness and it runs with other deprivileged services. How- ever, the verification process used...portion, which is non-trivial for theorem proving-based approaches. In our COSS example, adding the trusted network logging extensions to seL4 will
ANTP Protocol Suite Software Implementation Architecture in Python
2011-06-03
a popular platform of networking programming, an area in which C has traditionally dominated. 2 NetController AeroRP AeroNP AeroNP API AeroTP...visualisation of the running system. For example using the Google Maps API , the main logging web page can show all the running nodes in the system. By...communication between AeroNP and AeroRP and runs on the operating system as daemon. Furthermore, it creates an API interface to mange the communication between
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovery, monitoring, and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event logs generation in the Hospital Information System or HIS, developed at University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of event logs in the system with high quality. The performed analyses allowed for redefining functions in the system and proposed proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze the processes execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
Study on the stability of adrenaline and on the determination of its acidity constants
NASA Astrophysics Data System (ADS)
Corona-Avendaño, S.; Alarcón-Angeles, G.; Rojas-Hernández, A.; Romero-Romo, M. A.; Ramírez-Silva, M. T.
2005-01-01
In this work, the results are presented concerning the influence of time on the spectral behaviour of adrenaline (C 9H 13NO 3) (AD) and of the determination of its acidity constants by means of spectrophotometry titrations and point-by-point analysis, using for the latter freshly prepared samples for each analysis at every single pH. As the catecholamines are sensitive to light, all samples were protected against it during the course of the experiments. Each method rendered four acidity constants corresponding each to the four acid protons belonging to the functional groups present in the molecule; for the point-by-point analysis the values found were: log β 1=38.25±0.21 , log β 2=29.65±0.17 , log β 3=21.01±0.14 , log β 4=11.34±0.071 .
A Computer Program for Flow-Log Analysis of Single Holes (FLASH)
Day-Lewis, F. D.; Johnson, C.D.; Paillet, Frederick L.; Halford, K.J.
2011-01-01
A new computer program, FLASH (Flow-Log Analysis of Single Holes), is presented for the analysis of borehole vertical flow logs. The code is based on an analytical solution for steady-state multilayer radial flow to a borehole. The code includes options for (1) discrete fractures and (2) multilayer aquifers. Given vertical flow profiles collected under both ambient and stressed (pumping or injection) conditions, the user can estimate fracture (or layer) transmissivities and far-field hydraulic heads. FLASH is coded in Microsoft Excel with Visual Basic for Applications routines. The code supports manual and automated model calibration. ?? 2011, The Author(s). Ground Water ?? 2011, National Ground Water Association.
Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.
2007-01-01
In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
NASA Astrophysics Data System (ADS)
Aminfar, Ali; Mojtahedi, Alireza; Ahmadi, Hamid; Aminfar, Mohammad Hossain
2017-06-01
Among numerous offshore structures used in oil extraction, jacket platforms are still the most favorable ones in shallow waters. In such structures, log piles are used to pin the substructure of the platform to the seabed. The pile's geometrical and geotechnical properties are considered as the main parameters in designing these structures. In this study, ANSYS was used as the FE modeling software to study the geometrical and geotechnical properties of the offshore piles and their effects on supporting jacket platforms. For this purpose, the FE analysis has been done to provide the preliminary data for the fuzzy-logic post-process. The resulting data were implemented to create Fuzzy Inference System (FIS) classifications. The resultant data of the sensitivity analysis suggested that the orientation degree is the main factor in the pile's geometrical behavior because piles which had the optimal operational degree of about 5° are more sustained. Finally, the results showed that the related fuzzified data supported the FE model and provided an insight for extended offshore pile designs.
Using Log Linear Analysis for Categorical Family Variables.
ERIC Educational Resources Information Center
Moen, Phyllis
The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…
Goff, J.A.; Holliger, K.
1999-01-01
The main borehole of the German Continental Deep Drilling Program (KTB) extends over 9000 m into a crystalline upper crust consisting primarily of interlayered gneiss and metabasite. We present a joint analysis of the velocity and lithology logs in an effort to extract the lithology component of the velocity log. Covariance analysis of lithology log, approximated as a binary series, indicates that it may originate from the superposition of two Brownian stochastic processes (fractal dimension 1.5) with characteristic scales of ???2800 m and ???150 m, respectively. Covariance analysis of the velocity fluctuations provides evidence for the superposition of four stochastic process with distinct characteristic scales. The largest two scales are identical to those derived from the lithology, confirming that these scales of velocity heterogeneity are caused by lithology variations. The third characteristic scale, ???20 m, also a Brownian process, is probably related to fracturing based on correlation with the resistivity log. The superposition of these three Brownian processes closely mimics the commonly observed 1/k decay (fractal dimension 2.0) of the velocity power spectrum. The smallest scale process (characteristic scale ???1.7 m) requires a low fractal dimension, ???1.0, and accounts for ???60% of the total rms velocity variation. A comparison of successive logs from 6900-7140 m depth indicates that such variations are not repeatable and thus probably do not represent true velocity variations in the crust. The results of this study resolve disparity between the differing published estimates of seismic heterogeneity based on the KTB sonic logs, and bridge the gap between estimates of crustal heterogeneity from geologic maps and borehole logs. Copyright 1999 by the American Geophysical Union.
The ALMA software architecture
NASA Astrophysics Data System (ADS)
Schwarz, Joseph; Farris, Allen; Sommer, Heiko
2004-09-01
The software for the Atacama Large Millimeter Array (ALMA) is being developed by many institutes on two continents. The software itself will function in a distributed environment, from the 0.5-14 kmbaselines that separate antennas to the larger distances that separate the array site at the Llano de Chajnantor in Chile from the operations and user support facilities in Chile, North America and Europe. Distributed development demands 1) interfaces that allow separated groups to work with minimal dependence on their counterparts at other locations; and 2) a common architecture to minimize duplication and ensure that developers can always perform similar tasks in a similar way. The Container/Component model provides a blueprint for the separation of functional from technical concerns: application developers concentrate on implementing functionality in Components, which depend on Containers to provide them with services such as access to remote resources, transparent serialization of entity objects to XML, logging, error handling and security. Early system integrations have verified that this architecture is sound and that developers can successfully exploit its features. The Containers and their services are provided by a system-orienteddevelopment team as part of the ALMA Common Software (ACS), middleware that is based on CORBA.
An analysis technique for testing log grades
Carl A. Newport; William G. O' Regan
1963-01-01
An analytical technique that may be used in evaluating log-grading systems is described. It also provides means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.
Financial Indicators of Reduced Impact Logging Performance in Brazil: Case Study Comparisons
Thomas P. Holmes; Frederick Boltz; Douglas R. Carter
2001-01-01
Indicators of financial performance are compared for three case studies in the Brazilian Amazon. Each case study presents parameters obtained from monitoring initial harvest entries into primary forests for reduced impact logging (RIL) and conventional logging (CL) operations. Differences in cost definitions and data collection protocols complicate the analysis, and...
Hodges, Mary K.V.; Orr, Stephanie M.; Potter, Katherine E.; LeMaitre, Tynan
2012-01-01
This report, prepared in cooperation with the U.S. Department of Energy, summarizes construction, geophysical, and lithologic data collected from about 4,509 feet of core from seven boreholes deepened or drilled by the U.S. Geological Survey (USGS), Idaho National Laboratory (INL) Project Office, from 2006 to 2009 at the INL. USGS 103, 105, 108, and 131 were deepened and cored from 759 to 1,307 feet, 800 to 1,409 feet, 760 to 1,218 feet, and 808 to 1,239 feet, respectively. Boreholes USGS 135, NRF-15, and NRF-16 were drilled and continuously cored from land surface to 1,198, 759, and 425 feet, respectively. Cores were photographed and digitally logged by using commercially available software. Borehole descriptions summarize location, completion date, and amount and type of core recovered.
Analysis of web-related threats in ten years of logs from a scientific portal
NASA Astrophysics Data System (ADS)
Santos, Rafael D. C.; Grégio, André R. A.; Raddick, Jordan; Vattki, Vamsi; Szalay, Alex
2012-06-01
SkyServer is an Internet portal to data from the Sloan Digital Sky Survey, the largest online archive of astronomy data in the world. provides free access to hundreds of millions of celestial objects for science, education and outreach purposes. Logs of accesses to SkyServer comprise around 930 million hits, 140 million web services accesses and 170 million SQL submitted queries, collected over the past 10 years. These logs also contain indications of compromise attempts on the servers. In this paper, we show some threats that were detected in ten years of stored logs, and compare them with known threats in those years. Also, we present an analysis of the evolution of those threats over these years.
NASA Astrophysics Data System (ADS)
Madiraju, Praveen; Zhang, Yanqing
2002-03-01
When a user logs in to a website, behind the scenes the user leaves his/her impressions, usage patterns and also access patterns in the web servers log file. A web usage mining agent can analyze these web logs to help web developers to improve the organization and presentation of their websites. They can help system administrators in improving the system performance. Web logs provide invaluable help in creating adaptive web sites and also in analyzing the network traffic analysis. This paper presents the design and implementation of a Web usage mining agent for digging in to the web log files.
Real-time Geomagnetic Data from a Raspberry Pi Magnetometer Network in the United Kingdom
NASA Astrophysics Data System (ADS)
Case, N.; Beggan, C.; Marple, S. R.
2017-12-01
In 2014, BGS and the University of Lancaster won an STFC Public Engagement grant to build and deploy 10 Raspberry Pi magnetometers to secondary schools across the UK to enable citizen science. The system uses a Raspberry Pi computer as a logging and data transfer device, connected to a set of three orthogonal miniature fluxgate magnetometers. The system has a nominal sensitivity of around 1 nanoTesla (nT), in each component direction (North, East and Down). This is around twenty times less sensitive than a current scientific-level instrument, but given its relatively low-cost, at about £250 ($325) per unit, this is an excellent price-to-performance ratio given we could not improve the sensitivity unless we spent a lot more money. The magnetic data are sampled at a 5 second cadence and sent to the AuroraWatch website at Lancaster University every 2 minutes. The data are freely available to view and download. The primary aim of the project is to encourage students from 14-18 years old to look at how sensors can be used to collect geophysical data and integrate it together to give a wider understanding of physical phenomena. A second aim is to provide useful data on the spatial variation of the magnetic field for analysis of geomagnetic storms, alongside data from the BGS observatory and University of Lancaster's SAMNET variometer network. We show results from the build, testing and running of the sensors including some recent storms and we reflect on our experiences in engaging schools and the general public with information about the magnetic field. The information to build the system and logging and analysis software for the Raspberry Pi is all freely available, allowing those interested to participate in the project as citizen scientists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stathakis, S; Defoor, D; Linden, P
Purpose: To study the frequency of Multi-Leaf Collimator (MLC) leaf failures, investigate methods to predict them and reduce linac downtime. Methods: A Varian HD120 MLC was used in our study. The hyperterminal MLC errors logged from 06/2012 to 12/2014 were collected. Along with the hyperterminal errors, the MLC motor changes and all other MLC interventions by the linear accelerator engineer were recorded. The MLC dynalog files were also recorded on a daily basis for each treatment and during linac QA. The dynalog files were analyzed to calculate root mean square errors (RMS) and cumulative MLC travel distance per motor. Anmore » in-house MatLab code was used to analyze all dynalog files, record RMS errors and calculate the distance each MLC traveled per day. Results: A total of 269 interventions were recorded over a period of 18 months. Of these, 146 included MLC motor leaf change, 39 T-nut replacements, and 84 MLC cleaning sessions. Leaves close to the middle of each side required the most maintenance. In the A bank, leaves A27 to A40 recorded 73% of all interventions, while the same leaves in the B bank counted for 52% of the interventions. On average, leaves in the middle of the bank had their motors changed approximately every 1500m of travel. Finally, it was found that the number of RMS errors increased prior to an MLC motor change. Conclusion: An MLC dynalog file analysis software was developed that can be used to log daily MLC usage. Our eighteen-month data analysis showed that there is a correlation between the distance an MLC travels, the RMS and the life of the MLC motor. We plan to use this tool to predict MLC motor failures and with proper and timely intervention, reduce the downtime of the linac during clinical hours.« less
NASA Astrophysics Data System (ADS)
Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko
2017-02-01
A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases. This work was partly presented at the 58th Annual meeting of American Association of Physicists in Medicine.
Shiinoki, Takehiro; Hanazawa, Hideki; Yuasa, Yuki; Fujimoto, Koya; Uehara, Takuya; Shibuya, Keiko
2017-02-21
A combined system comprising the TrueBeam linear accelerator and a new real-time tumour-tracking radiotherapy system, SyncTraX, was installed at our institution. The objectives of this study are to develop a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine electronic portal image device (EPID) images and a log file and to verify this treatment in clinical cases. Respiratory-gated radiotherapy was performed using TrueBeam and the SyncTraX system. Cine EPID images and a log file were acquired for a phantom and three patients during the course of the treatment. Digitally reconstructed radiographs (DRRs) were created for each treatment beam using a planning CT set. The cine EPID images, log file, and DRRs were analysed using a developed software. For the phantom case, the accuracy of the proposed method was evaluated to verify the respiratory-gated radiotherapy. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker used as an internal surrogate were calculated to evaluate the gating accuracy and set-up uncertainty in the superior-inferior (SI), anterior-posterior (AP), and left-right (LR) directions. The proposed method achieved high accuracy for the phantom verification. For the clinical cases, the intra- and inter-fractional variations of the fiducial marker were ⩽3 mm and ±3 mm in the SI, AP, and LR directions. We proposed a method for the verification of respiratory-gated radiotherapy with SyncTraX using cine EPID images and a log file and showed that this treatment is performed with high accuracy in clinical cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fabain, R.T.
1994-05-16
A rock strength analysis program, through intensive log analysis, can quantify rock hardness in terms of confined compressive strength to identify intervals suited for drilling with polycrystalline diamond compact (PDC) bits. Additionally, knowing the confined compressive strength helps determine the optimum PDC bit for the intervals. Computing rock strength as confined compressive strength can more accurately characterize a rock's actual hardness downhole than other methods. the information can be used to improve bit selections and to help adjust drilling parameters to reduce drilling costs. Empirical data compiled from numerous field strength analyses have provided a guide to selecting PDC drillmore » bits. A computer analysis program has been developed to aid in PDC bit selection. The program more accurately defines rock hardness in terms of confined strength, which approximates the in situ rock hardness downhole. Unconfined compressive strength is rock hardness at atmospheric pressure. The program uses sonic and gamma ray logs as well as numerous input data from mud logs. Within the range of lithologies for which the program is valid, rock hardness can be determine with improved accuracy. The program's output is typically graphed in a log format displaying raw data traces from well logs, computer-interpreted lithology, the calculated values of confined compressive strength, and various optional rock mechanic outputs.« less
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
Papadimitriou, Konstantinos I; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4(th)) order topology.
Bicknell, Jake E; Struebig, Matthew J; Davies, Zoe G; Baraloto, Christopher
2015-01-01
Over 20% of the world's tropical forests have been selectively logged, and large expanses are allocated for future timber extraction. Reduced-impact logging (RIL) is being promoted as best practice forestry that increases sustainability and lowers CO2 emissions from logging, by reducing collateral damage associated with timber extraction. RIL is also expected to minimize the impacts of selective logging on biodiversity, although this is yet to be thoroughly tested. We undertake the most comprehensive study to date to investigate the biodiversity impacts of RIL across multiple taxonomic groups. We quantified birds, bats and large mammal assemblage structures, using a before-after control-impact (BACI) design across 20 sample sites over a 5-year period. Faunal surveys utilized point counts, mist nets and line transects and yielded >250 species. We examined assemblage responses to logging, as well as partitions of feeding guild and strata (understorey vs. canopy), and then tested for relationships with logging intensity to assess the primary determinants of community composition. Community analysis revealed little effect of RIL on overall assemblages, as structure and composition were similar before and after logging, and between logging and control sites. Variation in bird assemblages was explained by natural rates of change over time, and not logging intensity. However, when partitioned by feeding guild and strata, the frugivorous and canopy bird ensembles changed as a result of RIL, although the latter was also associated with change over time. Bats exhibited variable changes post-logging that were not related to logging, whereas large mammals showed no change at all. Indicator species analysis and correlations with logging intensities revealed that some species exhibited idiosyncratic responses to RIL, whilst abundance change of most others was associated with time. Synthesis and applications. Our study demonstrates the relatively benign effect of reduced-impact logging (RIL) on birds, bats and large mammals in a neotropical forest context, and therefore, we propose that forest managers should improve timber extraction techniques more widely. If RIL is extensively adopted, forestry concessions could represent sizeable and important additions to the global conservation estate – over 4 million km2. PMID:25954054
SU-E-T-392: Evaluation of Ion Chamber/film and Log File Based QA to Detect Delivery Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, C; Mason, B; Kirsner, S
2015-06-15
Purpose: Ion chamber and film (ICAF) is a method used to verify patient dose prior to treatment. More recently, log file based QA has been shown as an alternative for measurement based QA. In this study, we delivered VMAT plans with and without errors to determine if ICAF and/or log file based QA was able to detect the errors. Methods: Using two VMAT patients, the original treatment plan plus 7 additional plans with delivery errors introduced were generated and delivered. The erroneous plans had gantry, collimator, MLC, gantry and collimator, collimator and MLC, MLC and gantry, and gantry, collimator, andmore » MLC errors. The gantry and collimator errors were off by 4{sup 0} for one of the two arcs. The MLC error introduced was one in which the opening aperture didn’t move throughout the delivery of the field. For each delivery, an ICAF measurement was made as well as a dose comparison based upon log files. Passing criteria to evaluate the plans were ion chamber less and 5% and film 90% of pixels pass the 3mm/3% gamma analysis(GA). For log file analysis 90% of voxels pass the 3mm/3% 3D GA and beam parameters match what was in the plan. Results: Two original plans were delivered and passed both ICAF and log file base QA. Both ICAF and log file QA met the dosimetry criteria on 4 of the 12 erroneous cases analyzed (2 cases were not analyzed). For the log file analysis, all 12 erroneous plans alerted a mismatch in delivery versus what was planned. The 8 plans that didn’t meet criteria all had MLC errors. Conclusion: Our study demonstrates that log file based pre-treatment QA was able to detect small errors that may not be detected using an ICAF and both methods of were able to detect larger delivery errors.« less
Advanced Cyber Attack Modeling Analysis and Visualization
2010-03-01
Graph Analysis Network Web Logs Netflow Data TCP Dump Data System Logs Detect Protect Security Management What-If Figure 8. TVA attack graphs for...Clustered Graphs,” in Proceedings of the Symposium on Graph Drawing, September 1996. [25] K. Lakkaraju, W. Yurcik, A. Lee, “NVisionIP: NetFlow
2010-01-01
Background The development of DNA microarrays has facilitated the generation of hundreds of thousands of transcriptomic datasets. The use of a common reference microarray design allows existing transcriptomic data to be readily compared and re-analysed in the light of new data, and the combination of this design with large datasets is ideal for 'systems'-level analyses. One issue is that these datasets are typically collected over many years and may be heterogeneous in nature, containing different microarray file formats and gene array layouts, dye-swaps, and showing varying scales of log2- ratios of expression between microarrays. Excellent software exists for the normalisation and analysis of microarray data but many data have yet to be analysed as existing methods struggle with heterogeneous datasets; options include normalising microarrays on an individual or experimental group basis. Our solution was to develop the Batch Anti-Banana Algorithm in R (BABAR) algorithm and software package which uses cyclic loess to normalise across the complete dataset. We have already used BABAR to analyse the function of Salmonella genes involved in the process of infection of mammalian cells. Results The only input required by BABAR is unprocessed GenePix or BlueFuse microarray data files. BABAR provides a combination of 'within' and 'between' microarray normalisation steps and diagnostic boxplots. When applied to a real heterogeneous dataset, BABAR normalised the dataset to produce a comparable scaling between the microarrays, with the microarray data in excellent agreement with RT-PCR analysis. When applied to a real non-heterogeneous dataset and a simulated dataset, BABAR's performance in identifying differentially expressed genes showed some benefits over standard techniques. Conclusions BABAR is an easy-to-use software tool, simplifying the simultaneous normalisation of heterogeneous two-colour common reference design cDNA microarray-based transcriptomic datasets. We show BABAR transforms real and simulated datasets to allow for the correct interpretation of these data, and is the ideal tool to facilitate the identification of differentially expressed genes or network inference analysis from transcriptomic datasets. PMID:20128918
ERIC Educational Resources Information Center
Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.
2000-01-01
These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phisherman is an online software tool that was created to help experimenters study phishing. It can potentially be re-purposed to run other human studies. Phisherman enables studies to be run online, so that users can participate from their own computers. This means that experimenters can get data from subjects in their natural settings. Alternatively, an experimenter can also run the app online in a lab-based setting, if that is desired. The software enables the online deployment of a study that is comprised of three main parts: (1) a consent page, (2) a survey, and (3) an identification task, with instruction/transitionmore » screens between each part, allowing the experimenter to provide the user with instructions and messages. Upon logging in, the subject is taken to the permission page, where they agree to or do not agree to take part in the study. If the subject agrees to participate, then the software randomly chooses between doing the survey first (and identification task second) or the identification task first (and survey second). This is to balance possible order effects in the data. Procedurally, in the identification task, the software shows the stimuli to the subject, and asks if she thinks it is a phish (yes/no) and how confident she is about her answer. The subject is given 5 levels of certainty to select from, labeled "low" (1), to "medium" (3), to "high" (5), with the option of picking a level between low and medium (2), and between medium and high (4). After selecting his/her confidence level, then the "Next" button activates, allowing a user to move to the next email. The software saves a given subject's progress in the identification task, so that she may log in and out of the site. The consent page is a space for the experimenter to provide the subject with human studies board /internal review board information, and to formally consent to participate in the study. The survey is a space for the experimenter to provide questions and spaces for the users to input answers (allowing both multiple-choice and free-answer options). Phisherman includes administrative pages for managing the stimuli and users. This includes a tool for the experimenter to create, preview, edit, delete (if desired), and manage stimuli (emails). The stimuli may include pictures (uploaded to an appropriate folder) and links, for realism. The software includes a safety feature that prevents the user from going to any link location or opening a file/image. Instead of re-directing the subject's browser, the software provides a pop-up box with the URL location of where the user would have gone. Another administrative page may be used to create fake subject accounts for testing the software prior to deployment, as well as to delete subject accounts when necessary. Data from the experiment can be downloaded from another administrative page.« less
Swanson, William H.; Horner, Douglas G.; Dul, Mitchell W.; Malinovsky, Victor E.
2014-01-01
Purpose To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Methods Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Results Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively (F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively (P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower (P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Conclusions Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. Translational Relevance The guidelines should allow developers to choose from a wide range of stimuli. PMID:25371855
Swanson, William H; Horner, Douglas G; Dul, Mitchell W; Malinovsky, Victor E
2014-09-01
To develop guidelines for engineering perimetric stimuli to reduce test-retest variability in glaucomatous defects. Perimetric testing was performed on one eye for 62 patients with glaucoma and 41 age-similar controls on size III and frequency-doubling perimetry and three custom tests with Gaussian blob and Gabor sinusoid stimuli. Stimulus range was controlled by values for ceiling (maximum sensitivity) and floor (minimum sensitivity). Bland-Altman analysis was used to derive 95% limits of agreement on test and retest, and bootstrap analysis was used to test the hypotheses about peak variability. Limits of agreement for the three custom stimuli were similar in width (0.72 to 0.79 log units) and peak variability (0.22 to 0.29 log units) for a stimulus range of 1.7 log units. The width of the limits of agreement for size III decreased from 1.78 to 1.37 to 0.99 log units for stimulus ranges of 3.9, 2.7, and 1.7 log units, respectively ( F = 3.23, P < 0.001); peak variability was 0.99, 0.54, and 0.34 log units, respectively ( P < 0.01). For a stimulus range of 1.3 log units, limits of agreement were narrowest with Gabor and widest with size III stimuli, and peak variability was lower ( P < 0.01) with Gabor (0.18 log units) and frequency-doubling perimetry (0.24 log units) than with size III stimuli (0.38 log units). Test-retest variability in glaucomatous visual field defects was substantially reduced by engineering the stimuli. The guidelines should allow developers to choose from a wide range of stimuli.
Kay, Robert T.; Mills, Patrick C.; Dunning, Charles P.; Yeskis, Douglas J.; Ursic, James R.; Vendl, Mark
2004-01-01
The effectiveness of 28 methods used to characterize the fractured Galena-Platteville aquifer at eight sites in northern Illinois and Wisconsin is evaluated. Analysis of government databases, previous investigations, topographic maps, aerial photographs, and outcrops was essential to understanding the hydrogeology in the area to be investigated. The effectiveness of surface-geophysical methods depended on site geology. Lithologic logging provided essential information for site characterization. Cores were used for stratigraphy and geotechnical analysis. Natural-gamma logging helped identify the effect of lithology on the location of secondary- permeability features. Caliper logging identified large secondary-permeability features. Neutron logs identified trends in matrix porosity. Acoustic-televiewer logs identified numerous secondary-permeability features and their orientation. Borehole-camera logs also identified a number of secondary-permeability features. Borehole ground-penetrating radar identified lithologic and secondary-permeability features. However, the accuracy and completeness of this method is uncertain. Single-point-resistance, density, and normal resistivity logs were of limited use. Water-level and water-quality data identified flow directions and indicated the horizontal and vertical distribution of aquifer permeability and the depth of the permeable features. Temperature, spontaneous potential, and fluid-resistivity logging identified few secondary-permeability features at some sites and several features at others. Flowmeter logging was the most effective geophysical method for characterizing secondary-permeability features. Aquifer tests provided insight into the permeability distribution, identified hydraulically interconnected features, the presence of heterogeneity and anisotropy, and determined effective porosity. Aquifer heterogeneity prevented calculation of accurate hydraulic properties from some tests. Different methods, such as flowmeter logging and slug testing, occasionally produced different interpretations. Aquifer characterization improved with an increase in the number of data points, the period of data collection, and the number of methods used.
Management of an affiliated Physics Residency Program using a commercial software tool.
Zacarias, Albert S; Mills, Michael D
2010-06-01
A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years.
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
Analysis of Effects of Sensor Multithreading to Generate Local System Event Timelines
2014-03-27
works on logs highlights the importance of logs [17, 18]. The two aforementioned works both reference the same 2009 Data Breach Investigations Report...the data breaches report on, the logs contained evidence of events leading up to 82% of those data breaches . This means that preventing 82% of the data ...report states that of the data breaches reported on, the logs contained evidence of events leading up to 66% of those data breaches . • The 2010 DBIR
Reducing Uncertainties in Hydrocarbon Prediction through Application of Elastic Domain
NASA Astrophysics Data System (ADS)
Shamsuddin, S. Z.; Hermana, M.; Ghosh, D. P.; Salim, A. M. A.
2017-10-01
The application of lithology and fluid indicators has helped the geophysicists to discriminate reservoirs to non-reservoirs from a field. This analysis is conducted to select the most suitable lithology and fluid indicator for the Malaysian basins that could lead to better eliminate pitfalls of amplitude. This paper uses different rock physics analysis such as elastic impedance, Lambda-Mu-Rho, and SQp-SQs attribute. Litho-elastic impedance log is generated by correlating the gamma ray log with extended elastic impedance log. The same application is used for fluid-elastic impedance by correlation of EEI log with water saturation or resistivity. The work is done on several well logging data collected from different fields in Malay basin and its neighbouring basin. There's an excellent separation between hydrocarbon sand and background shale for Well-1 from different cross-plot analysis. Meanwhile, the Well-2 shows good separation in LMR plot. The similar method is done on the Well-3 shows fair separation of silty sand and gas sand using SQp-SQs attribute which can be correlated with well log. Based on the point distribution histogram plot, different lithology and fluid can be separated clearly. Simultaneous seismic inversion results in acoustic impedance, Vp/Vs, SQp, and SQs volumes. There are many attributes available in the industry used to separate the lithology and fluid, however some of the methods are not suitable for the application to the basins in Malaysia.
Assessing the feasibility and profitability of cable logging in southern upland hardwood forests
Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann
1995-01-01
Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...
Selective logging in the Brazilian Amazon.
Asner, Gregory P; Knapp, David E; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Silva, Jose N
2005-10-21
Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of approximately 0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.
Dzurová, Lenka; Forneris, Federico; Savino, Simone; Galuszka, Petr; Vrabka, Josef; Frébort, Ivo
2015-08-01
The recently discovered cytokinin (CK)-specific phosphoribohydrolase "Lonely Guy" (LOG) is a key enzyme of CK biosynthesis, converting inactive CK nucleotides into biologically active free bases. We have determined the crystal structures of LOG from Claviceps purpurea (cpLOG) and its complex with the enzymatic product phosphoribose. The structures reveal a dimeric arrangement of Rossmann folds, with the ligands bound to large pockets at the interface between cpLOG monomers. Structural comparisons highlight the homology of cpLOG to putative lysine decarboxylases. Extended sequence analysis enabled identification of a distinguishing LOG sequence signature. Taken together, our data suggest phosphoribohydrolase activity for several proteins of unknown function. © 2015 Wiley Periodicals, Inc.
Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models
NASA Astrophysics Data System (ADS)
Chu, A.
2014-12-01
Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and the temporal and spatial parameters that govern triggering effects by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Despite other computer programs exist for similar data modeling purpose, using EM-algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence and problems with flat or multi-modal log-likelihood functions encounter similar issues. My program uses a robust method to preset a parameter to overcome the non-convergence computational issue. In addition to model fitting, the software is equipped with useful tools for examining modeling fitting results, for example, visualization of estimated conditional intensity, and estimation of expected number of triggered aftershocks. A simulation generator is also given with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has potential to be hosted online. Java language is used for the software's core computing part and an optional interface to the statistical package R is provided.
EARS : Repositioning data management near data acquisition.
NASA Astrophysics Data System (ADS)
Sinquin, Jean-Marc; Sorribas, Jordi; Diviacco, Paolo; Vandenberghe, Thomas; Munoz, Raquel; Garcia, Oscar
2016-04-01
The EU FP7 Projects Eurofleets and Eurofleets2 are an European wide alliance of marine research centers that aim to share their research vessels, to improve information sharing on planned, current and completed cruises, on details of ocean-going research vessels and specialized equipment, and to durably improve cost-effectiveness of cruises. Within this context logging of information on how, when and where anything happens on board of the vessel is crucial information for data users in a later stage. This forms a primordial step in the process of data quality control as it could assist in the understanding of anomalies and unexpected trends recorded in the acquired data sets. In this way completeness of the metadata is improved as it is recorded accurately at the origin of the measurement. The collection of this crucial information has been done in very different ways, using different procedures, formats and pieces of software in the context of the European Research Fleet. At the time that the Eurofleets project started, every institution and country had adopted different strategies and approaches, which complicated the task of users that need to log general purpose information and events on-board whenever they access a different platform loosing the opportunity to produce this valuable metadata on-board. Among the many goals the Eurofleets project has, a very important task is the development of an "event log software" called EARS (Eurofleets Automatic Reporting System) that enables scientists and operators to record what happens during a survey. EARS will allow users to fill, in a standardized way, the gap existing at the moment in metadata description that only very seldom links data with its history. Events generated automatically by acquisition instruments will also be handled, enhancing the granularity and precision of the event annotation. The adoption of a common procedure to log survey events and a common terminology to describe them is crucial to provide a friendly and successfully metadata on-board creation procedure for the whole the European Fleet. The possibility of automatically reporting metadata and general purpose data, will simplify the work of scientists and data managers with regards to data transmission. An improved accuracy and completeness of metadata is expected when events are recorded at acquisition time. This will also enhance multiple usages of the data as it allows verification of the different requirements existing in different disciplines.
Sowan, Azizeh Khaled; Reed, Charles Calhoun; Staggers, Nancy
2016-09-30
Large datasets of the audit log of modern physiologic monitoring devices have rarely been used for predictive modeling, capturing unsafe practices, or guiding initiatives on alarm systems safety. This paper (1) describes a large clinical dataset using the audit log of the physiologic monitors, (2) discusses benefits and challenges of using the audit log in identifying the most important alarm signals and improving the safety of clinical alarm systems, and (3) provides suggestions for presenting alarm data and improving the audit log of the physiologic monitors. At a 20-bed transplant cardiac intensive care unit, alarm data recorded via the audit log of bedside monitors were retrieved from the server of the central station monitor. Benefits of the audit log are many. They include easily retrievable data at no cost, complete alarm records, easy capture of inconsistent and unsafe practices, and easy identification of bedside monitors missed from a unit change of alarm settings adjustments. Challenges in analyzing the audit log are related to the time-consuming processes of data cleaning and analysis, and limited storage and retrieval capabilities of the monitors. The audit log is a function of current capabilities of the physiologic monitoring systems, monitor's configuration, and alarm management practices by clinicians. Despite current challenges in data retrieval and analysis, large digitalized clinical datasets hold great promise in performance, safety, and quality improvement. Vendors, clinicians, researchers, and professional organizations should work closely to identify the most useful format and type of clinical data to expand medical devices' log capacity.
Statistical models for the analysis and design of digital polymerase chain (dPCR) experiments
Dorazio, Robert; Hunter, Margaret
2015-01-01
Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary, log–log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model’s parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J
2017-12-01
qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (C q ) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the logscale as long as possible by working with log 10 (E) ∙ C q , which we call the efficiency-weighted C q value; subsequent statistical analyses are then applied in the logscale. We show how efficiency-weighted C q values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
NASA Astrophysics Data System (ADS)
Shenar, T.; Oskinova, L. M.; Järvinen, S. P.; Luckas, P.; Hainich, R.; Todt, H.; Hubrig, S.; Sander, A. A. C.; Ilyin, I.; Hamann, W.-R.
2017-10-01
Context. HD 54879 (O9.7 V) is one of a dozen O-stars for which an organized atmospheric magnetic field has been detected. Despite their importance, little is known about the winds and evolution of magnetized massive stars. Aims: To gain insights into the interplay between atmospheres, winds, and magnetic fields of massive stars, we acquired UV and X-ray data of HD 54879 using the Hubble Space Telescope and the XMM-Newton satellite. In addition, 35 optical amateur spectra were secured to study the variability of HD 54879. Methods: A multiwavelength (X-ray to optical) spectral analysis is performed using the Potsdam Wolf-Rayet (PoWR) model atmosphere code and the xspec software. Results: The photospheric parameters (T∗ = 30.5 kK, log g = 4.0 [cm s-2], log L = 4.45 [L⊙]) are typical for an O9.7 V star. The microturbulent, macroturbulent, and projected rotational velocities are lower than previously suggested (ξph,vmac,vsini ≤ 4 km s-1). An initial mass of 16 M⊙ and an age of 5 Myr are inferred from evolutionary tracks. We derive a mean X-ray emitting temperature of log TX = 6.7 [K] and an X-ray luminosity of LX = 1 × 1032 erg s-1. Short- and long-scale variability is seen in the Hα line, but only a very long period of P ≈ 5 yr could be estimated. Assessing the circumstellar density of HD 54879 using UV spectra, we can roughly estimate the mass-loss rate HD 54879 would have in the absence of a magnetic field as log ṀB = 0 ≈ -9.0 [M⊙ yr-1]. The magnetic field traps the stellar wind up to the Alfvén radius rA ≳ 12 R∗, implying that its true mass-loss rate is log Ṁ ≲ -10.2 [M⊙ yr-1]. Hence, density enhancements around magnetic stars can be exploited to estimate mass-loss rates of non-magnetic stars of similar spectral types, essential for resolving the weak wind problem. Conclusions: Our study confirms that strongly magnetized stars lose little or no mass, and supplies important constraints on the weak-wind problem of massive main sequence stars. Based on observations obtained with XMM-Newton, an ESA science mission with instruments and contributions directly funded by ESA Member States and NASA.
Collecting conditions usage metadata to optimize current and future ATLAS software and processing
NASA Astrophysics Data System (ADS)
Rinaldi, L.; Barberis, D.; Formica, A.; Gallas, E. J.; Oda, S.; Rybkin, G.; Verducci, M.; ATLAS Collaboration
2017-10-01
Conditions data (for example: alignment, calibration, data quality) are used extensively in the processing of real and simulated data in ATLAS. The volume and variety of the conditions data needed by different types of processing are quite diverse, so optimizing its access requires a careful understanding of conditions usage patterns. These patterns can be quantified by mining representative log files from each type of processing and gathering detailed information about conditions usage for that type of processing into a central repository.
2013-01-01
under Contract No. FA8721-05- C -0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded...logging capabilities or further modify the control to best suit its needs. 1.1 Audience and Structure of This Report This report is a hands -on guide...the follow- ing directory: C :\\Admin_Tools\\USB_Audit\\ When selecting a deployment path, avoid using spaces in directory names since this will cause
TENSOR DECOMPOSITIONS AND SPARSE LOG-LINEAR MODELS
Johndrow, James E.; Bhattacharya, Anirban; Dunson, David B.
2017-01-01
Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. We derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions. PMID:29332971
Serum Spot 14 concentration is negatively associated with thyroid-stimulating hormone level
Chen, Yen-Ting; Tseng, Fen-Yu; Chen, Pei-Lung; Chi, Yu-Chao; Han, Der-Sheng; Yang, Wei-Shiung
2016-01-01
Abstract Spot 14 (S14) is a protein involved in fatty acid synthesis and was shown to be induced by thyroid hormone in rat liver. However, the presence of S14 in human serum and its relations with thyroid function status have not been investigated. The objectives of this study were to compare serum S14 concentrations in patients with hyperthyroidism or euthyroidism and to evaluate the associations between serum S14 and free thyroxine (fT4) or thyroid-stimulating hormone (TSH) levels. We set up an immunoassay for human serum S14 concentrations and compared its levels between hyperthyroid and euthyroid subjects. Twenty-six hyperthyroid patients and 29 euthyroid individuals were recruited. Data of all patients were pooled for the analysis of the associations between the levels of S14 and fT4, TSH, or quartile of TSH. The hyperthyroid patients had significantly higher serum S14 levels than the euthyroid subjects (median [Q1, Q3]: 975 [669, 1612] ng/mL vs 436 [347, 638] ng/mL, P < 0.001). In univariate linear regression, the log-transformed S14 level (logS14) was positively associated with fT4 but negatively associated with creatinine (Cre), total cholesterol (T-C), triglyceride (TG), low-density lipoprotein cholesterol (LDL-C), and TSH. The positive associations between logS14 and fT4 and the negative associations between logS14 and Cre, TG, T-C, or TSH remained significant after adjustment with sex and age. These associations were prominent in females but not in males. The logS14 levels were negatively associated with the TSH levels grouped by quartile (ß = −0.3020, P < 0.001). The association between logS14 and TSH quartile persisted after adjustment with sex and age (ß = −0.2828, P = 0.001). In stepwise multivariate regression analysis, only TSH grouped by quartile remained significantly associated with logS14 level. We developed an ELISA to measure serum S14 levels in human. Female patients with hyperthyroidism had higher serum S14 levels than the female subjects with euthyroidism. The serum logS14 concentrations were negatively associated with TSH levels. Changes of serum S14 level in the whole thyroid function spectrum deserve further investigation. PMID:27749565
Monitoring User Search Success through Transaction Log Analysis: The WolfPAC Example.
ERIC Educational Resources Information Center
Zink, Steven D.
1991-01-01
Describes the use of transaction log analysis of the online catalog at the University of Nevada, Reno, libraries to help evaluate reasons for unsuccessful user searches. Author, title, and subject searches are examined; problems with Library of Congress subject headings are discussed; and title keyword searching is suggested. (11 references) (LRW)
NASA Astrophysics Data System (ADS)
Naito, K.; Park, J.
2012-12-01
The Nankai Trough off southwest Japan is one of the best subduction-zone to study megathrust earthquake mechanism. Huge earthquakes have been repeated in the cycle of 100-150 years in the area, and in these days the next emergence of the earthquake becomes one of the most serious issue in Japan. Therefore, detailed descriptions of geological structure are urgently needed there. IODP (Integrated Ocean Drilling Program) have investigated this area in the NanTroSEIZE science plan. Seismic reflection, core sampling and borehole logging surveys have been executed during the NanTroSEIZE expeditions. Core-log-seismic data integration (CLSI) is useful for understanding the Nankai seismogenic zone. We use the seismic inversion method to do the CLSI. The seismic inversion (acoustic impedance inversion, A.I. inversion) is a method to estimate rock physical properties using seismic reflection and logging data. Acoustic impedance volume is inverted for seismic data with density and P-wave velocity of several boreholes with the technique. We use high-resolution 3D multi-channel seismic (MCS) reflection data obtained during KR06-02 cruise in 2006, and measured core sample properties by IODP Expeditions 322 and 333. P-wave velocities missing for some core sample are interpolated by the relationship between acoustic impedance and P-wave velocity. We used Hampson-Russell software for the seismic inversion. 3D porosity model is derived from the 3D acoustic impedance model to figure out rock physical properties of the incoming sedimentary sequence in the Nankai Trough off Kumano Basin. The result of our inversion analysis clearly shows heterogeneity of sediments; relatively high porosity sediments on the shallow layer of Kashinosaki Knoll, and distribution of many physical anomaly bands on volcanic and turbidite sediment layers around the 3D MCS survey area. In this talk, we will show 3D MCS, acoustic impedance, and porosity data for the incoming sedimentary sequence and discuss its possible implications for the Nankai seismogenic behavior.
Ingham, Steven C; Fanslau, Melody A; Burnham, Greg M; Ingham, Barbara H; Norback, John P; Schaffner, Donald W
2007-06-01
A computer-based tool (available at: www.wisc.edu/foodsafety/meatresearch) was developed for predicting pathogen growth in raw pork, beef, and poultry meat. The tool, THERM (temperature history evaluation for raw meats), predicts the growth of pathogens in pork and beef (Escherichia coli O157:H7, Salmonella serovars, and Staphylococcus aureus) and on poultry (Salmonella serovars and S. aureus) during short-term temperature abuse. The model was developed as follows: 25-g samples of raw ground pork, beef, and turkey were inoculated with a five-strain cocktail of the target pathogen(s) and held at isothermal temperatures from 10 to 43.3 degrees C. Log CFU per sample data were obtained for each pathogen and used to determine lag-phase duration (LPD) and growth rate (GR) by DMFit software. The LPD and GR were used to develop the THERM predictive tool, into which chronological time and temperature data for raw meat processing and storage are entered. The THERM tool then predicts a delta log CFU value for the desired pathogen-product combination. The accuracy of THERM was tested in 20 different inoculation experiments that involved multiple products (coarse-ground beef, skinless chicken breast meat, turkey scapula meat, and ground turkey) and temperature-abuse scenarios. With the time-temperature data from each experiment, THERM accurately predicted the pathogen growth and no growth (with growth defined as delta log CFU > 0.3) in 67, 85, and 95% of the experiments with E. coli 0157:H7, Salmonella serovars, and S. aureus, respectively, and yielded fail-safe predictions in the remaining experiments. We conclude that THERM is a useful tool for qualitatively predicting pathogen behavior (growth and no growth) in raw meats. Potential applications include evaluating process deviations and critical limits under the HACCP (hazard analysis critical control point) system.
Soulsby, David; Chica, Jeryl A M
2017-08-01
We have developed a simple, direct and novel method for the determination of partition coefficients and partitioning behavior using 1 H NMR spectroscopy combined with time domain complete reduction to amplitude-frequency tables (CRAFT). After partitioning into water and 1-octanol using standard methods, aliquots from each layer are directly analyzed using either proton or selective excitation NMR experiments. Signal amplitudes for each compound from each layer are then extracted directly from the time domain data in an automated fashion and analyzed using the CRAFT software. From these amplitudes, log P and log D 7.4 values can be calculated directly. Phase, baseline and internal standard issues, which can be problematic when Fourier transformed data are used, are unimportant when using time domain data. Furthermore, analytes can contain impurities because only a single resonance is examined and need not be UV active. Using this approach, we examined a variety of pharmaceutically relevant compounds and determined partition coefficients that are in excellent agreement with literature values. To demonstrate the utility of this approach, we also examined salicylic acid in more detail demonstrating an aggregation effect as a function of sample loading and partition coefficient behavior as a function of pH value. This method provides a valuable addition to the medicinal chemist toolbox for determining these important constants. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.
Hezarjaribi, Mehrnoosh; Ardestani, Fatemeh; Ghorbani, Hamid Reza
2016-08-01
Saccharomyces cerevisiae PTCC5269 growth was evaluated to specify an optimum culture medium to reach the highest protein production. Experiment design was conducted using a fraction of the full factorial methodology, and signal to noise ratio was used for results analysis. Maximum cell of 8.84 log (CFU/mL) was resulted using optimized culture composed of 0.3, 0.15, 1, and 50 g L(-1) of ammonium sulfate, iron sulfate, glycine, and glucose, respectively at 300 rpm and 35 °C. Glycine concentration (39.32 % contribution) and glucose concentration (36.15 % contribution) were determined as the most effective factors on the biomass production, while Saccharomyces cerevisiae growth had showed the least dependence on ammonium sulfate (5.2 % contribution) and iron sulfate (19.28 % contribution). The most interaction was diagnosed between ammonium sulfate and iron sulfate concentrations with interaction severity index of 50.71 %, while the less one recorded for glycine and glucose concentration was equal to 8.12 %. An acceptable consistency of 84.26 % was obtained between optimum theoretical cell numbers determined by software of 8.91 log (CFU/mL), and experimentally measured one at optimal condition confirms the suitability of the applied method. High protein content of 44.6 % using optimum culture suggests that Saccharomyces cerevisiae is a good commercial case for single cell protein production.
NASA Astrophysics Data System (ADS)
Kim, Taeyoun; Hwang, Seho; Jang, Seonghyung
2017-01-01
When finding the "sweet spot" of a shale gas reservoir, it is essential to estimate the brittleness index (BI) and total organic carbon (TOC) of the formation. Particularly, the BI is one of the key factors in determining the crack propagation and crushing efficiency for hydraulic fracturing. There are several methods for estimating the BI of a formation, but most of them are empirical equations that are specific to particular rock types. We estimated the mineralogical BI based on elemental capture spectroscopy (ECS) log and elastic BI based on well log data, and we propose a new method for predicting S-wave velocity (VS) using mineralogical BI and elastic BI. The TOC is related to the gas content of shale gas reservoirs. Since it is difficult to perform core analysis for all intervals of shale gas reservoirs, we make empirical equations for the Horn River Basin, Canada, as well as TOC log using a linear relation between core-tested TOC and well log data. In addition, two empirical equations have been suggested for VS prediction based on density and gamma ray log used for TOC analysis. By applying the empirical equations proposed from the perspective of BI and TOC to another well log data and then comparing predicted VS log with real VS log, the validity of empirical equations suggested in this paper has been tested.
Collett, Timothy S.; Lee, Wyung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.
2012-01-01
One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during this expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There has been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: From using electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells to where wireline and advanced logging-while-drilling tools are routinely used to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain supported (isotropic) systems such as sand reservoirs, but more advanced log analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report which includes downhole logs obtained from all seven wells drilled during this expedition with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each of the wells. Also presented and reviewed in this report are the gas-hydrate saturation and sediment porosity logs for each of the wells as calculated from available downhole well logs.
ERIC Educational Resources Information Center
Almond, Russell; Deane, Paul; Quinlan, Thomas; Wagner, Michael; Sydorenko, Tetyana
2012-01-01
The Fall 2007 and Spring 2008 pilot tests for the "CBAL"™ Writing assessment included experimental keystroke logging capabilities. This report documents the approaches used to capture the keystroke logs and the algorithms used to process the outputs. It also includes some preliminary findings based on the pilot data. In particular, it…
Adjusting Quality index Log Values to Represent Local and Regional Commercial Sawlog Product Values
Orris D. McCauley; Joseph J. Mendel; Joseph J. Mendel
1969-01-01
The primary purpose of this paper is not only to report the results of a comparative analysis as to how well the Q.I. method predicts log product values when compared to commercial sawmill log output values, but also to develop a methodology which will facilitate the comparison and provide the adjustments needed by the sawmill operator.
British Columbia log export policy: historical review and analysis.
Craig W. Shinn
1993-01-01
Log exports have been restricted in British Columbia for over 100 years. The intent of the restriction is to use the timber in British Columbia to encourage development of forest industry, employment, and well-being in the Province. Logs have been exempted from the within-Province manufacturing rule at various times, in varying amounts, for different reasons, and by...
Make Log Yield Analysis Part of Your Daily Routine
Jan Wiedenbeck; Jeff Palmer; Robert Mayer
2006-01-01
You haven't been conducting regular log yield studies because you don't have extra people to assign to the task. Besides, you've been around sawmills your whole life and have an innate sense of how your logs are yielding relative to the price you paid for them. Right? At the USDA Forest Service's hardwood marketing and utilization research lab in...
Auditory Environment across the Life Span of Cochlear Implant Users: Insights from Data Logging
ERIC Educational Resources Information Center
Busch, Tobias; Vanpoucke, Filiep; van Wieringen, Astrid
2017-01-01
Purpose: We describe the natural auditory environment of people with cochlear implants (CIs), how it changes across the life span, and how it varies between individuals. Method: We performed a retrospective cross-sectional analysis of Cochlear Nucleus 6 CI sound-processor data logs. The logs were obtained from 1,501 people with CIs (ages 0-96…
When Smart People Fail: An Analysis of the Transaction Log of an Online Public Access Catalog.
ERIC Educational Resources Information Center
Peters, Thomas A.
1989-01-01
Describes a low cost study of the transaction logs of an online catalog at an academic library that examined failure rates, usage patterns, and probable causes of patron problems. The implications of the findings for bibliographic instruction and collection development are discussed and the benefits of analyzing transaction logs are identified.…
Nonuniversality of the Archie exponent due to multifractality of resistivity well logs
NASA Astrophysics Data System (ADS)
Dashtian, Hassan; Yang, Yafan; Sahimi, Muhammad
2015-12-01
Archie's law expresses a relation between the formation factor F of porous media and their porosity ϕ, F∝ϕ-m, where m is the Archie or the cementation exponent. Despite widespread use of Archie's law, the value of m and whether it is universal and independent of the type of reservoir have remained controversial. We analyze various porosity and resistivity logs along 36 wells in six Iranian oil and gas reservoirs using wavelet transform coherence and multifractal detrended fluctuation analysis. m is estimated for two sets of data: one set contains the resistivity data that include those segments of the well that contain significant clay content and one without. The analysis indicates that the well logs are multifractal and that due to the multifractality the exponent m is nonuniversal. Thus, analysis of the resistivity of laboratory or outcrop samples that are not multifractal yields estimates of m that are not applicable to well logs in oil or gas reservoirs.
Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.
2007-01-01
In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2009-01-01
From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
ELSA: An integrated, semi-automated nebular abundance package
NASA Astrophysics Data System (ADS)
Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.
We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloading and detailed documentation for all aspects of ELSA are available at the following URL:
Birds of a Feather: Supporting Secure Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braswell III, H V
2006-04-24
Over the past few years Lawrence Livermore National Laboratory has begun the process of moving to a diskless environment in the Secure Computer Support realm. This movement has included many moving targets and increasing support complexity. We would like to set up a forum for Security and Support professionals to get together from across the Complex and discuss current deployments, lessons learned, and next steps. This would include what hardware, software, and hard copy based solutions are being used to manage Secure Computing. The topics to be discussed include but are not limited to: Diskless computing, port locking and management,more » PC, Mac, and Linux/UNIX support and setup, system imaging, security setup documentation and templates, security documentation and management, customer tracking, ticket tracking, software download and management, log management, backup/disaster recovery, and mixed media environments.« less
The smartphone as a platform for wearable cameras in health research.
Gurrin, Cathal; Qiu, Zhengwei; Hughes, Mark; Caprani, Niamh; Doherty, Aiden R; Hodges, Steve E; Smeaton, Alan F
2013-03-01
The Microsoft SenseCam, a small camera that is worn on the chest via a lanyard, increasingly is being deployed in health research. However, the SenseCam and other wearable cameras are not yet in widespread use because of a variety of factors. It is proposed that the ubiquitous smartphones can provide a more accessible alternative to SenseCam and similar devices. To perform an initial evaluation of the potential of smartphones to become an alternative to a wearable camera such as the SenseCam. In 2012, adults were supplied with a smartphone, which they wore on a lanyard, that ran life-logging software. Participants wore the smartphone for up to 1 day and the resulting life-log data were both manually annotated and automatically analyzed for the presence of visual concepts. The results were compared to prior work using the SenseCam. In total, 166,000 smartphone photos were gathered from 47 individuals, along with associated sensor readings. The average time spent wearing the device across all users was 5 hours 39 minutes (SD=4 hours 11 minutes). A subset of 36,698 photos was selected for manual annotation by five researchers. Software analysis of these photos supports the automatic identification of activities to a similar level of accuracy as for SenseCam images in a previous study. Many aspects of the functionality of a SenseCam largely can be replicated, and in some cases enhanced, by the ubiquitous smartphone platform. This makes smartphones good candidates for a new generation of wearable sensing devices in health research, because of their widespread use across many populations. It is envisioned that smartphones will provide a compelling alternative to the dedicated SenseCam hardware for a number of users and application areas. This will be achieved by integrating new types of sensor data, leveraging the smartphone's real-time connectivity and rich user interface, and providing support for a range of relatively sophisticated applications. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Manfredini, A F; Malagoni, A M; Litmanen, H; Zhukovskaja, L; Jeannier, P; Dal Follo, D; Felisatti, M; Besseberg, A; Geistlinger, M; Bayer, P; Carrabre, J E
2011-03-01
Substances and methods used to increase oxygen blood transport and physical performance can be detected in the blood, but the screening of the athletes to be tested remains a critical issue for the International Federations. This project, AR.I.E.T.T.A., aimed to develop a software capable of analysing athletes' hematological and performance profiles to detect abnormal patterns. One-hundred eighty athletes belonging to the International Biathlon Union gave written informed consent to have their hematological data, previously collected according to anti-doping rules, used to develop the AR.I.E.T.T.A. software. Software was developed with the included sections: 1) log-in; 2) data-entry: where data are loaded, stored and grouped; 3) analysis: where data are analysed, validated scores are calculated, and parameters are simultaneously displayed as statistics, tables and graphs, and individual or subpopulation profiles; 4) screening: where an immediate evaluation of the risk score of the present sample and/or the athlete under study is obtained. The sample risk score or AR.I.E.T.T.A. score is calculated by a simple computational system combining different parameters (absolute values and intra-individual variations) considered concurrently. The AR.I.E.T.T.A. score is obtained by the sum of the deviation units derived from each parameter, considering the shift of the present value from the reference values, based on the number of standard deviations. AR.I.E.T.T.A. enables a quick evaluation of blood results assisting surveillance programs and perform timely target testing controls on athletes by the International Federations. Future studies aiming to validate the AR.I.E.T.T.A. score and improve the diagnostic accuracy will improve the system.
Guhathakurta, Debarpan; Dutta, Anirban
2016-01-01
Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about -15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization.
Guhathakurta, Debarpan; Dutta, Anirban
2016-01-01
Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about −15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization. PMID:27378836
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.
Gandler, W; Shapiro, H
1990-01-01
Logarithmic amplifiers (log amps), which produce an output signal proportional to the logarithm of the input signal, are widely used in cytometry for measurements of parameters that vary over a wide dynamic range, e.g., cell surface immunofluorescence. Existing log amp circuits all deviate to some extent from ideal performance with respect to dynamic range and fidelity to the logarithmic curve; accuracy in quantitative analysis using log amps therefore requires that log amps be individually calibrated. However, accuracy and precision may be limited by photon statistics and system noise when very low level input signals are encountered.
A Spreadsheet for a 2 x 3 x 2 Log-Linear Analysis. AIR 1991 Annual Forum Paper.
ERIC Educational Resources Information Center
Saupe, Joe L.
This paper describes a personal computer spreadsheet set up to carry out hierarchical log-linear analyses, a type of analysis useful for institutional research into multidimensional frequency tables formed from categorical variables such as faculty rank, student class level, gender, or retention status. The spreadsheet provides a concrete vehicle…
Users' Perceptions of the Web As Revealed by Transaction Log Analysis.
ERIC Educational Resources Information Center
Moukdad, Haidar; Large, Andrew
2001-01-01
Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…
Catching errors with patient-specific pretreatment machine log file analysis.
Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa
2013-01-01
A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator machine (MLC) log files are called Dynalog by Varian. Using an in-house developed computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files that are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT deliveries. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analyses QA were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; the origins of these are discussed herein. It has been demonstrated that the machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica
NASA Astrophysics Data System (ADS)
Singh, Upendra K.
2011-12-01
The analysis of well logging data plays key role in the exploration and development of hydrocarbon reservoirs. Various well log parameters such as porosity, gamma ray, density, transit time and resistivity, help in classification of strata and estimation of the physical, electrical and acoustical properties of the subsurface lithology. Strong and conspicuous changes in some of the log parameters associated with any particular geological stratigraphy formation are function of its composition, physical properties that help in classification. However some substrata show moderate values in respective log parameters and make difficult to identify the kind of strata, if we go by the standard variability ranges of any log parameters and visual inspection. The complexity increases further with more number of sensors involved. An attempt is made to identify the kinds of stratigraphy from well logs over Prydz bay basin, East Antarctica using fuzzy inference system. A model is built based on few data sets of known stratigraphy and further the network model is used as test model to infer the lithology of a borehole from their geophysical logs, not used in simulation. Initially the fuzzy based algorithm is trained, validated and tested on well log data and finally identifies the formation lithology of a hydrocarbon reservoir system of study area. The effectiveness of this technique is demonstrated by the analysis of the results for actual lithologs and coring data of ODP Leg 188. The fuzzy results show that the training performance equals to 82.95% while the prediction ability is 87.69%. The fuzzy results are very encouraging and the model is able to decipher even thin layer seams and other strata from geophysical logs. The result provides the significant sand formation of depth range 316.0- 341.0 m, where core recovery is incomplete.
Development of 6-DOF painting robot control system
NASA Astrophysics Data System (ADS)
Huang, Junbiao; Liu, Jianqun; Gao, Weiqiang
2017-01-01
With the development of society, the spraying technology of manufacturing industry in China has changed from the manual operation to the 6-DOF (Degree Of Freedom)robot automatic spraying. Spraying painting robot can not only complete the work which does harm to human being, but also improve the production efficiency and save labor costs. Control system is the most critical part of the 6-DOF robots, however, there is still a lack of relevant technology research in China. It is very necessary to study a kind of control system of 6-DOF spraying painting robots which is easy to operation, and has high efficiency and stable performance. With Googol controller platform, this paper develops programs based on Windows CE embedded systems to control the robot to finish the painting work. Software development is the core of the robot control system, including the direct teaching module, playback module, motion control module, setting module, man-machine interface, alarm module, log module, etc. All the development work of the entire software system has been completed, and it has been verified that the entire software works steady and efficient.
NASA Astrophysics Data System (ADS)
Wallis, Eric; Griffin, Todd M.; Popkie, Norm, Jr.; Eagan, Michael A.; McAtee, Robert F.; Vrazel, Danet; McKinly, Jim
2005-05-01
Ion Mobility Spectroscopy (IMS) is the most widespread detection technique in use by the military for the detection of chemical warfare agents, explosives, and other threat agents. Moreover, its role in homeland security and force protection has expanded due, in part, to its good sensitivity, low power, lightweight, and reasonable cost. With the increased use of IMS systems as continuous monitors, it becomes necessary to develop tools and methodologies to ensure optimal performance over a wide range of conditions and extended periods of time. Namely, instrument calibration is needed to ensure proper sensitivity and to correct for matrix or environmental effects. We have developed methodologies to deal with the semi-quantitative nature of IMS and allow us to generate response curves that allow a gauge of instrument performance and maintenance requirements. This instrumentation communicates to the IMS systems via a software interface that was developed in-house. The software measures system response, logs information to a database, and generates the response curves. This paper will discuss the instrumentation, software, data collected, and initial results from fielded systems.
Comparison of MWD and wireline applications and decision criteria, Malay Basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zainun, K.; Redzuan, M.; Said, M.
1994-07-01
Since 1987, usage of measurement while drilling (MWD) technology within Esso Production Malaysia Inc. (EPMI) has evolved from an auxiliary directional drilling service to providing a reliable alternative to wireline logs for formation evaluation and well-completion purposes. The shift in EPMI's attitude toward the use of MWD in formation evaluation is attributed to the availability of a complete suite of logging services for the log analysis procedure, accuracy of the data, sufficient control in reservoir quality, and continuity in fields where there are already a high density of wireline-logged wells, increasing number of high angle and horizontal wells being drilled,more » a favorable track record, and realized economic benefits. The in-house analysis procedure, (EPMILOG[sup 6]), requires the availability of a deep and/or shallow investigating resistivity, formation density, neutron porosity, and gamma ray tools for a complete analysis. The availability of these services in MWD and also comparative evaluations of MWD responses with their correlative wireline counterparts show that MWD technology can be used, to a large extent, to complement or replace routine wireline logging services. MWD resistivity measurements are frequently observed to be less effected by mud filtrate invasion than the correlative wireline measurements and are, therefore, closer to the true resistivity of the formation. MWD formation evaluation services are most widely used in fields where there are already a high density of wells that were logged using wireline. The MWD data is used to decide perforation depths and intervals.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watney, W. Lynn
2014-09-30
1. Drilled, cored, and logged three wells to the basement and collecting more than 2,700 ft of conventional core; obtained 20 m i2 of multicomponent 3D seismic imaging and merged and reprocessed more than 125 mi 2 of existing 3D seismic data for use in modeling CO 2- EOR oil recovery and CO 2 storage in five oil fields in southern Kansas. 2. Determined the technical feasibility of injecting and sequestering CO 2 in a set of four depleted oil reservoirs in the Cutter, Pleasant Prairie South, Eubank, and Shuck fields in southwest Kansas; of concurrently recovering oil from thosemore » fields; and of quantifying the volumes of CO2 sequestered and oil recovered during the process. 3. Formed a consortium of six oil operating companies, five of which own and operate the four fields. The consortium became part of the Southwest Kansas CO 2-EOR Initiative for the purpose of sharing data, knowledge, and interest in understanding the potential for CO 2-EOR in Kansas. 4. Built a regional well database covering 30,000 mi 2 and containing stratigraphic tops from ~90,000 wells; correlated 30 major stratigraphic horizons; digitized key wells, including wireline logs and sample logs; and analyzed more than 3,000 drill stem tests to establish that fluid levels in deep aquifers below the Permian evaporites are not connected to the surface and therefore pressures are not hydrostatic. Connectivity with the surface aquifers is lacking because shale aquitards and impermeable evaporite layers consist of both halite and anhydrite. 5. Developed extensive web applications and an interactive mapping system that do the following: a. Facilitate access to a wide array of data obtained in the study, including core descriptions and analyses, sample logs, digital (LAS) well logs, seismic data, gravity and magnetics maps, structural and stratigraphic maps, inferred fault traces, earthquakes, Class I and II disposal wells, and surface lineaments. b. Provide real-time analysis of the project dataset, including automated integration and viewing of well logs, core, core analyses, brine chemistry, and stratigraphy using the Java Profile app. A cross-section app allows for the display of log data for up to four wells at a time. 6. Integrated interpretations from the project’s interactive web-based mapping system to gain insights to aid in assessing the efficacy of geologic CO 2 storage in Kansas and insights toward understanding recent seismicity to aid in evaluating induced vs. naturally occurring earthquakes. 7. Developed a digital type-log system, including web-based software to modify and refine stratigraphic nomenclature to provide stakeholders a common means for communication about the subsurface. 8. Contracted use of a nuclear magnetic resonance (NMR) log and ran it slowly to capture response and characterize larger pores common for carbonate reservoirs. Used NMR to extend core analyses to apply permeability, relative permeability to CO 2, and capillary pressure to the major rock types, each uniquely expressed as a reservoir quality index (RQI), present in the Mississippian and Arbuckle rocks. 9. Characterized and evaluated the possible role of microbes in dense brines. Used microbes to compliment H/O stable isotopes to fingerprint brine systems. Used perforation/swabbing to obtain samples from multiple hydrostratigraphic units and confirmed equivalent results using less expensive drill stem tests (DST). 10. Used an integrated approach from whole core, logs, tests, and seismic to verify and quantify properties of vuggy, brecciated, and fractured carbonate intervals. 11. Used complex geocellular static and dynamic models to evaluate regional storage capacity using large parallel processing. 12. Carbonates are complex reservoirs and CO 2-EOR needs to move to the next generation to increase effectiveness of CO 2 and efficiency and safety of the injection.« less
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-11-13
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole which is sent to the surface for analysis via wireline and latching tool The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without at pipe trip, providing both time saving and unique scientific advantages.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-01-30
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole which is sent to the surface for analysis via wireline and latching tool The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without at pipe trip, providing both time saving and unique scientific advantages.
"Geo-statistics methods and neural networks in geophysical applications: A case study"
NASA Astrophysics Data System (ADS)
Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.
2008-12-01
The study is focus in the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated from application of multiattribute and neural networks techniques, which combine geophysics logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective of this study is to derive a set of attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-square minimization. In the nonlinear mode, a neural network is trained using the select attributes as inputs. In this case we used a probabilistic neural network PNN. The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case shown a continues improvement in the prediction of the porosity from the multiattribute to the neural network analysis. The improvement is in the training and the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.
Lumber grades from old-growth Douglas-fir sawmill logs.
E.E. Matson
1956-01-01
The Pacific Northwest Forest and Range Experiment Station has made in recent years several studies to determine the grades of lumber that could be expected from various sizes and grades of logs. Data on No. 2 and No. 3 Sawmill logs obtained in four Oregon old-growth Douglas-fir studies have been analyzed and combined in this report. Included in this analysis are data...
Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt; Philip A. Araman
2005-01-01
This paper describes recent progress in the analysis of computed tomography (CT) images of hardwood logs. The long-term goal of the work is to develop a system that is capable of autonomous (or semiautonomous) detection of internal defects, so that log breakdown decisions can be optimized based on defect locations. The problem is difficult because wood exhibits large...
van den Berg, Thomas J T P; Franssen, Luuk; Kruijt, Bastiaan; Coppens, Joris E
2011-08-01
The current paper describes the design and population testing of a flicker sensitivity assessment technique corresponding to the psychophysical approach for straylight measurement. The purpose is twofold: to check the subjects' capability to perform the straylight test and as a test for retinal integrity for other purposes. The test was implemented in the Oculus C-Quant straylight meter, using homemade software (MATLAB). The geometry of the visual field lay-out was identical, as was the subjects' 2AFC task. A comparable reliability criterion ("unc") was developed. Outcome measure was logTCS (temporal contrast sensitivity). The population test was performed in science fair settings on about 400 subjects. Moreover, 2 subjects underwent extensive tests to check whether optical defects, mimicked with trial lenses and scatter filters, affected the TCS outcome. Repeated measures standard deviation was 0.11 log units for the reference population. Normal values for logTCS were around 2 (threshold 1%) with some dependence on age (range 6 to 85 years). The test outcome did not change upon a tenfold (optical) deterioration in visual acuity or straylight. The test has adequate precision for checking a subject's capability to perform straylight assessment. The unc reliability criterion ensures sufficient precision, also for assessment of retinal sensitivity loss.
Position Paper - pFLogger: The Parallel Fortran Logging framework for HPC Applications
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or logger) similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
POSITION PAPER - pFLogger: The Parallel Fortran Logging Framework for HPC Applications
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Cruz, Carlos A.
2017-01-01
In the context of high performance computing (HPC), software investments in support of text-based diagnostics, which monitor a running application, are typically limited compared to those for other types of IO. Examples of such diagnostics include reiteration of configuration parameters, progress indicators, simple metrics (e.g., mass conservation, convergence of solvers, etc.), and timers. To some degree, this difference in priority is justifiable as other forms of output are the primary products of a scientific model and, due to their large data volume, much more likely to be a significant performance concern. In contrast, text-based diagnostic content is generally not shared beyond the individual or group running an application and is most often used to troubleshoot when something goes wrong. We suggest that a more systematic approach enabled by a logging facility (or 'logger') similar to those routinely used by many communities would provide significant value to complex scientific applications. In the context of high-performance computing, an appropriate logger would provide specialized support for distributed and shared-memory parallelism and have low performance overhead. In this paper, we present our prototype implementation of pFlogger - a parallel Fortran-based logging framework, and assess its suitability for use in a complex scientific application.
Determining the spatial altitude of the hydraulic fractures.
NASA Astrophysics Data System (ADS)
Khamiev, Marsel; Kosarev, Victor; Goncharova, Galina
2016-04-01
Mathematical modeling and numerical simulation are the most widely used approaches for the solving geological problems. They imply software tools which are based on Monte Carlo method. The results of this project presents shows the possibility of using PNL tool to determine fracturing location. The modeled media is a homogeneous rock (limestone) cut by a vertical borehole (d=216 mm) with metal casing 9 mm thick. The cement sheath is 35 mm thick. The borehole is filled with fresh water. The rock mass is cut by crack, filled with a mixture of doped (gadolinium oxide Gd2O3) proppant (75%) and water (25%). A pulse neutron logging (PNL) tool is used for quality control in hydraulic fracturing operations. It includes a fast neutron source (so-called "neutron generator") and a set of thermal (or epithermal) neutron-sensing devices, forming the so-called near (ND) and far (FD) detectors. To evaluate neutron properties various segments (sectors) of the rock mass, the detector must register only neutrons that come from this very formation. It's possible if detecting block includes some (6 for example) thermal neutron detectors arranged circumferentially inside the tool. As a result we get few independent well logs, each accords with define rock sector. Afterwards synthetic logs processing we can determine spatial position of the hydraulic fracture.
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
NASA Technical Reports Server (NTRS)
Lange, R. Connor
2012-01-01
Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities,and destination, they all have something in common. All of the components of these spacecraft had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time consuming process. Thankfully,since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively and can validate SMAP FSW modules quicker and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering this automated tools use in future FSW validation efforts.
Real-Time Multimission Event Notification System for Mars Relay
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.
2013-01-01
As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.
NASA Astrophysics Data System (ADS)
Melton, R.; Thomas, J.
With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled, command-and-control software architecture which provides several unique capabilities to move the current legacy SSA software paradigm to an open source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of the both ground and on-orbit assets simple and straightforward.
NASA Astrophysics Data System (ADS)
Tunc, Suleyman; Tunc, Berna; Caka, Deniz; Baris, Serif
2016-04-01
Locating and calculating size of the seismic events is quickly one of the most important and challenging issue in especially real time seismology. In this study, we developed a Matlab application to locate seismic events and calculate their magnitudes (Local Magnitude and empirical Moment Magnitude) using single station called SSL_Calc. This newly developed sSoftware has been tested on the all stations of the Marsite project "New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite-MARsite". SSL_Calc algorithm is suitable both for velocity and acceleration sensors. Data has to be in GCF (Güralp Compressed Format). Online or offline data can be selected in SCREAM software (belongs to Guralp Systems Limited) and transferred to SSL_Calc. To locate event P and S wave picks have to be marked by using SSL_Calc window manually. During magnitude calculation, instrument correction has been removed and converted to real displacement in millimeter. Then the displacement data is converted to Wood Anderson Seismometer output by using; Z=[0;0]; P=[-6.28+4.71j; -6.28-4.71j]; A0=[2080] parameters. For Local Magnitude calculation,; maximum displacement amplitude (A) and distance (dist) are used in formula (1) for distances up to 200km and formula (2) for more than 200km. ML=log10(A)-(-1.118-0.0647*dist+0.00071*dist2-3.39E-6*dist3+5.71e-9*dist4) (1) ML=log10(A)+(2.1173+0.0082*dist-0.0000059628*dist2) (2) Following Local Magnitude calculation, the programcode calculates two empiric Moment Magnitudes using formulas (3) Akkar et al. (2010) and (4) Ulusay et al. (2004). Mw=0.953* ML+0.422 (3) Mw=0.7768* ML+1.5921 (4) SSL_Calc is a software that is easy to implement and user friendly and offers practical solution to individual users to location of event and ML, Mw calculation.
Time trends in recurrence of juvenile nasopharyngeal angiofibroma: Experience of the past 4 decades.
Mishra, Anupam; Mishra, Subhash Chandra
2016-01-01
An analysis of time distribution of juvenile nasopharyngeal angiofibroma (JNA) from the last 4 decades is presented. Sixty recurrences were analyzed as per actuarial survival. SPSS software was used to generate Kaplan-Meier (KM) curves and time distributions were compared by Log-rank, Breslow and Tarone-Ware test. The overall recurrence rate was 17.59%. Majority underwent open transpalatal approach(es) without embolization. The probability of detecting a recurrence was 95% in first 24months and comparison of KM curves of 4 different time periods was not significant. This is the first and largest series to address the time-distribution. The required follow up period is 2years. Our recurrence is just half of the largest series (reported so far) suggesting the superiority of transpalatal techniques. The similarity of curves suggests less likelihood for recent technical advances to influence the recurrence that as per our hypothesis is more likely to reflect tumor biology per se. Copyright © 2016 Elsevier Inc. All rights reserved.
Automatically assisting human memory: a SenseCam browser.
Doherty, Aiden R; Moulin, Chris J A; Smeaton, Alan F
2011-10-01
SenseCams have many potential applications as tools for lifelogging, including the possibility of use as a memory rehabilitation tool. Given that a SenseCam can log hundreds of thousands of images per year, it is critical that these be presented to the viewer in a manner that supports the aims of memory rehabilitation. In this article we report a software browser constructed with the aim of using the characteristics of memory to organise SenseCam images into a form that makes the wealth of information stored on SenseCam more accessible. To enable a large amount of visual information to be easily and quickly assimilated by a user, we apply a series of automatic content analysis techniques to structure the images into "events", suggest their relative importance, and select representative images for each. This minimises effort when browsing and searching. We provide anecdotes on use of such a system and emphasise the need for SenseCam images to be meaningfully sorted using such a browser.
An object-oriented approach to data display and storage: 3 years experience, 25,000 cases.
Sainsbury, D A
1993-11-01
Object-oriented programming techniques were used to develop computer based data display and storage systems. These have been operating in the 8 anaesthetising areas of the Adelaide Children's Hospital for 3 years. The analogue and serial outputs from an array of patient monitors are connected to IBM compatible PC-XT computers. The information is displayed on a colour screen as wave-form and trend graphs and digital format in 'real time'. The trend data is printed simultaneously on a dot matrix printer. This data is also stored for 24 hours on 'hard' disk. The major benefit has been the provision of a single visual focus for all monitored variables. The automatic logging of data has been invaluable in the analysis of critical incidents. The systems were made possible by recent, rapid improvements in computer hardware and software. This paper traces the development of the program and demonstrates the advantages of object-oriented programming techniques.
Competing regression models for longitudinal data.
Alencar, Airlane P; Singer, Julio M; Rocha, Francisco Marcelo M
2012-03-01
The choice of an appropriate family of linear models for the analysis of longitudinal data is often a matter of concern for practitioners. To attenuate such difficulties, we discuss some issues that emerge when analyzing this type of data via a practical example involving pretest-posttest longitudinal data. In particular, we consider log-normal linear mixed models (LNLMM), generalized linear mixed models (GLMM), and models based on generalized estimating equations (GEE). We show how some special features of the data, like a nonconstant coefficient of variation, may be handled in the three approaches and evaluate their performance with respect to the magnitude of standard errors of interpretable and comparable parameters. We also show how different diagnostic tools may be employed to identify outliers and comment on available software. We conclude by noting that the results are similar, but that GEE-based models may be preferable when the goal is to compare the marginal expected responses. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Log Logistic Survival Model Applied to Hypobaric Decompression Sickness
NASA Technical Reports Server (NTRS)
Conkin, Johnny
2001-01-01
Decompression sickness (DCS) is a complex, multivariable problem. A mathematical description or model of the likelihood of DCS requires a large amount of quality research data, ideas on how to define a decompression dose using physical and physiological variables, and an appropriate analytical approach. It also requires a high-performance computer with specialized software. I have used published DCS data to develop my decompression doses, which are variants of equilibrium expressions for evolved gas plus other explanatory variables. My analytical approach is survival analysis, where the time of DCS occurrence is modeled. My conclusions can be applied to simple hypobaric decompressions - ascents lasting from 5 to 30 minutes - and, after minutes to hours, to denitrogenation (prebreathing). They are also applicable to long or short exposures, and can be used whether the sufferer of DCS is at rest or exercising at altitude. Ultimately I would like my models to be applied to astronauts to reduce the risk of DCS during spacewalks, as well as to future spaceflight crews on the Moon and Mars.
AirLab: a cloud-based platform to manage and share antibody-based single-cell research.
Catena, Raúl; Özcan, Alaz; Jacobs, Andrea; Chevrier, Stephane; Bodenmiller, Bernd
2016-06-29
Single-cell analysis technologies are essential tools in research and clinical diagnostics. These methods include flow cytometry, mass cytometry, and other microfluidics-based technologies. Most laboratories that employ these methods maintain large repositories of antibodies. These ever-growing collections of antibodies, their multiple conjugates, and the large amounts of data generated in assays using specific antibodies and conditions makes a dedicated software solution necessary. We have developed AirLab, a cloud-based tool with web and mobile interfaces, for the organization of these data. AirLab streamlines the processes of antibody purchase, organization, and storage, antibody panel creation, results logging, and antibody validation data sharing and distribution. Furthermore, AirLab enables inventory of other laboratory stocks, such as primers or clinical samples, through user-controlled customization. Thus, AirLab is a mobile-powered and flexible tool that harnesses the capabilities of mobile tools and cloud-based technology to facilitate inventory and sharing of antibody and sample collections and associated validation data.
Porcaro, Antonio B; Ghimenton, Claudio; Petrozziello, Aldo; Sava, Teodoro; Migliorini, Filippo; Romano, Mario; Caruso, Beatrice; Cocco, Claudio; Antoniolli, Stefano Zecchinini; Lacola, Vincenzo; Rubilotta, Emanuele; Monaco, Carmelo
2012-10-01
To evaluate estradiol (E(2)) physiopathology along the pituitary-testicular-prostate axis at the time of initial diagnosis of prostate cancer (PC) and subsequent cluster selection of the patient population. Records of the diagnosed (n=105) and operated (n=91) patients were retrospectively reviewed. Age, percentage of positive cores at-biopsy (P+), biopsy Gleason score (bGS), E(2), prolactin (PRL), luteinizing hormone (LH), follicle-stimulating hormone (FSH), total testosterone (TT), free-testosterone (FT), prostate-specific antigen (PSA), pathology Gleason score (pGS), estimated tumor volume in relation to percentage of prostate volume (V+), overall prostate weight (Wi), clinical stage (cT), biopsy Gleason pattern (bGP) and pathology stage (pT), were the investigated variables. None of the patients had previously undergone hormonal manipulations. E(2) correlation and prediction by multiple linear regression analysis (MLRA) was performed. At diagnosis, the log E(2)/log bGS ratio clustered the population into groups A (log E(2)/log bGS ≤ 2.25), B (2.25
SU-F-T-295: MLCs Performance and Patient-Specific IMRT QA Using Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osman, A; American University of Biuret Medical Center, Biuret; Maalej, N
2016-06-15
Purpose: To analyze the performance of the multi-leaf collimators (MLCs) from the log files recorded during the intensity modulated radiotherapy (IMRT) treatment and to construct the relative fluence maps and do the gamma analysis to compare the planned and executed MLCs movement. Methods: We developed a program to extract and analyze the data from dynamic log files (dynalog files) generated from sliding window IMRT delivery treatments. The program extracts the planned and executed (actual or delivered) MLCs movement, calculates and compares the relative planned and executed fluences. The fluence maps were used to perform the gamma analysis (with 3% dosemore » difference and 3 mm distance to agreement) for 3 IMR patients. We compared our gamma analysis results with those obtained from portal dose image prediction (PDIP) algorithm performed using the EPID. Results: For 3 different IMRT patient treatments, the maximum difference between the planned and the executed MCLs positions was 1.2 mm. The gamma analysis results of the planned and delivered fluences were in good agreement with the gamma analysis from portal dosimetry. The maximum difference for number of pixels passing the gamma criteria (3%/3mm) was 0.19% with respect to portal dosimetry results. Conclusion: MLC log files can be used to verify the performance of the MLCs. Patientspecific IMRT QA based on MLC movement log files gives similar results to EPID dosimetry results. This promising method for patient-specific IMRT QA is fast, does not require dose measurements in a phantom, can be done before the treatment and for every fraction, and significantly reduces the IMRT workload. The author would like to thank King Fahd University of petroleum and Minerals for the support.« less
NASA Astrophysics Data System (ADS)
Hsieh, Bieng-Zih; Lewis, Charles; Lin, Zsay-Shing
2005-04-01
The purpose of this study is to construct a fuzzy lithology system from well logs to identify formation lithology of a groundwater aquifer system in order to better apply conventional well logging interpretation in hydro-geologic studies because well log responses of aquifers are sometimes different from those of conventional oil and gas reservoirs. The input variables for this system are the gamma-ray log reading, the separation between the spherically focused resistivity and the deep very-enhanced resistivity curves, and the borehole compensated sonic log reading. The output variable is groundwater formation lithology. All linguistic variables are based on five linguistic terms with a trapezoidal membership function. In this study, 50 data sets are clustered into 40 training sets and 10 testing sets for constructing the fuzzy lithology system and validating the ability of system prediction, respectively. The rule-based database containing 12 fuzzy lithology rules is developed from the training data sets, and the rule strength is weighted. A Madani inference system and the bisector of area defuzzification method are used for fuzzy inference and defuzzification. The success of training performance and the prediction ability were both 90%, with the calculated correlation of training and testing equal to 0.925 and 0.928, respectively. Well logs and core data from a clastic aquifer (depths 100-198 m) in the Shui-Lin area of west-central Taiwan are used for testing the system's construction. Comparison of results from core analysis, well logging and the fuzzy lithology system indicates that even though the well logging method can easily define a permeable sand formation, distinguishing between silts and sands and determining grain size variation in sands is more subjective. These shortcomings can be improved by a fuzzy lithology system that is able to yield more objective decisions than some conventional methods of log interpretation.
Lee, Myung W.
2012-01-01
Through the use of three-dimensional seismic amplitude mapping, several gas hydrate prospects were identified in the Alaminos Canyon area of the Gulf of Mexico. Two of the prospects were drilled as part of the Gulf of Mexico Gas Hydrate Joint Industry Program Leg II in May 2009, and a suite of logging-while-drilling logs was acquired at each well site. Logging-while-drilling logs at the Alaminos Canyon 21–A site indicate that resistivities of approximately 2 ohm-meter and P-wave velocities of approximately 1.9 kilometers per second were measured in a possible gas-hydrate-bearing target sand interval between 540 and 632 feet below the sea floor. These values are slightly elevated relative to those measured in the hydrate-free sediment surrounding the sands. The initial well log analysis is inconclusive in determining the presence of gas hydrate in the logged sand interval, mainly because large washouts in the target interval degraded well log measurements. To assess gas-hydrate saturations, a method of compensating for the effect of washouts on the resistivity and acoustic velocities is required. To meet this need, a method is presented that models the washed-out portion of the borehole as a vertical layer filled with seawater (drilling fluid). Owing to the anisotropic nature of this geometry, the apparent anisotropic resistivities and velocities caused by the vertical layer are used to correct measured log values. By incorporating the conventional marine seismic data into the well log analysis of the washout-corrected well logs, the gas-hydrate saturation at well site AC21–A was estimated to be in the range of 13 percent. Because gas hydrates in the vertical fractures were observed, anisotropic rock physics models were also applied to estimate gas-hydrate saturations.
Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad
2015-01-01
Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly enhances the conduction of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English will greatly help Farsi speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ program in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.
Statistics, Structures & Satisfied Customers: Using Web Log Data to Improve Site Performance.
ERIC Educational Resources Information Center
Peacock, Darren
This paper explores some of the ways in which the National Museum of Australia is using Web analysis tools to shape its future directions in the delivery of online services. In particular, it explores the potential of quantitative analysis, based on Web server log data, to convert these ephemeral traces of user experience into a strategic…
Macfarlane, P.A.
2009-01-01
Regional aquifers in thick sequences of continentally derived heterolithic deposits, such as the High Plains of the North American Great Plains, are difficult to characterize hydrostratigraphically because of their framework complexity and the lack of high-quality subsurface information from drill cores and geophysical logs. However, using a database of carefully evaluated drillers' and sample logs and commercially available visualization software, it is possible to qualitatively characterize these complex frameworks based on the concept of relative permeability. Relative permeability is the permeable fraction of a deposit expressed as a percentage of its total thickness. In this methodology, uncemented coarse and fine sediments are arbitrarily set at relative permeabilities of 100% and 0%, respectively, with allowances made for log entries containing descriptions of mixed lithologies, heterolithic strata, and cementation. To better understand the arrangement of high- and low-permeability domains within the High Plains aquifer, a pilot study was undertaken in southwest Kansas to create three-dimensional visualizations of relative permeability using a database of >3000 logs. Aggregate relative permeability ranges up to 99% with a mean of 51%. Laterally traceable, thick domains of >80% relative permeability embedded within a lower relative permeability matrix strongly suggest that preferred pathways for lateral and vertical water transmission exist within the aquifer. Similarly, domains with relative permeabilities of <45% are traceable laterally over appreciable distances in the sub-surface and probably act as leaky confining layers. This study shows that the aquifer does not consist solely of local, randomly distributed, hydrostratigraphic units, as suggested by previous studies. ?? 2009 Geological Society of America.
VizieR Online Data Catalog: CoRoT red giants abundances (Morel+, 2014)
NASA Astrophysics Data System (ADS)
Morel, T.; Miglio, A.; Lagarde, N.; Montalban, J.; Rainer, M.; Poretti, E.; Eggenberger, P.; Hekker, S.; Kallinger, T.; Mosser, B.; Valentini, M.; Carrier, F.; Hareter, M.; Mantegazza, L.
2014-02-01
The equivalent widths were measured manually assuming Gaussian profiles or Voigt profiles for the few lines with extended damping wings. Lines with an unsatisfactory fit or significantly affected by telluric features were discarded. Only values eventually retained for the analysis are provided. For the chemical abundances, the usual notation is used: [X/Y]=[log({epsilon}(X))-log({epsilon}(Y))]star - [log({epsilon}(X))-log({epsilon}(Y))]⊙ with log{epsilon}(X)=12+log[N(X)/N(H)] (N is the number density of the species). For lithium, the following notation is used: [Li/H]=log(N(Li))star-log(N(Li))⊙. The adopted solar abundances are taken from Grevesse & Sauval (1998SSRv...85..161G), except for Li for which we adopt our derived values: log({epsilon}(Li))⊙=1.09 and 1.13 in LTE and NLTE, respectively (see text). All the abundances are computed under the assumption of LTE, except Li for which values corrected for departures from LTE using the data of Lind et al. (2009A&A...503..541L) are also provided. All the quoted error bars are 1-sigma uncertainties. (6 data files).
Paillet, Frederick L.; Hodges, Richard E.; Corland, Barbara S.
2002-01-01
This report presents and describes geophysical logs for six boreholes in Lariat Gulch, a topographic gulch at the former U.S. Air Force site PJKS in Jefferson County near Denver, Colorado. Geophysical logs include gamma, normal resistivity, fluid-column temperature and resistivity, caliper, televiewer, and heat-pulse flowmeter. These logs were run in two boreholes penetrating only the Fountain Formation of Pennsylvanian and Permian age (logged to depths of about 65 and 570 feet) and in four boreholes (logged to depths of about 342 to 742 feet) penetrating mostly the Fountain Formation and terminating in Precambrian crystalline rock, which underlies the Fountain Formation. Data from the logs were used to identify fractures and bedding planes and to locate the contact between the two formations. The logs indicated few fractures in the boreholes and gave no indication of higher transmissivity in the contact zone between the two formations. Transmissivities for all fractures in each borehole were estimated to be less than 2 feet squared per day.
Extracting the Textual and Temporal Structure of Supercomputing Logs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, S; Singh, I; Chandra, A
2009-05-26
Supercomputers are prone to frequent faults that adversely affect their performance, reliability and functionality. System logs collected on these systems are a valuable resource of information about their operational status and health. However, their massive size, complexity, and lack of standard format makes it difficult to automatically extract information that can be used to improve system management. In this work we propose a novel method to succinctly represent the contents of supercomputing logs, by using textual clustering to automatically find the syntactic structures of log messages. This information is used to automatically classify messages into semantic groups via an onlinemore » clustering algorithm. Further, we describe a methodology for using the temporal proximity between groups of log messages to identify correlated events in the system. We apply our proposed methods to two large, publicly available supercomputing logs and show that our technique features nearly perfect accuracy for online log-classification and extracts meaningful structural and temporal message patterns that can be used to improve the accuracy of other log analysis techniques.« less
Wood, Fred B; Benson, Dennis; LaCroix, Eve-Marie; Siegel, Elliot R; Fariss, Susan
2005-07-01
The transition to a largely Internet and Web-based environment for dissemination of health information has changed the health information landscape and the framework for evaluation of such activities. A multidimensional evaluative approach is needed. This paper discusses one important dimension of Web evaluation-usage data. In particular, we discuss the collection and analysis of external data on website usage in order to develop a better understanding of the health information (and related US government information) market space, and to estimate the market share or relative levels of usage for National Library of Medicine (NLM) and National Institutes of Health (NIH) websites compared to other health information providers. The primary method presented is Internet audience measurement based on Web usage by external panels of users and assembled by private vendors-in this case, comScore. A secondary method discussed is Web usage based on Web log software data. The principle metrics for both methods are unique visitors and total pages downloaded per month. NLM websites (primarily MedlinePlus and PubMed) account for 55% to 80% of total NIH website usage depending on the metric used. In turn, NIH.gov top-level domain usage (inclusive of NLM) ranks second only behind WebMD in the US domestic home health information market and ranks first on a global basis. NIH.gov consistently ranks among the top three or four US government top-level domains based on global Web usage. On a site-specific basis, the top health information websites in terms of global usage appear to be WebMD, MSN Health, PubMed, Yahoo! Health, AOL Health, and MedlinePlus. Based on MedlinePlus Web log data and external Internet audience measurement data, the three most heavily used cancer-centric websites appear to be www.cancer.gov (National Cancer Institute), www.cancer.org (American Cancer Society), and www.breastcancer.org (non-profit organization). Internet audience measurement has proven useful to NLM, with significant advantages compared to sole reliance on usage data from Web log software. Internet audience data has helped NLM better understand the relative usage of NLM and NIH websites in the intersection of the health information and US government information market sectors, which is the primary market intersector for NLM and NIH. However important, Web usage is only one dimension of a complete Web evaluation framework, and other primary research methods, such as online user surveys, usability tests, and focus groups, are also important for comprehensive evaluation that includes qualitative elements, such as user satisfaction and user friendliness, as well as quantitative indicators of website usage.
Benson, Dennis; LaCroix, Eve-Marie; Siegel, Elliot R; Fariss, Susan
2005-01-01
Background The transition to a largely Internet and Web-based environment for dissemination of health information has changed the health information landscape and the framework for evaluation of such activities. A multidimensional evaluative approach is needed. Objective This paper discusses one important dimension of Web evaluation—usage data. In particular, we discuss the collection and analysis of external data on website usage in order to develop a better understanding of the health information (and related US government information) market space, and to estimate the market share or relative levels of usage for National Library of Medicine (NLM) and National Institutes of Health (NIH) websites compared to other health information providers. Methods The primary method presented is Internet audience measurement based on Web usage by external panels of users and assembled by private vendors—in this case, comScore. A secondary method discussed is Web usage based on Web log software data. The principle metrics for both methods are unique visitors and total pages downloaded per month. Results NLM websites (primarily MedlinePlus and PubMed) account for 55% to 80% of total NIH website usage depending on the metric used. In turn, NIH.gov top-level domain usage (inclusive of NLM) ranks second only behind WebMD in the US domestic home health information market and ranks first on a global basis. NIH.gov consistently ranks among the top three or four US government top-level domains based on global Web usage. On a site-specific basis, the top health information websites in terms of global usage appear to be WebMD, MSN Health, PubMed, Yahoo! Health, AOL Health, and MedlinePlus. Based on MedlinePlus Web log data and external Internet audience measurement data, the three most heavily used cancer-centric websites appear to be www.cancer.gov (National Cancer Institute), www.cancer.org (American Cancer Society), and www.breastcancer.org (non-profit organization). Conclusions Internet audience measurement has proven useful to NLM, with significant advantages compared to sole reliance on usage data from Web log software. Internet audience data has helped NLM better understand the relative usage of NLM and NIH websites in the intersection of the health information and US government information market sectors, which is the primary market intersector for NLM and NIH. However important, Web usage is only one dimension of a complete Web evaluation framework, and other primary research methods, such as online user surveys, usability tests, and focus groups, are also important for comprehensive evaluation that includes qualitative elements, such as user satisfaction and user friendliness, as well as quantitative indicators of website usage. PMID:15998622
WILSON-BAPPU EFFECT: EXTENDED TO SURFACE GRAVITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Sunkyung; Kang, Wonseok; Lee, Jeong-Eun
2013-10-01
In 1957, Wilson and Bappu found a tight correlation between the stellar absolute visual magnitude (M{sub V} ) and the width of the Ca II K emission line for late-type stars. Here, we revisit the Wilson-Bappu relationship (WBR) to claim that the WBR can be an excellent indicator of stellar surface gravity of late-type stars as well as a distance indicator. We have measured the width (W) of the Ca II K emission line in high-resolution spectra of 125 late-type stars obtained with the Bohyunsan Optical Echelle Spectrograph and adopted from the Ultraviolet and Visual Echelle Spectrograph archive. Based onmore » our measurement of the emission line width (W), we have obtained a WBR of M{sub V} = 33.76 - 18.08 log W. In order to extend the WBR to being a surface gravity indicator, stellar atmospheric parameters such as effective temperature (T{sub eff}), surface gravity (log g), metallicity ([Fe/H]), and micro-turbulence ({xi}{sub tur}) have been derived from self-consistent detailed analysis using the Kurucz stellar atmospheric model and the abundance analysis code, MOOG. Using these stellar parameters and log W, we found that log g = -5.85 log W+9.97 log T{sub eff} - 23.48 for late-type stars.« less
The statistics of identifying differentially expressed genes in Expresso and TM4: a comparison
Sioson, Allan A; Mane, Shrinivasrao P; Li, Pinghua; Sha, Wei; Heath, Lenwood S; Bohnert, Hans J; Grene, Ruth
2006-01-01
Background Analysis of DNA microarray data takes as input spot intensity measurements from scanner software and returns differential expression of genes between two conditions, together with a statistical significance assessment. This process typically consists of two steps: data normalization and identification of differentially expressed genes through statistical analysis. The Expresso microarray experiment management system implements these steps with a two-stage, log-linear ANOVA mixed model technique, tailored to individual experimental designs. The complement of tools in TM4, on the other hand, is based on a number of preset design choices that limit its flexibility. In the TM4 microarray analysis suite, normalization, filter, and analysis methods form an analysis pipeline. TM4 computes integrated intensity values (IIV) from the average intensities and spot pixel counts returned by the scanner software as input to its normalization steps. By contrast, Expresso can use either IIV data or median intensity values (MIV). Here, we compare Expresso and TM4 analysis of two experiments and assess the results against qRT-PCR data. Results The Expresso analysis using MIV data consistently identifies more genes as differentially expressed, when compared to Expresso analysis with IIV data. The typical TM4 normalization and filtering pipeline corrects systematic intensity-specific bias on a per microarray basis. Subsequent statistical analysis with Expresso or a TM4 t-test can effectively identify differentially expressed genes. The best agreement with qRT-PCR data is obtained through the use of Expresso analysis and MIV data. Conclusion The results of this research are of practical value to biologists who analyze microarray data sets. The TM4 normalization and filtering pipeline corrects microarray-specific systematic bias and complements the normalization stage in Expresso analysis. The results of Expresso using MIV data have the best agreement with qRT-PCR results. In one experiment, MIV is a better choice than IIV as input to data normalization and statistical analysis methods, as it yields as greater number of statistically significant differentially expressed genes; TM4 does not support the choice of MIV input data. Overall, the more flexible and extensive statistical models of Expresso achieve more accurate analytical results, when judged by the yardstick of qRT-PCR data, in the context of an experimental design of modest complexity. PMID:16626497
Identification Method of Mud Shale Fractures Base on Wavelet Transform
NASA Astrophysics Data System (ADS)
Xia, Weixu; Lai, Fuqiang; Luo, Han
2018-01-01
In recent years, inspired by seismic analysis technology, a new method for analysing mud shale fractures oil and gas reservoirs by logging properties has emerged. By extracting the high frequency attribute of the wavelet transform in the logging attribute, the formation information hidden in the logging signal is extracted, identified the fractures that are not recognized by conventional logging and in the identified fracture segment to show the “cycle jump”, “high value”, “spike” and other response effect is more obvious. Finally formed a complete wavelet denoising method and wavelet high frequency identification fracture method.
NASA Astrophysics Data System (ADS)
Mattey, D.
2012-04-01
The concentration of CO2 in cave air is one of the main controls on the rate of degassing of dripwater and on the kinetics of calcite precipitation forming speleothem deposits. Measurements of cave air CO2reveal great complexity in the spatial distribution among interconnected cave chambers and temporal changes on synoptic to seasonal time scales. The rock of Gibraltar hosts a large number of caves distributed over a 300 meter range in altitude and monthly sampling and analysis of air and water combined with continuous logging of temperature, humidity and drip discharge rates since 2004 reveals the importance of density-driven seasonal ventilation which drives large-scale advection of CO2-rich air though the cave systems. Since 2008 we have deployed automatic CO2 monitoring systems that regularly sample cave air from up to 8 locations distributed laterally and vertically in St Michaels Cave located near the top of the rock at 275m asl and Ragged Staff Cave located in the heart of the rock near sea level. The logging system is controlled by a Campbell Scientific CR1000 programmable datalogger which controls an 8 port manifold connected to sampling lines leading to different parts of the cave over a distance of up to 250 meters. The manifold is pumped at a rate of 5l per minute drawing air through 6mm or 8mm id polythene tubing via a 1m Nafion loop to reduce humidity to local ambient conditions. The outlet of the primary pump leads to an open split which is sampled by a second low flow pump which delivers air at 100ml/minute to a Licor 820 CO2 analyser. The software selects the port to be sampled, flushes the line for 2 minutes and CO2 analysed as a set of 5 measurements averaged over 10 second intervals. The system then switches to the next port and when complete shuts down to conserve power after using 20 watts over a 30 minute period of analysis. In the absence of local mains power (eg from the show cave lighting system) two 12v car batteries will power the system for analysis at 4h intervals for about 1 month. Two logging systems sampling cave air from 13 locations over a vertical range of 275m have run continuously for up to 5 years and return a very detailed picture of cave ventilation patterns and their responses to local weather and seasonal change.
NASA Astrophysics Data System (ADS)
Mert, Bayram Ali; Dag, Ahmet
2017-12-01
In this study, firstly, a practical and educational geostatistical program (JeoStat) was developed, and then example analysis of porosity parameter distribution, using oilfield data, was presented. With this program, two or three-dimensional variogram analysis can be performed by using normal, log-normal or indicator transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the users. These theoretical models can be easily and quickly fitted to experimental models using a mouse. JeoStat uses ordinary kriging interpolation technique for computation of point or block estimate, and also uses cross-validation test techniques for validation of the fitted theoretical model. All the results obtained by the analysis as well as all the graphics such as histogram, variogram and kriging estimation maps can be saved to the hard drive, including digitised graphics and maps. As such, the numerical values of any point in the map can be monitored using a mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size free of charge. The JeoStat software package and source codes available at: http://www.jeostat.com/JeoStat_2017.0.rar.
Debugging and Performance Analysis Software Tools for Peregrine System |
High-Performance Computing | NREL Debugging and Performance Analysis Software Tools for Peregrine System Debugging and Performance Analysis Software Tools for Peregrine System Learn about debugging and performance analysis software tools available to use with the Peregrine system. Allinea
3-Dimensional Root Cause Diagnosis via Co-analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Ziming; Lan, Zhiling; Yu, Li
2012-01-01
With the growth of system size and complexity, reliability has become a major concern for large-scale systems. Upon the occurrence of failure, system administrators typically trace the events in Reliability, Availability, and Serviceability (RAS) logs for root cause diagnosis. However, RAS log only contains limited diagnosis information. Moreover, the manual processing is time-consuming, error-prone, and not scalable. To address the problem, in this paper we present an automated root cause diagnosis mechanism for large-scale HPC systems. Our mechanism examines multiple logs to provide a 3-D fine-grained root cause analysis. Here, 3-D means that our analysis will pinpoint the failure layer,more » the time, and the location of the event that causes the problem. We evaluate our mechanism by means of real logs collected from a production IBM Blue Gene/P system at Oak Ridge National Laboratory. It successfully identifies failure layer information for 219 failures during 23-month period. Furthermore, it effectively identifies the triggering events with time and location information, even when the triggering events occur hundreds of hours before the resulting failures.« less
Magnetostriction measurement by four probe method
NASA Astrophysics Data System (ADS)
Dange, S. N.; Radha, S.
2018-04-01
The present paper describes the design and setting up of an indigenouslydevelopedmagnetostriction(MS) measurement setup using four probe method atroom temperature.A standard strain gauge is pasted with a special glue on the sample and its change in resistance with applied magnetic field is measured using KeithleyNanovoltmeter and Current source. An electromagnet with field upto 1.2 tesla is used to source the magnetic field. The sample is placed between the magnet poles using self designed and developed wooden probe stand, capable of moving in three mutually perpendicular directions. The nanovoltmeter and current source are interfaced with PC using RS232 serial interface. A software has been developed in for logging and processing of data. Proper optimization of measurement has been done through software to reduce the noise due to thermal emf and electromagnetic induction. The data acquired for some standard magnetic samples are presented. The sensitivity of the setup is 1microstrain with an error in measurement upto 5%.
Solutions for acceleration measurement in vehicle crash tests
NASA Astrophysics Data System (ADS)
Dima, D. S.; Covaciu, D.
2017-10-01
Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time, during a crash test, form a crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller and a data logging unit with SD card. Data collected on card, as text files, is processed with a dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
NASA Technical Reports Server (NTRS)
Ramirez, Eric; Gutheinz, Sandy; Brison, James; Ho, Anita; Allen, James; Ceritelli, Olga; Tobar, Claudia; Nguyen, Thuykien; Crenshaw, Harrel; Santos, Roxann
2008-01-01
Supplier Management System (SMS) allows for a consistent, agency-wide performance rating system for suppliers used by NASA. This version (2.0) combines separate databases into one central database that allows for the sharing of supplier data. Information extracted from the NBS/Oracle database can be used to generate ratings. Also, supplier ratings can now be generated in the areas of cost, product quality, delivery, and audit data. Supplier data can be charted based on real-time user input. Based on these individual ratings, an overall rating can be generated. Data that normally would be stored in multiple databases, each requiring its own log-in, is now readily available and easily accessible with only one log-in required. Additionally, the database can accommodate the storage and display of quality-related data that can be analyzed and used in the supplier procurement decision-making process. Moreover, the software allows for a Closed-Loop System (supplier feedback), as well as the capability to communicate with other federal agencies.
Machine learning models for lipophilicity and their domain of applicability.
Schroeter, Timon; Schwaighofer, Anton; Mika, Sebastian; Laak, Antonius Ter; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-01-01
Unfavorable lipophilicity and water solubility cause many drug failures; therefore these properties have to be taken into account early on in lead discovery. Commercial tools for predicting lipophilicity usually have been trained on small and neutral molecules, and are thus often unable to accurately predict in-house data. Using a modern Bayesian machine learning algorithm--a Gaussian process model--this study constructs a log D7 model based on 14,556 drug discovery compounds of Bayer Schering Pharma. Performance is compared with support vector machines, decision trees, ridge regression, and four commercial tools. In a blind test on 7013 new measurements from the last months (including compounds from new projects) 81% were predicted correctly within 1 log unit, compared to only 44% achieved by commercial software. Additional evaluations using public data are presented. We consider error bars for each method (model based error bars, ensemble based, and distance based approaches), and investigate how well they quantify the domain of applicability of each model.
Gopalakrishnan, V; Baskaran, R; Venkatraman, B
2016-08-01
A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.
A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee–Pro wireless modules and PSoC controller for wireless interfacing,more » and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.« less
Scale out databases for CERN use cases
NASA Astrophysics Data System (ADS)
Baranowski, Zbigniew; Grzybek, Maciej; Canali, Luca; Lanza Garcia, Daniel; Surdy, Kacper
2015-12-01
Data generation rates are expected to grow very fast for some database workloads going into LHC run 2 and beyond. In particular this is expected for data coming from controls, logging and monitoring systems. Storing, administering and accessing big data sets in a relational database system can quickly become a very hard technical challenge, as the size of the active data set and the number of concurrent users increase. Scale-out database technologies are a rapidly developing set of solutions for deploying and managing very large data warehouses on commodity hardware and with open source software. In this paper we will describe the architecture and tests on database systems based on Hadoop and the Cloudera Impala engine. We will discuss the results of our tests, including tests of data loading and integration with existing data sources and in particular with relational databases. We will report on query performance tests done with various data sets of interest at CERN, notably data from the accelerator log database.
Fast and Accurate Simulation Technique for Large Irregular Arrays
NASA Astrophysics Data System (ADS)
Bui-Van, Ha; Abraham, Jens; Arts, Michel; Gueuning, Quentin; Raucy, Christopher; Gonzalez-Ovejero, David; de Lera Acedo, Eloy; Craeye, Christophe
2018-04-01
A fast full-wave simulation technique is presented for the analysis of large irregular planar arrays of identical 3-D metallic antennas. The solution method relies on the Macro Basis Functions (MBF) approach and an interpolatory technique to compute the interactions between MBFs. The Harmonic-polynomial (HARP) model is established for the near-field interactions in a modified system of coordinates. For extremely large arrays made of complex antennas, two approaches assuming a limited radius of influence for mutual coupling are considered: one is based on a sparse-matrix LU decomposition and the other one on a tessellation of the array in the form of overlapping sub-arrays. The computation of all embedded element patterns is sped up with the help of the non-uniform FFT algorithm. Extensive validations are shown for arrays of log-periodic antennas envisaged for the low-frequency SKA (Square Kilometer Array) radio-telescope. The analysis of SKA stations with such a large number of elements has not been treated yet in the literature. Validations include comparison with results obtained with commercial software and with experiments. The proposed method is particularly well suited to array synthesis, in which several orders of magnitude can be saved in terms of computation time.
2013-09-01
to a XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine , M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05– 075, 2013. [21] C. Bonine , “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS
2002-12-01
radio and batteries. The procedures outlined in this CHETN will concentrate on the Magellan GPS ProMARK X-CP receiver as it was used to collect...The Magellan GPS ProMARK X-CP is a small robust light receiver that can log 9 hr of both pseudorange and carrier phase satellite data for post...post- processing software, pseudorange GPS data recorded by the ProMARK X-CP can be post-processed differential to achieve 1-3 m (3.3-9.8 ft) horizontal
NASA Astrophysics Data System (ADS)
Kostarev, S. N.; Sereda, T. G.
2017-10-01
The article is concerned with the problem of transmitting data from telemetric devices in order to provide automated systems for the electric drive control of oil-extracting equipment. The paper given discusses the possibility to use a logging cable as means of signal transfer. Simulation models of signaling and relay-contact circuits for monitoring critical drive parameters are under discussion. The authors suggest applying the operator ⊕ (excluding OR) to increase anti-jamming effects and to get a more reliable noise filter.
GISentinel: a software platform for automatic ulcer detection on capsule endoscopy videos
NASA Astrophysics Data System (ADS)
Yi, Steven; Jiao, Heng; Meng, Fan; Leighton, Jonathon A.; Shabana, Pasha; Rentz, Lauri
2014-03-01
In this paper, we present a novel and clinically valuable software platform for automatic ulcer detection on gastrointestinal (GI) tract from Capsule Endoscopy (CE) videos. Typical CE videos take about 8 hours. They have to be reviewed manually by physicians to detect and locate diseases such as ulcers and bleedings. The process is time consuming. Moreover, because of the long-time manual review, it is easy to lead to miss-finding. Working with our collaborators, we were focusing on developing a software platform called GISentinel, which can fully automated GI tract ulcer detection and classification. This software includes 3 parts: the frequency based Log-Gabor filter regions of interest (ROI) extraction, the unique feature selection and validation method (e.g. illumination invariant feature, color independent features, and symmetrical texture features), and the cascade SVM classification for handling "ulcer vs. non-ulcer" cases. After the experiments, this SW gave descent results. In frame-wise, the ulcer detection rate is 69.65% (319/458). In instance-wise, the ulcer detection rate is 82.35%(28/34).The false alarm rate is 16.43% (34/207). This work is a part of our innovative 2D/3D based GI tract disease detection software platform. The final goal of this SW is to find and classification of major GI tract diseases intelligently, such as bleeding, ulcer, and polyp from the CE videos. This paper will mainly describe the automatic ulcer detection functional module.
Winston, Flaura K; Mirman, Jessica H; Curry, Allison E; Pfeiffer, Melissa R; Elliott, Michael R; Durbin, Dennis R
2015-02-01
Inexperienced, less-skilled driving characterises many newly licensed drivers and contributes to high crash rates. A randomised trial of TeenDrivingPlan (TDP), a new learner driver phase internet-based intervention, demonstrated effectiveness in improving safety relevant, on-road driving behaviour, primarily through greater driving practice diversity. To inform future learner driver interventions, this analysis examined TDP use and its association with practice diversity. Posthoc analysis of data from teen/parent dyads (n=107), enrolled early in learner phase and assigned to treatment arm in randomised trial. Inserted software beacons captured TDP use data. Electronic surveys completed by parents and teens assessed diversity of practice driving and TDP usability ratings at 24 weeks (end of study period). Most families (84%) used TDP early in the learner period; however, the number of TDP sessions in the first week was three times higher among dyads who achieved greater practice diversity than those with less. By week five many families still engaged with TDP, but differences in TDP use could not be detected between families with high versus low practice diversity. Usability was not a major issue for this sample based on largely positive user ratings. An engaging internet-based intervention, such as TDP, can support families in achieving high practice diversity. Future learner driver interventions should provide important information early in the learner period when engagement is greatest, encourage continued learning as part of logging practice drives, and incorporate monitoring software for further personalisation to meet family needs. NCT01498575. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Using EMIS to Identify Top Opportunities for Commercial Building Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guanjing; Singla, Rupam; Granderson, Jessica
Energy Management and Information Systems (EMIS) comprise a broad family of tools and services to manage commercial building energy use. These technologies offer a mix of capabilities to store, display, and analyze energy use and system data, and in some cases, provide control. EMIS technologies enable 10–20 percent site energy savings in best practice implementations. Energy Information System (EIS) and Fault Detection and Diagnosis (FDD) systems are two key technologies in the EMIS family. Energy Information Systems are broadly defined as the web-based software, data acquisition hardware, and communication systems used to analyze and display building energy performance. At amore » minimum, an EIS provides daily, hourly or sub-hourly interval meter data at the whole-building level, with graphical and analytical capability. Fault Detection and Diagnosis systems automatically identify heating, ventilation, and air-conditioning (HVAC) system or equipment-level performances issues, and in some cases are able to isolate the root causes of the problem. They use computer algorithms to continuously analyze system-level operational data to detect faults and diagnose their causes. Many FDD tools integrate the trend log data from a Building Automation System (BAS) but otherwise are stand-alone software packages; other types of FDD tools are implemented as “on-board” equipment-embedded diagnostics. (This document focuses on the former.) Analysis approaches adopted in FDD technologies span a variety of techniques from rule-based methods to process history-based approaches. FDD tools automate investigations that can be conducted via manual data inspection by someone with expert knowledge, thereby expanding accessibility and breath of analysis opportunity, and also reducing complexity.« less
Analyzing engagement in a web-based intervention platform through visualizing log-data.
Morrison, Cecily; Doherty, Gavin
2014-11-13
Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start-Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions. Specifically, we show how visualizations can (1) allow inspection of content or feature usage in a temporal relationship to the overall program at different levels of granularity, (2) detect different patterns of use to consider personalization in the design process, (3) detect usability issues, (4) enable exploratory analysis to support the design of statistical queries to summarize the data, (5) provide new opportunities for real-time evaluation, and (6) examine assumptions about interactivity that underlie many summative measures in this field.
Analyzing Engagement in a Web-Based Intervention Platform Through Visualizing Log-Data
2014-01-01
Background Engagement has emerged as a significant cross-cutting concern within the development of Web-based interventions. There have been calls to institute a more rigorous approach to the design of Web-based interventions, to increase both the quantity and quality of engagement. One approach would be to use log-data to better understand the process of engagement and patterns of use. However, an important challenge lies in organizing log-data for productive analysis. Objective Our aim was to conduct an initial exploration of the use of visualizations of log-data to enhance understanding of engagement with Web-based interventions. Methods We applied exploratory sequential data analysis to highlight sequential aspects of the log data, such as time or module number, to provide insights into engagement. After applying a number of processing steps, a range of visualizations were generated from the log-data. We then examined the usefulness of these visualizations for understanding the engagement of individual users and the engagement of cohorts of users. The visualizations created are illustrated with two datasets drawn from studies using the SilverCloud Platform: (1) a small, detailed dataset with interviews (n=19) and (2) a large dataset (n=326) with 44,838 logged events. Results We present four exploratory visualizations of user engagement with a Web-based intervention, including Navigation Graph, Stripe Graph, Start–Finish Graph, and Next Action Heat Map. The first represents individual usage and the last three, specific aspects of cohort usage. We provide examples of each with a discussion of salient features. Conclusions Log-data analysis through data visualization is an alternative way of exploring user engagement with Web-based interventions, which can yield different insights than more commonly used summative measures. We describe how understanding the process of engagement through visualizations can support the development and evaluation of Web-based interventions. Specifically, we show how visualizations can (1) allow inspection of content or feature usage in a temporal relationship to the overall program at different levels of granularity, (2) detect different patterns of use to consider personalization in the design process, (3) detect usability issues, (4) enable exploratory analysis to support the design of statistical queries to summarize the data, (5) provide new opportunities for real-time evaluation, and (6) examine assumptions about interactivity that underlie many summative measures in this field. PMID:25406097
Adjustments to forest inventory and analysis estimates of 2001 saw-log volumes for Kentucky
Stanley J. Zarnoch; Jeffery A. Turner
2005-01-01
The 2001 Kentucky Forest Inventory and Analysis survey overestimated hardwood saw-log volume in tree grade 1. This occurred because 2001 field crews classified too many trees as grade 1 trees. Data collected by quality assurance crews were used to generate two types of adjustments, one based on the proportion of trees misclassified and the other on the proportion of...
ERIC Educational Resources Information Center
Asunka, Stephen; Chae, Hui Soo; Hughes, Brian; Natriello, Gary
2009-01-01
Transaction logs of user activity on an academic library website were analyzed to determine general usage patterns on the website. This paper reports on insights gained from the analysis, and identifies and discusses issues relating to content access, interface design and general functionality of the website. (Contains 13 figures and 8 tables.)
The Use of OPAC in a Large Academic Library: A Transactional Log Analysis Study of Subject Searching
ERIC Educational Resources Information Center
Villen-Rueda, Luis; Senso, Jose A.; de Moya-Anegon, Felix
2007-01-01
The analysis of user searches in catalogs has been the topic of research for over four decades, involving numerous studies and diverse methodologies. The present study looks at how different types of users effect queries in the catalog of a university library. For this purpose, we analyzed log files to determine which was the most frequent type of…
Gregory P. Asner; Michael Keller; Rodrigo Pereira; Johan C. Zweede
2002-01-01
We combined a detailed field study of forest canopy damage with calibrated Landsat 7 Enhanced Thematic Mapper Plus (ETM+) reflectance data and texture analysis to assess the sensitivity of basic broadband optical remote sensing to selective logging in Amazonia. Our field study encompassed measurements of ground damage and canopy gap fractions along a chronosequence of...
Statistical Analysis of the Exchange Rate of Bitcoin.
Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen
2015-01-01
Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.
Log Analysis Using Splunk Hadoop Connect
2017-06-01
running a logging service puts a performance tax on the system and may cause the degradation of performance. More thorough 8 logging will cause a...several nodes. For example, a disk failure would affect all the tasks running on a particular node and generate an alert message not only for the disk...the commands that were executed from the " Run " command. The keylogger installation did not create any registry keys for the program itself. However
The application of PGNAA borehole logging for copper grade estimation at Chuquicamata mine.
Charbucinski, J; Duran, O; Freraut, R; Heresi, N; Pineyro, I
2004-05-01
The field trials of a prompt gamma neutron activation (PGNAA) spectrometric logging method and instrumentation (SIROLOG) for copper grade estimation in production holes of a porphyry type copper ore mine, Chuquicamata in Chile, are described. Examples of data analysis, calibration procedures and copper grade profiles are provided. The field tests have proved the suitability of the PGNAA logging system for in situ quality control of copper ore.
Geophysical evaluation of sandstone aquifers in the Reconcavo-Tucano Basin, Bahia -- Brazil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, O.A.L. de
1993-11-01
The upper clastic sediments in the Reconcavo-Tucano basin comprise a multilayer aquifer system of Jurassic age. Its groundwater is normally fresh down to depths of more than 1,000 m. Locally, however, there are zones producing high salinity or sulfur geothermal water. Analysis of electrical logs of more than 150 wells enabled the identification of the most typical sedimentary structures and the gross geometries for the sandstone units in selected areas of the basin. Based on this information, the thick sands are interpreted as coalescent point bars and the shales as flood plain deposits of a large fluvial environment. The resistivitymore » logs and core laboratory data are combined to develop empirical equations relating aquifer porosity and permeability to log-derived parameters such as formation factor and cementation exponent. Temperature logs of 15 wells were useful to quantify the water leakage through semiconfining shales. The groundwater quality was inferred from spontaneous potential (SP) log deflections under control of chemical analysis of water samples. An empirical chart is developed that relates the SP-derived water resistivity to the true water resistivity within the formations. The patterns of salinity variation with depth inferred from SP logs were helpful in identifying subsurface flows along major fault zones, where extensive mixing of water is taking place. A total of 49 vertical Schlumberger resistivity soundings aid in defining aquifer structures and in extrapolating the log derived results. Transition zones between fresh and saline waters have also been detected based on a combination of logging and surface sounding data. Ionic filtering by water leakage across regional shales, local convection and mixing along major faults and hydrodynamic dispersion away from lateral permeability contrasts are the main mechanisms controlling the observed distributions of salinity and temperature within the basin.« less
Lee, M.W.; Collett, T.S.; Lewis, K.A.
2012-01-01
Through the use of 3-D seismic amplitude mapping, several gashydrate prospects were identified in the Alaminos Canyon (AC) area of the Gulf of Mexico. Two locations were drilled as part of the Gulf of MexicoGasHydrate Joint Industry Project Leg II (JIP Leg II) in May of 2009 and a comprehensive set of logging-while-drilling (LWD) logs were acquired at each well site. LWD logs indicated that resistivity in the range of ~2 ohm-m and P-wave velocity in the range of ~1.9 km/s were measured in the target sand interval between 515 and 645 feet below sea floor. These values were slightly elevated relative to those measured in the sediment above and below the target sand. However, the initial well log analysis was inconclusive regarding the presence of gashydrate in the logged sand interval, mainly because largewashouts caused by drilling in the target interval degraded confidence in the well log measurements. To assess gashydratesaturations in the sedimentary section drilled in the Alaminos Canyon 21B (AC21-B) well, a method of compensating for the effect of washouts on the resistivity and acoustic velocities was developed. The proposed method models the washed-out portion of the borehole as a vertical layer filled with sea water (drilling fluid) and the apparent anisotropic resistivity and velocities caused by a vertical layer are used to correct the measured log values. By incorporating the conventional marine seismic data into the well log analysis, the average gashydratesaturation in the target sand section in the AC21-Bwell can be constrained to the range of 8–28%, with 20% being our best estimate.
Gavino, V C; Milo, G E; Cornwell, D G
1982-03-01
Image analysis was used for the automated measurement of colony frequency (f) and colony diameter (d) in cultures of smooth muscle cells, Initial studies with the inverted microscope showed that number of cells (N) in a colony varied directly with d: log N = 1.98 log d - 3.469 Image analysis generated the complement of a cumulative distribution for f as a function of d. The number of cells in each segment of the distribution function was calculated by multiplying f and the average N for the segment. These data were displayed as a cumulative distribution function. The total number of colonies (fT) and the total number of cells (NT) were used to calculate the average colony size (NA). Population doublings (PD) were then expressed as log2 NA. Image analysis confirmed previous studies in which colonies were sized and counted with an inverted microscope. Thus, image analysis is a rapid and automated technique for the measurement of clonal growth.
NMR Methods, Applications and Trends for Groundwater Evaluation and Management
NASA Astrophysics Data System (ADS)
Walsh, D. O.; Grunewald, E. D.
2011-12-01
Nuclear magnetic resonance (NMR) measurements have a tremendous potential for improving groundwater characterization, as they provide direct detection and measurement of groundwater and unique information about pore-scale properties. NMR measurements, commonly used in chemistry and medicine, are utilized in geophysical investigations through non-invasive surface NMR (SNMR) or downhole NMR logging measurements. Our recent and ongoing research has focused on improving the performance and interpretation of NMR field measurements for groundwater characterization. Engineering advancements have addressed several key technical challenges associated with SNMR measurements. Susceptibility of SNMR measurements to environmental noise has been dramatically reduced through the development of multi-channel acquisition hardware and noise-cancellation software. Multi-channel instrumentation (up to 12 channels) has also enabled more efficient 2D and 3D imaging. Previous limitations in measuring NMR signals from water in silt, clay and magnetic geology have been addressed by shortening the instrument dead-time from 40 ms to 4 ms, and increasing the power output. Improved pulse sequences have been developed to more accurately estimate NMR relaxation times and their distributions, which are sensitive to pore size distributions. Cumulatively, these advancements have vastly expanded the range of environments in which SNMR measurements can be obtained, enabling detection of groundwater in smaller pores, in magnetic geology, in the unsaturated zone, and nearby to infrastructure (presented here in case studies). NMR logging can provide high-resolution estimates of bound and mobile water content and pore size distributions. While NMR logging has been utilized in oil and gas applications for decades, its use in groundwater investigations has been limited by the large size and high cost of oilfield NMR logging tools and services. Recently, engineering efforts funded by the US Department of Energy have produced an NMR logging tool that is much smaller and less costly than comparable oilfield NMR logging tools. This system is specifically designed for near surface groundwater investigations, incorporates small diameter probes (as small as 1.67 inches diameter) and man-portable surface stations, and provides NMR data and information content on par with oilfield NMR logging tools. A direct-push variant of this logging tool has also been developed. Key challenges associated with small diameter tools include inherently lower SNR and logging speeds, the desire to extend the sensitive zone as far as possible into unconsolidated formations, and simultaneously maintaining high power and signal fidelity. Our ongoing research in groundwater NMR aims to integrating surface and borehole measurements for regional-scale permeability mapping, and to develop in-place NMR sensors for long term monitoring of contaminant and remediation processes. In addition to groundwater resource characterization, promising new applications of NMR include assessing water content in ice and permafrost, management of groundwater in mining operations, and evaluation and management of groundwater in civil engineering applications.
On comparison of net survival curves.
Pavlič, Klemen; Perme, Maja Pohar
2017-05-02
Relative survival analysis is a subfield of survival analysis where competing risks data are observed, but the causes of death are unknown. A first step in the analysis of such data is usually the estimation of a net survival curve, possibly followed by regression modelling. Recently, a log-rank type test for comparison of net survival curves has been introduced and the goal of this paper is to explore its properties and put this methodological advance into the context of the field. We build on the association between the log-rank test and the univariate or stratified Cox model and show the analogy in the relative survival setting. We study the properties of the methods using both the theoretical arguments as well as simulations. We provide an R function to enable practical usage of the log-rank type test. Both the log-rank type test and its model alternatives perform satisfactory under the null, even if the correlation between their p-values is rather low, implying that both approaches cannot be used simultaneously. The stratified version has a higher power in case of non-homogeneous hazards, but also carries a different interpretation. The log-rank type test and its stratified version can be interpreted in the same way as the results of an analogous semi-parametric additive regression model despite the fact that no direct theoretical link can be established between the test statistics.
NASA Astrophysics Data System (ADS)
Haris, A.; Nafian, M.; Riyanto, A.
2017-07-01
Danish North Sea Fields consist of several formations (Ekofisk, Tor, and Cromer Knoll) that was started from the age of Paleocene to Miocene. In this study, the integration of seismic and well log data set is carried out to determine the chalk sand distribution in the Danish North Sea field. The integration of seismic and well log data set is performed by using the seismic inversion analysis and seismic multi-attribute. The seismic inversion algorithm, which is used to derive acoustic impedance (AI), is model-based technique. The derived AI is then used as external attributes for the input of multi-attribute analysis. Moreover, the multi-attribute analysis is used to generate the linear and non-linear transformation of among well log properties. In the case of the linear model, selected transformation is conducted by weighting step-wise linear regression (SWR), while for the non-linear model is performed by using probabilistic neural networks (PNN). The estimated porosity, which is resulted by PNN shows better suited to the well log data compared with the results of SWR. This result can be understood since PNN perform non-linear regression so that the relationship between the attribute data and predicted log data can be optimized. The distribution of chalk sand has been successfully identified and characterized by porosity value ranging from 23% up to 30%.
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the users manual. A software description and program listings modification of the data analysis software are included.
Spectral Analysis of the sdO Standard Star Feige 34
NASA Astrophysics Data System (ADS)
Latour, M.; Chayer, P.; Green, E. M.; Fontaine, G.
2017-03-01
We present our current work on the spectral analysis of the hot sdO star Feige 34. We combine high S/N optical spectra and fully-blanketed non-LTE model atmospheres to derive its fundamental parameters (Teff, log g) and helium abundance. Our best fits indicate Teff = 63 000 K, log g = 6.0 and log N(He)/N(H) = -1.8. We also use available ultraviolet spectra (IUE and FUSE) to measure metal abundances. We find the star to be enriched in iron and nickel by a factor of ten with respect to the solar values, while lighter elements have subsolar abundances. The FUSE spectrum suggests that the spectral lines could be broadened by rotation.
Zhou, Lei-Lei; Xu, Xiao-Yue; Ni, Jie; Zhao, Xia; Zhou, Jian-Wei; Feng, Ji-Feng
2018-06-01
Due to the low incidence and the heterogeneity of subtypes, the biological process of T-cell lymphomas is largely unknown. Although many genes have been detected in T-cell lymphomas, the role of these genes in biological process of T-cell lymphomas was not further analyzed. Two qualified datasets were downloaded from Gene Expression Omnibus database. The biological functions of differentially expressed genes were evaluated by gene ontology enrichment and KEGG pathway analysis. The network for intersection genes was constructed by the cytoscape v3.0 software. Kaplan-Meier survival curves and log-rank test were employed to assess the association between differentially expressed genes and clinical characters. The intersection mRNAs were proved to be associated with fundamental processes of T-cell lymphoma cells. These intersection mRNAs were involved in the activation of some cancer-related pathways, including PI3K/AKT, Ras, JAK-STAT, and NF-kappa B signaling pathway. PDGFRA, CXCL12, and CCL19 were the most significant central genes in the signal-net analysis. The results of survival analysis are not entirely credible. Our findings uncovered aberrantly expressed genes and a complex RNA signal network in T-cell lymphomas and indicated cancer-related pathways involved in disease initiation and progression, providing a new insight for biotargeted therapy in T-cell lymphomas. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Vucicevic, J; Popovic, M; Nikolic, K; Filipic, S; Obradovic, D; Agbaba, D
2017-03-01
For this study, 31 compounds, including 16 imidazoline/α-adrenergic receptor (IRs/α-ARs) ligands and 15 central nervous system (CNS) drugs, were characterized in terms of the retention factors (k) obtained using biopartitioning micellar and classical reversed phase chromatography (log k BMC and log k wRP , respectively). Based on the retention factor (log k wRP ) and slope of the linear curve (S) the isocratic parameter (φ 0 ) was calculated. Obtained retention factors were correlated with experimental log BB values for the group of examined compounds. High correlations were obtained between logarithm of biopartitioning micellar chromatography (BMC) retention factor and effective permeability (r(log k BMC /log BB): 0.77), while for RP-HPLC system the correlations were lower (r(log k wRP /log BB): 0.58; r(S/log BB): -0.50; r(φ 0 /P e ): 0.61). Based on the log k BMC retention data and calculated molecular parameters of the examined compounds, quantitative structure-permeability relationship (QSPR) models were developed using partial least squares, stepwise multiple linear regression, support vector machine and artificial neural network methodologies. A high degree of structural diversity of the analysed IRs/α-ARs ligands and CNS drugs provides wide applicability domain of the QSPR models for estimation of blood-brain barrier penetration of the related compounds.
Emergency medicine clerkship encounter and procedure logging using handheld computers.
Penciner, Rick; Siddiqui, Sanam; Lee, Shirley
2007-08-01
Tracking medical student clinical encounters is now an accreditation requirement of medical schools. The use of handheld computers for electronic logging is emerging as a strategy to achieve this. To evaluate the technical feasibility and student satisfaction of a novel electronic logging and feedback program using handheld computers in the emergency department. This was a survey study of fourth-year medical student satisfaction with the use of their handheld computers for electronic logging of patient encounters and procedures. The authors also included an analysis of this technology. Forty-six students participated in this pilot project, logging a total of 2,930 encounters. Students used the logs an average of 7.6 shifts per rotation, logging an average of 8.3 patients per shift. Twenty-nine students (63%) responded to the survey. Students generally found it easy to complete each encounter (69%) and easy to synchronize their handheld computer with the central server (83%). However, half the students (49%) never viewed the feedback Web site and most (79%) never reviewed their logs with their preceptors. Overall, only 17% found the logging program beneficial as a learning tool. Electronic logging by medical students during their emergency medicine clerkship has many potential benefits as a method to document clinical encounters and procedures performed. However, this study demonstrated poor compliance and dissatisfaction with the process. In order for electronic logging using handheld computers to be a beneficial educational tool for both learners and educators, obstacles to effective implementation need to be addressed.
INSPIRE and SPIRES Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Cole; /Wheaton Coll. /SLAC
2012-08-31
SPIRES, an aging high-energy physics publication data base, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions in regards to the presumed two-thirds of the users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts were developed to collect and interpret the data contained in the log files. The common search patterns and usage comparisons are mademore » between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought as well as the expected trend of user transition to INSPIRE.« less
Tao, Weijing; Shen, Yang; Guo, Lili; Bo, Genji
2014-01-01
Balanced steady-state free precession MR angiography (b-SSFP MRA) has shown great promise in diagnosing renal artery stenosis (RAS) as a non-contrast MR angiography (NC-MRA) method. However, results from related studies are inconsistent. The purpose of this meta-analysis was to assess the accuracy of b-SSFP MRA compared to contrast-enhanced MR angiography (CE-MRA) in diagnosing RAS. English and Chinese studies that were published prior to September 4, 2013 and that assessed b-SSFP MRA diagnostic performance in RAS patients were reviewed. Quality of the literature was assessed independently by two observers. The statistical analysis was adopted by the software of Meta-Disc version 1.4. Using the heterogeneity test, a statistical effect model was chosen to calculate different pooled weighted values. The receiver operator characteristic (ROC) space and Spearman correlation coefficient were to explore threshold effect. Sensitivity analysis and the publication bias were performed to demonstrate if the pooled estimates were stable and reliable. We produced forest plots to calculate the pooled values and corresponding 95% confidence interval (CI) of sensitivity, specificity, positive likelihood ratio (PLR), negative likelihood ratio (NLR), diagnostic odds ratio (DOR), and constructed a summary receiver operating characteristic curve (SROC) to calculate the area under the curve (AUC). A total of 10 high quality articles were used in this meta-analysis. The studies showed a high degree of heterogeneity. The "shoulder-arm" shape in the ROC plot and the Spearman correlation coefficient between the log(SEN) and log(1-SPE) suggested that there was a threshold effect. Sensitivity analysis demonstrated that the actual combined effect size was equal to the theoretical combined effect size. The publication bias was low after quality evaluation of the literature and the construction of a funnel plot. The pooled sensitivity was 0.88 (95% CI, 0.83-0.91) and pooled specificity was 0.94 (95% CI, 0.93-0.95); pooled PLR was 14.57 (95% CI, 9.78-21.71]) and pooled NLR was 0.15 (95% CI, 0.11-0.20). The AUC was 0.964 3. In contrast to CE-MRA, the b-SSFP MRA is more accurate in diagnosing RAS, and may be able to replace other diagnostic methods in patients with renal insufficiency.
Application of Fracture Distribution Prediction Model in Xihu Depression of East China Sea
NASA Astrophysics Data System (ADS)
Yan, Weifeng; Duan, Feifei; Zhang, Le; Li, Ming
2018-02-01
There are different responses on each of logging data with the changes of formation characteristics and outliers caused by the existence of fractures. For this reason, the development of fractures in formation can be characterized by the fine analysis of logging curves. The well logs such as resistivity, sonic transit time, density, neutron porosity and gamma ray, which are classified as conventional well logs, are more sensitive to formation fractures. In view of traditional fracture prediction model, using the simple weighted average of different logging data to calculate the comprehensive fracture index, are more susceptible to subjective factors and exist a large deviation, a statistical method is introduced accordingly. Combining with responses of conventional logging data on the development of formation fracture, a prediction model based on membership function is established, and its essence is to analyse logging data with fuzzy mathematics theory. The fracture prediction results in a well formation in NX block of Xihu depression through two models are compared with that of imaging logging, which shows that the accuracy of fracture prediction model based on membership function is better than that of traditional model. Furthermore, the prediction results are highly consistent with imaging logs and can reflect the development of cracks much better. It can provide a reference for engineering practice.
Evolution of the transfer function characterization of surface scatter phenomena
NASA Astrophysics Data System (ADS)
Harvey, James E.; Pfisterer, Richard N.
2016-09-01
Based upon the empirical observation that BRDF measurements of smooth optical surfaces exhibited shift-invariant behavior when plotted versus o , the original Harvey-Shack (OHS) surface scatter theory was developed as a scalar linear systems formulation in which scattered light behavior was characterized by a surface transfer function (STF) reminiscent of the optical transfer function (OTF) of modern image formation theory (1976). This shift-invariant behavior combined with the inverse power law behavior when plotting log BRDF versus log o was quickly incorporated into several optical analysis software packages. Although there was no explicit smooth-surface approximation in the OHS theory, there was a limitation on both the incident and scattering angles. In 1988 the modified Harvey-Shack (MHS) theory removed the limitation on the angle of incidence; however, a moderate-angle scattering limitation remained. Clearly for large incident angles the BRDF was no longer shift-invariant as a different STF was now required for each incident angle. In 2011 the generalized Harvey-Shack (GHS) surface scatter theory, characterized by a two-parameter family of STFs, evolved into a practical modeling tool to calculate BRDFs from optical surface metrology data for situations that violate the smooth surface approximation inherent in the Rayleigh-Rice theory and/or the moderate-angle limitation of the Beckmann-Kirchhoff theory. And finally, the STF can be multiplied by the classical OTF to provide a complete linear systems formulation of image quality as degraded by diffraction, geometrical aberrations and surface scatter effects from residual optical fabrication errors.
Santoro, Adriana Leandra; Carrilho, Emanuel; Lanças, Fernando Mauro; Montanari, Carlos Alberto
2016-06-10
The pharmacokinetic properties of flavonoids with differing degrees of lipophilicity were investigated using immobilized artificial membranes (IAMs) as the stationary phase in high performance liquid chromatography (HPLC). For each flavonoid compound, we investigated whether the type of column used affected the correlation between the retention factors and the calculated octanol/water partition (log Poct). Three-dimensional (3D) molecular descriptors were calculated from the molecular structure of each compound using i) VolSurf software, ii) the GRID method (computational procedure for determining energetically favorable binding sites in molecules of known structure using a probe for calculating the 3D molecular interaction fields, between the probe and the molecule), and iii) the relationship between partition and molecular structure, analyzed in terms of physicochemical descriptors. The VolSurf built-in Caco-2 model was used to estimate compound permeability. The extent to which the datasets obtained from different columns differ both from each other and from both the calculated log Poct and the predicted permeability in Caco-2 cells was examined by principal component analysis (PCA). The immobilized membrane partition coefficients (kIAM) were analyzed using molecular descriptors in partial least square regression (PLS) and a quantitative structure-retention relationship was generated for the chromatographic retention in the cholesterol column. The cholesterol column provided the best correlation with the permeability predicted by the Caco-2 cell model and a good fit model with great prediction power was obtained for its retention data (R(2)=0.96 and Q(2)=0.85 with four latent variables). Copyright © 2015 Elsevier B.V. All rights reserved.
Netzeva, Tatiana I; Gallegos Saliner, Ana; Worth, Andrew P
2006-05-01
The aim of the present study was to illustrate that it is possible and relatively straightforward to compare the domain of applicability of a quantitative structure-activity relationship (QSAR) model in terms of its physicochemical descriptors with a large inventory of chemicals. A training set of 105 chemicals with data for relative estrogenic gene activation, obtained in a recombinant yeast assay, was used to develop the QSAR. A binary classification model for predicting active versus inactive chemicals was developed using classification tree analysis and two descriptors with a clear physicochemical meaning (octanol-water partition coefficient, or log Kow, and the number of hydrogen bond donors, or n(Hdon)). The model demonstrated a high overall accuracy (90.5%), with a sensitivity of 95.9% and a specificity of 78.1%. The robustness of the model was evaluated using the leave-many-out cross-validation technique, whereas the predictivity was assessed using an artificial external test set composed of 12 compounds. The domain of the QSAR training set was compared with the chemical space covered by the European Inventory of Existing Commercial Chemical Substances (EINECS), as incorporated in the CDB-EC software, in the log Kow / n(Hdon) plane. The results showed that the training set and, therefore, the applicability domain of the QSAR model covers a small part of the physicochemical domain of the inventory, even though a simple method for defining the applicability domain (ranges in the descriptor space) was used. However, a large number of compounds are located within the narrow descriptor window.