42 CFR 423.308 - Definitions and terminology.
Code of Federal Regulations, 2010 CFR
2010-10-01
... exclude any costs attributable to benefits beyond basic prescription drug coverage, but also to exclude... benefits beyond basic prescription drug coverage, but also to exclude any prescription drug coverage costs... assistance outside the Part D benefit, provided that documentation of such nominal cost-sharing has been...
Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder
NASA Technical Reports Server (NTRS)
Staats, Matt
2009-01-01
We present work on a prototype tool based on the Java Pathfinder (JPF) model checker for automatically generating tests satisfying the MC/DC code coverage criterion. Using the Eclipse IDE, developers and testers can quickly instrument Java source code with JPF annotations covering all MC/DC coverage obligations, and JPF can then be used to automatically generate tests that satisfy these obligations. The prototype extension to JPF supports various tasks useful in automatic test generation, such as test suite reduction and execution of the generated tests.
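The MC/DC obligations mentioned in this abstract can be illustrated with a small sketch (plain Python for illustration, not the JPF tooling the abstract describes): for each condition in a decision, MC/DC requires a pair of test vectors that differ only in that condition and flip the decision's outcome.

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition, find a pair of input vectors that differ only in
    that condition and change the decision outcome (its 'independent effect').
    The union of these pairs is a test set satisfying MC/DC."""
    vectors = list(product([False, True], repeat=n_conditions))
    pairs = {}
    for i in range(n_conditions):
        for v in vectors:
            w = list(v)
            w[i] = not w[i]  # toggle only condition i
            w = tuple(w)
            if decision(*v) != decision(*w):
                pairs[i] = (v, w)
                break
    return pairs

# Hypothetical decision under test: (a and b) or c
pairs = mcdc_pairs(lambda a, b, c: (a and b) or c, 3)
```

A tool like the one described would generate concrete program inputs driving each condition through such a pair, rather than enumerating truth vectors directly.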
Code of Federal Regulations, 2011 CFR
2011-10-01
... Public Welfare (Continued) NATIONAL FOUNDATION ON THE ARTS AND THE HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS COLLECTION OF CLAIMS Salary Offset § 1150.20 What debts are included or excluded from coverage of...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Public Welfare (Continued) NATIONAL FOUNDATION ON THE ARTS AND THE HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS COLLECTION OF CLAIMS Salary Offset § 1150.20 What debts are included or excluded from coverage of...
Code of Federal Regulations, 2013 CFR
2013-01-01
... excluded from life insurance coverage by law: (1) An employee of a corporation supervised by the Farm... dependents school overseas, if employed by the Federal Government in a nonteaching position during the recess period between school years. (b) The following employees are also excluded from life insurance coverage...
Code of Federal Regulations, 2014 CFR
2014-01-01
... excluded from life insurance coverage by law: (1) An employee of a corporation supervised by the Farm... dependents school overseas, if employed by the Federal Government in a nonteaching position during the recess period between school years. (b) The following employees are also excluded from life insurance coverage...
Code of Federal Regulations, 2011 CFR
2011-01-01
... excluded from life insurance coverage by law: (1) An employee of a corporation supervised by the Farm... dependents school overseas, if employed by the Federal Government in a nonteaching position during the recess period between school years. (b) The following employees are also excluded from life insurance coverage...
Code of Federal Regulations, 2012 CFR
2012-01-01
... excluded from life insurance coverage by law: (1) An employee of a corporation supervised by the Farm... dependents school overseas, if employed by the Federal Government in a nonteaching position during the recess period between school years. (b) The following employees are also excluded from life insurance coverage...
75 FR 80731 - Request for Exclusion of 120 Volt, 100 Watt R20 Short Incandescent Reflector Lamps
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-23
... petition seeks to exclude from the coverage of energy conservation standards for incandescent reflector... sought by NEMA would exclude 120 volt, 100 watt R20 short lamps from coverage of energy conservation... of its energy conservation standard as applied to this type of lamp pending the outcome of this...
Ozhinsky, Eugene; Vigneron, Daniel B; Nelson, Sarah J
2011-04-01
To develop a technique for optimizing coverage of brain 3D ¹H magnetic resonance spectroscopic imaging (MRSI) by automatic placement of outer-volume suppression (OVS) saturation bands (sat bands) and to compare the performance for point-resolved spectroscopic sequence (PRESS) MRSI protocols with manual and automatic placement of sat bands. The automated OVS procedure includes the acquisition of anatomic images from the head, obtaining brain and lipid tissue maps, calculating optimal sat band placement, and then using those optimized parameters during the MRSI acquisition. The data were analyzed to quantify brain coverage volume and data quality. 3D PRESS MRSI data were acquired from three healthy volunteers and 29 patients using protocols that included either manual or automatic sat band placement. On average, the automatic sat band placement allowed the acquisition of PRESS MRSI data from 2.7 times larger brain volumes than the conventional method while maintaining data quality. The technique developed helps solve two of the most significant problems with brain PRESS MRSI acquisitions: limited brain coverage and difficulty in prescription. This new method will facilitate routine clinical brain 3D MRSI exams and will be important for performing serial evaluation of response to therapy in patients with brain tumors and other neurological diseases. Copyright © 2011 Wiley-Liss, Inc.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Coverage. 734.401 Section 734.401...) POLITICAL ACTIVITIES OF FEDERAL EMPLOYEES Employees in Certain Agencies and Positions § 734.401 Coverage. (a... agencies and positions described in paragraph (a) of this section are excluded from coverage under this...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Coverage. 734.401 Section 734.401...) POLITICAL ACTIVITIES OF FEDERAL EMPLOYEES Employees in Certain Agencies and Positions § 734.401 Coverage. (a... agencies and positions described in paragraph (a) of this section are excluded from coverage under this...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Coverage. 734.401 Section 734.401...) POLITICAL ACTIVITIES OF FEDERAL EMPLOYEES Employees in Certain Agencies and Positions § 734.401 Coverage. (a... agencies and positions described in paragraph (a) of this section are excluded from coverage under this...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Coverage. 734.401 Section 734.401...) POLITICAL ACTIVITIES OF FEDERAL EMPLOYEES Employees in Certain Agencies and Positions § 734.401 Coverage. (a... agencies and positions described in paragraph (a) of this section are excluded from coverage under this...
SU-C-BRB-02: Automatic Planning as a Potential Strategy for Dose Escalation for Pancreas SBRT?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S; Zheng, D; Ma, R
Purpose: Stereotactic body radiation therapy (SBRT) has been suggested to provide high rates of local control for locally advanced pancreatic cancer. However, the close proximity of highly radiosensitive normal tissues makes the planning process labor-intensive and may impede further escalation of the prescription dose. The present study evaluates the potential of an automatic planning system as a dose escalation strategy. Methods: Ten pancreatic cancer patients treated with SBRT were studied retrospectively. SBRT was delivered over 5 consecutive fractions at 6-8 Gy/fraction. Two plans were generated by Pinnacle Auto-Planning for each patient: one with the original prescription and one with an escalated prescription that adds 1 Gy/fraction to the original. Manually-created planning volumes were excluded from the optimization goals in order to assess planning efficiency and quality simultaneously. The critical organs in closest proximity were used to determine the plan normalization and ensure OAR sparing. Dosimetric parameters including D100 and conformity index (CI) were assessed. Results: Auto-planning directly generated acceptable plans for 70% of the cases without need of further improvement, and at most two more iterations were necessary for the remaining cases. For the pancreas SBRT plans with the original prescription, auto-plans resulted in favorable target coverage and PTV conformity (D100 = 96.3% ± 1.48%; CI = 0.88 ± 0.06). For the plans with the escalated prescriptions, no significant target under-dosage was observed, and PTV conformity remained reasonable (D100 = 93.3% ± 3.8%; CI = 0.84 ± 0.05). Conclusion: Automatic planning, without a substantial human-intervention process, results in reasonable PTV coverage and conformity on the premise of adequate OAR sparing for pancreas SBRT plans with escalated prescriptions. The results highlight the potential of auto-planning as a dose escalation strategy for pancreas SBRT treatment planning. Further investigations with a larger number of patients are necessary. The project is partially supported by Philips Medical Systems.
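The two dosimetric endpoints quoted in this abstract can be computed directly from a dose grid and a PTV mask. A minimal sketch follows, assuming D100 is the minimum PTV dose expressed as a percentage of the prescription, and using one common conformity-index definition (fraction of the prescription isodose volume inside the PTV); the abstract does not state which CI variant the authors used.

```python
import numpy as np

def d100_and_ci(dose, ptv_mask, rx):
    """D100: minimum dose inside the PTV, as a percentage of prescription rx.
    CI: fraction of the prescription isodose volume that falls inside the PTV
    (one common definition; other variants exist)."""
    d100 = 100.0 * dose[ptv_mask].min() / rx
    rx_volume = dose >= rx
    ci = np.count_nonzero(rx_volume & ptv_mask) / max(np.count_nonzero(rx_volume), 1)
    return d100, ci

# Toy 1D "grid": prescription 40 Gy, PTV in the middle four voxels
dose = np.array([10.0, 39.0, 41.0, 42.0, 41.0, 20.0])
ptv = np.array([False, True, True, True, True, False])
```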
Insurance coverage for male infertility care in the United States.
Dupree, James M
2016-01-01
Infertility is a common condition experienced by many men and women, and treatments are expensive. The World Health Organization and American Society of Reproductive Medicine define infertility as a disease, yet private companies infrequently offer insurance coverage for infertility treatments. This is despite the clear role that healthcare insurance plays in ensuring access to care and minimizing the financial burden of expensive services. In this review, we assess the current knowledge of how male infertility care is covered by insurance in the United States. We begin with an appraisal of the costs of male infertility care, then examine the state insurance laws relevant to male infertility, and close with a discussion of why insurance coverage for male infertility is important to both men and women. Importantly, we found that despite infertility being classified as a disease and males contributing to almost half of all infertility cases, coverage for male infertility is often excluded from health insurance laws. Excluding coverage for male infertility places an undue burden on their female partners. In addition, excluding care for male infertility risks missing opportunities to diagnose important health conditions and identify reversible or irreversible causes of male infertility. Policymakers should consider providing equal coverage for male and female infertility care in future health insurance laws.
PMID: 27030084
29 CFR 2590.701-4 - Rules relating to creditable coverage.
Code of Federal Regulations, 2010 CFR
2010-07-01
... foreign country, or any political subdivision of a State, the U.S. government, or a foreign country that... Children's Health Insurance Program). (2) Excluded coverage. Creditable coverage does not include coverage... standard method described in paragraph (b) of this section. A plan or issuer may use the alternative method...
29 CFR 2.11 - General principles.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Secretary of Labor GENERAL REGULATIONS Audiovisual Coverage of Administrative Hearings § 2.11 General... involve administrative hearings. If such administrative hearings are held, we encourage their audiovisual coverage. (b) Audiovisual coverage shall be excluded in adjudicatory proceedings involving the rights or...
42 CFR 409.49 - Excluded services.
Code of Federal Regulations, 2010 CFR
2010-10-01
... under Part B are excluded from home health coverage. Catheters, catheter supplies, ostomy bags, and supplies relating to ostomy care are not considered prosthetic devices if furnished under a home health...
20 CFR 404.1024 - Election of coverage by religious orders.
Code of Federal Regulations, 2012 CFR
2012-04-01
... DISABILITY INSURANCE (1950- ) Employment, Wages, Self-Employment, and Self-Employment Income Work Excluded from Employment § 404.1024 Election of coverage by religious orders. A religious order whose members...
20 CFR 404.1024 - Election of coverage by religious orders.
Code of Federal Regulations, 2011 CFR
2011-04-01
... DISABILITY INSURANCE (1950- ) Employment, Wages, Self-Employment, and Self-Employment Income Work Excluded from Employment § 404.1024 Election of coverage by religious orders. A religious order whose members...
20 CFR 404.1024 - Election of coverage by religious orders.
Code of Federal Regulations, 2010 CFR
2010-04-01
... DISABILITY INSURANCE (1950- ) Employment, Wages, Self-Employment, and Self-Employment Income Work Excluded from Employment § 404.1024 Election of coverage by religious orders. A religious order whose members...
20 CFR 404.1024 - Election of coverage by religious orders.
Code of Federal Regulations, 2013 CFR
2013-04-01
... DISABILITY INSURANCE (1950- ) Employment, Wages, Self-Employment, and Self-Employment Income Work Excluded from Employment § 404.1024 Election of coverage by religious orders. A religious order whose members...
20 CFR 404.1024 - Election of coverage by religious orders.
Code of Federal Regulations, 2014 CFR
2014-04-01
... DISABILITY INSURANCE (1950- ) Employment, Wages, Self-Employment, and Self-Employment Income Work Excluded from Employment § 404.1024 Election of coverage by religious orders. A religious order whose members...
The decision to exclude agricultural and domestic workers from the 1935 Social Security Act.
DeWitt, Larry
2010-01-01
The Social Security Act of 1935 excluded from coverage about half the workers in the American economy. Among the excluded groups were agricultural and domestic workers-a large percentage of whom were African Americans. This has led some scholars to conclude that policymakers in 1935 deliberately excluded African Americans from the Social Security system because of prevailing racial biases during that period. This article examines both the logic of this thesis and the available empirical evidence on the origins of the coverage exclusions. The author concludes that the racial-bias thesis is both conceptually flawed and unsupported by the existing empirical evidence. The exclusion of agricultural and domestic workers from the early program was due to considerations of administrative feasibility involving tax-collection procedures. The author finds no evidence of any other policy motive involving racial bias.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 1 2014-07-01 2013-07-01 true Coverage. 9.3 Section 9.3 Labor Office of the Secretary of Labor NONDISPLACEMENT OF QUALIFIED WORKERS UNDER SERVICE CONTRACTS General § 9.3 Coverage. This part applies to all service contracts and their solicitations, except those excluded by § 9.4 of this part...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 1 2013-07-01 2013-07-01 false Coverage. 9.3 Section 9.3 Labor Office of the Secretary of Labor NONDISPLACEMENT OF QUALIFIED WORKERS UNDER SERVICE CONTRACTS General § 9.3 Coverage. This part applies to all service contracts and their solicitations, except those excluded by § 9.4 of this part...
LiDAR Point Cloud and Stereo Image Point Cloud Fusion
2013-09-01
LiDAR point cloud (right) highlighting linear edge features ideal for automatic registration... Areas where topography is being derived, unfortunately, do... with the least amount of automatic correlation errors was used. The following graphic (Figure 12) shows the coverage of the WV1 stereo triplet as...
Model-Based GUI Testing Using Uppaal at Novo Nordisk
NASA Astrophysics Data System (ADS)
Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne
This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be executed automatically against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing coverage.
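The edge-coverage criterion named in this abstract can be illustrated with a toy generator (a hypothetical Python sketch, not the Uppaal-based tool described): for every uncovered transition, emit the shortest event sequence from the initial state that ends by firing it.

```python
from collections import deque

def edge_coverage_suite(transitions, initial):
    """transitions: dict (state, event) -> next_state (deterministic machine).
    Returns event sequences that together fire every transition at least once."""
    suite, covered = [], set()
    for (src, event), dst in transitions.items():
        if (src, event) in covered:
            continue
        # BFS: shortest event path from the initial state to src
        frontier, seen, paths = deque([initial]), {initial}, {initial: []}
        while frontier:
            state = frontier.popleft()
            if state == src:
                break
            for (s, e), d in transitions.items():
                if s == state and d not in seen:
                    seen.add(d)
                    paths[d] = paths[state] + [e]
                    frontier.append(d)
        if src in paths:
            test = paths[src] + [event]
            suite.append(test)
            # replay the test to mark every edge it exercises as covered
            state = initial
            for e in test:
                covered.add((state, e))
                state = transitions[(state, e)]
    return suite

# Hypothetical GUI model: a two-screen device with 'press' and 'hold' buttons
machine = {("off", "press"): "on",
           ("on", "press"): "off",
           ("on", "hold"): "dim",
           ("dim", "press"): "on"}
suite = edge_coverage_suite(machine, "off")
```

State coverage works the same way, with states rather than transitions as the obligations to discharge.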
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jani, S; Kishan, A; O'Connell, D
2014-06-01
Purpose: To investigate if pelvic nodal coverage for prostate patients undergoing intensity modulated radiotherapy (IMRT) can be predicted using mutual image information computed between planning and cone-beam CTs (CBCTs). Methods: Four patients with high-risk prostate adenocarcinoma were treated with IMRT on a Varian TrueBeam. Plans were designed such that 95% of the nodal planning target volume (PTV) received the prescription dose of 45 Gy (N=1) or 50.4 Gy (N=3). Weekly CBCTs (N=25) were acquired and the nodal clinical target volumes and organs at risk were contoured by a physician. The percent nodal volume receiving prescription dose was recorded as a ground truth. Using the recorded shifts performed by the radiation therapists at the time of image acquisition, CBCTs were aligned with the planning kVCT. Mutual image information (MI) was calculated between the CBCT and the aligned planning CT within the contour of the nodal PTV. Due to variable CBCT fields-of-view, CBCT images covering less than 90% of the nodal volume were excluded from the analysis, resulting in the removal of eight CBCTs. Results: A correlation coefficient of 0.40 was observed between the MI metric and the percent of the nodal target volume receiving the prescription dose. One patient's CBCTs had clear outliers from the rest of the patients. Upon further investigation, we discovered image artifacts that were present only in that patient's images. When those four images were excluded, the correlation improved to 0.81. Conclusion: This pilot study shows the potential of predicting pelvic nodal dosimetry by computing the mutual image information between planning CTs and patient setup CBCTs. Importantly, this technique does not involve manual or automatic contouring of the CBCT images. Additional patients and more robust exclusion criteria will help validate our findings.
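The mutual-information metric used in this study can be sketched as a joint-histogram computation over the aligned voxels. This is a generic implementation, not the authors' code, and the bin count is an assumption:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two aligned images, estimated from the
    joint intensity histogram: MI = sum p(x,y) * log(p(x,y) / (p(x)p(y)))."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of img_b
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

An image compared with itself yields maximal MI (its binned entropy); two unrelated images yield a value near zero, which matches the intuition of MI dropping when anatomy in the CBCT no longer matches the planning CT.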
45 CFR 146.113 - Rules relating to creditable coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... State, the U.S. government, a foreign country, or any political subdivision of a State, the U.S... XXI of the Social Security Act (State Children's Health Insurance Program). (2) Excluded coverage... generally is determined by using the standard method described in paragraph (b) of this section. A plan or...
45 CFR 303.32 - National Medical Support Notice.
Code of Federal Regulations, 2012 CFR
2012-10-01
...(ren) to employers. (2) The State agency must transfer the NMSN to the employer within two business... such health care coverage for which the child(ren) is eligible (excluding the severable Notice to... obligation of the employee for employee contributions necessary for coverage of the child(ren) and send any...
45 CFR 303.32 - National Medical Support Notice.
Code of Federal Regulations, 2013 CFR
2013-10-01
...(ren) to employers. (2) The State agency must transfer the NMSN to the employer within two business... such health care coverage for which the child(ren) is eligible (excluding the severable Notice to... obligation of the employee for employee contributions necessary for coverage of the child(ren) and send any...
78 FR 7314 - Shared Responsibility Payment for Not Maintaining Minimum Essential Coverage
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... accounting firm in accordance with generally accepted accounting principles the report of which is made... affordable coverage if the individual's required contribution (determined on an annual basis) for minimum... portion of the required contribution made through a salary reduction arrangement and excluded from gross...
29 CFR 778.2 - Coverage and exemptions not discussed.
Code of Federal Regulations, 2010 CFR
2010-07-01
... within the general coverage of the wage and hours provisions are wholly or partially excluded from the protection of the Act's minimum-wage and overtime-pay requirements. Some of these exemptions are self... Labor Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR STATEMENTS...
Automatic cloud coverage assessment of Formosat-2 image
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2011-11-01
The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisiting mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System consists of two major steps. First, an unsupervised K-means method automatically estimates the cloud statistics of a Formosat-2 image. Second, cloud coverage is estimated by manual examination of the image. A more accurate Automatic Cloud Coverage Assessment (ACCA) method would clearly increase the efficiency of the second step by providing a good prediction of the cloud statistics. In this paper, building mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that incorporates pre-processing and post-processing analysis. In the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, re-examination of non-cloudy pixels, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis and increase the efficiency of the manual examination.
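Otsu's method, one of the pre-processing steps listed in this abstract, chooses the intensity threshold that maximizes between-class variance of the histogram. A generic sketch (not the Formosat-2 pipeline itself):

```python
import numpy as np

def otsu_threshold(pixels, nbins=256):
    """Otsu's method: pick the threshold that maximizes between-class
    variance sigma_B^2(t) = (mu_T*w0(t) - mu(t))^2 / (w0(t)*(1 - w0(t)))."""
    hist, edges = np.histogram(pixels, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)              # class-0 probability up to each bin
    w1 = 1.0 - w0                  # class-1 probability
    mu = np.cumsum(p * centers)    # class-0 cumulative mean (unnormalized)
    mu_t = mu[-1]                  # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0  # empty-class bins carry no split
    return centers[np.argmax(between)]
```

On a bimodal intensity distribution (e.g. bright cloud over darker ground), the returned threshold falls between the two modes.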
Mapping AIS coverage for trusted surveillance
NASA Astrophysics Data System (ADS)
Lapinski, Anna-Liesa S.; Isenor, Anthony W.
2010-10-01
Automatic Identification System (AIS) is an unattended vessel reporting system developed for collision avoidance. Shipboard AIS equipment automatically broadcasts vessel positional data at regular intervals. The real-time position and identity data from a vessel are received by other vessels in the area, thereby assisting with local navigation. AIS broadcasts are also beneficial to those concerned with coastal and harbour security. Land-based AIS receiving stations can also collect the AIS broadcasts. However, reception at the land station is dependent upon the ship's position relative to the receiving station. For AIS to be used as a trusted surveillance system, the characteristics of the AIS coverage area in the vicinity of the station (or stations) should be understood. This paper presents some results of a method being investigated at DRDC Atlantic (Canada) to map the AIS coverage characteristics of a dynamic AIS reception network. The method is shown to clearly distinguish AIS reception edges from those edges caused by vessel traffic patterns. The method can also be used to identify temporal changes in the coverage area, an important characteristic for local maritime security surveillance activities. Future research using the coverage estimate technique is also proposed to support surveillance activities.
Chan, Tao
2012-01-01
CT has become an established method for calculating body composition, but it requires data from the whole body, which are not typically obtained in routine PET/CT examinations. A computerized scheme that evaluates whole-body lean body mass (LBM) based on CT data from limited-whole-body coverage was developed. The LBM so obtained was compared with results from conventional predictive equations. LBM can be obtained automatically from limited-whole-body CT data by 3 means: quantification of body composition from CT images in the limited-whole-body scan, based on thresholding of CT attenuation; determination of the range of coverage based on a characteristic trend of changing composition across different levels and pattern recognition of specific features at strategic positions; and estimation of the LBM of the whole body on the basis of a predetermined relationship between proportion of fat mass and extent of coverage. This scheme was validated using 18 whole-body PET/CT examinations truncated at different lengths to emulate limited-whole-body data. LBM was also calculated using predictive equations that had been reported for use in SUV normalization. LBM derived from limited-whole-body data using the proposed method correlated strongly with LBM derived from whole-body CT data, with correlation coefficients ranging from 0.991 (shorter coverage) to 0.998 (longer coverage) and SEMs of LBM ranging from 0.14 to 0.33 kg. These were more accurate than results from different predictive equations, which ranged in correlation coefficient from 0.635 to 0.970 and in SEM from 0.64 to 2.40 kg. LBM of the whole body could be automatically estimated from CT data of limited-whole-body coverage typically acquired in PET/CT examinations. This estimation allows more accurate and consistent quantification of metabolic activity of tumors based on LBM-normalized standardized uptake value.
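The first step of the scheme described above, thresholding CT attenuation into tissue compartments, can be sketched as follows. The Hounsfield-unit windows below are illustrative assumptions; the abstract does not give the paper's actual thresholds.

```python
import numpy as np

# Illustrative attenuation windows in Hounsfield units (assumed values,
# not taken from the paper).
FAT_HU = (-190.0, -30.0)
LEAN_HU = (-30.0, 150.0)

def composition_masses(hu_slice, voxel_volume_ml, density_g_per_ml=1.0):
    """Split voxels of a CT slice into fat vs lean compartments by HU window
    and return the mass of each compartment (single-density approximation).
    Air (very negative HU) and bone (high HU) fall outside both windows."""
    fat_vox = np.count_nonzero((hu_slice >= FAT_HU[0]) & (hu_slice < FAT_HU[1]))
    lean_vox = np.count_nonzero((hu_slice >= LEAN_HU[0]) & (hu_slice <= LEAN_HU[1]))
    to_g = voxel_volume_ml * density_g_per_ml
    return fat_vox * to_g, lean_vox * to_g
```

The paper's further steps, recognizing the extent of coverage and extrapolating fat mass to the whole body, would build on per-slice results like these.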
Bernheim, Susannah M.; Wang, Yongfei; Bradley, Elizabeth H.; Masoudi, Frederick A.; Rathore, Saif S.; Ross, Joseph S.; Drye, Elizabeth; Krumholz, Harlan M.
2012-01-01
Background The Centers for Medicare and Medicaid Services (CMS) provides public reporting on the quality of hospital care for patients with acute myocardial infarction (AMI). CMS Core Measures allow discretion in excluding patients because of relative contraindications to aspirin, beta-blockers and angiotensin converting enzyme inhibitors. We describe trends in the proportion of AMI patients with contraindications that could lead to discretionary exclusion from public reporting. Methods We completed cross-sectional analyses of three nationally-representative data cohorts of AMI admissions among Medicare patients in 1994–5 (n=170,928), 1998–9 (n=27,432), and 2000–2001 (n=27,300) from the national Medicare quality improvement projects. Patients were categorized as ineligible (e.g. transfer patients), automatically excluded (specified absolute medical contraindications), discretionarily excluded (potentially excluded based on relative contraindications), or ‘ideal’ for treatment for each measure. Results For 4 of 5 measures the percentage of discretionarily excluded patients increased over the three time periods (admission aspirin 15.8% to 16.9% and admission beta-blocker 14.3% to 18.3%, discharge aspirin 10.3% to 12.3%, and ACE-I 2.8% to 3.9%, p<.001). Of patients potentially included in measures (those who were not ineligible or automatically excluded), the discretionarily excluded represented 25.5 % to 69.2% in 2000–01. Treatment rates among patients with discretionary exclusions also increased for 4 of 5 measures (all except ACE-I). Conclusions A sizeable and growing proportion of AMI patients have relative contraindications to treatments that may result in discretionary exclusion from publicly-reported quality measures. These patients represent a large population for which there is insufficient evidence as to whether measure exclusion or inclusion and treatment represents best care. PMID:21095284
Automatic cloud tracking applied to GOES and Meteosat observations
NASA Technical Reports Server (NTRS)
Endlich, R. M.; Wolf, D. E.
1981-01-01
An improved automatic processing method for the tracking of cloud motions as revealed by satellite imagery is presented, along with applications of the method to GOES observations of Hurricane Eloise and to Meteosat water vapor and infrared data. The method involves picture smoothing, target selection, and the calculation of cloud motion vectors by matching a group at a given time with its best likeness at a later time, or by a cross-correlation computation. Cloud motion computations can be made in as many as four separate layers simultaneously. For data of 4 and 8 km resolution in the eye of Hurricane Eloise, the automatic system is found to provide results comparable in accuracy and coverage to those obtained by NASA analysts using the Atmospheric and Oceanographic Information Processing System, with results obtained by the pattern recognition and cross-correlation computations differing by only fractions of a pixel. For Meteosat water vapor data from the tropics and midlatitudes, the automatic motion computations are found to be reliable only in areas where the water vapor fields contain small-scale structure, although excellent results are obtained using Meteosat IR data in the same regions. The automatic method thus appears to be competitive in accuracy and coverage with motion determination by human analysts.
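The cross-correlation variant of the matching step can be sketched as an exhaustive search for the displacement that maximizes the correlation of a target group between two frames (a toy version for illustration, not the operational system):

```python
import numpy as np

def track_offset(target, next_frame, max_shift=5):
    """Return the (dy, dx) displacement of `target` (an h x w patch cut from
    the earlier image) within `next_frame` (a patch padded by `max_shift`
    pixels on every side), by maximizing the correlation coefficient."""
    h, w = target.shape
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cand = next_frame[max_shift + dy:max_shift + dy + h,
                              max_shift + dx:max_shift + dx + w]
            score = np.corrcoef(target.ravel(), cand.ravel())[0, 1]
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift
```

Dividing the recovered pixel displacement by the inter-frame interval and the ground sampling distance gives the cloud motion vector; sub-pixel accuracy, as claimed in the abstract, would require interpolating the correlation surface around its peak.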
GFam: a platform for automatic annotation of gene families.
Sasidharan, Rajkumar; Nepusz, Tamás; Swarbreck, David; Huala, Eva; Paccanaro, Alberto
2012-10-01
We have developed GFam, a platform for automatic annotation of gene/protein families. GFam provides a framework for genome initiatives and model organism resources to build domain-based families, derive meaningful functional labels, and offers a seamless approach to propagate functional annotation across periodic genome updates. GFam is a hybrid approach that uses a greedy algorithm to chain component domains from InterPro annotation provided by its 12 member resources, followed by a sequence-based connected component analysis of un-annotated sequence regions, to derive a consensus domain architecture for each sequence and subsequently generate families based on common architectures. For the Arabidopsis proteome, our integrated approach increases sequence coverage by 7.2 percentage points and residue coverage by 14.6 percentage points over the best single-constituent database within InterPro. The true power of GFam lies in maximizing annotation provided by the different InterPro data sources that offer resource-specific coverage for different regions of a sequence. GFam's capability to capture higher sequence and residue coverage can be useful for genome annotation, comparative genomics and functional studies. GFam is a general-purpose software and can be used for any collection of protein sequences. The software is open source and can be obtained from http://www.paccanarolab.org/software/gfam/.
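The greedy domain-chaining step can be illustrated with a small sketch (a deliberate simplification of GFam's actual algorithm): take hits in decreasing score order and keep each one only if it does not overlap an already-kept hit.

```python
def chain_domains(hits):
    """hits: list of (start, end, name, score) domain matches on one sequence.
    Greedily keep the highest-scoring, mutually non-overlapping hits and
    return them in sequence order -- the chained 'domain architecture'."""
    chosen = []
    for start, end, name, score in sorted(hits, key=lambda h: -h[3]):
        # keep this hit only if it is disjoint from every hit kept so far
        if all(end < s or start > e for s, e, _, _ in chosen):
            chosen.append((start, end, name, score))
    return sorted(chosen)

# Illustrative hits on a hypothetical sequence (made-up coordinates/scores)
hits = [(1, 100, "PF00069", 50.0),    # kinase-like domain
        (90, 150, "PF07714", 30.0),   # overlaps the kinase hit; lower score
        (160, 250, "PF00017", 40.0)]  # disjoint SH2-like domain
architecture = [name for _, _, name, _ in chain_domains(hits)]
```

Sequences sharing the same chained architecture would then be grouped into one family, as the abstract describes.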
ERIC Educational Resources Information Center
Wilson-Simmons, Renée; Dworsky, Amy; Tongue, Denzel; Hulbutta, Marikate
2016-01-01
The Affordable Care Act includes language that requires states to provide Medicaid coverage to youth who were in foster care in their state before aging out of the child welfare system. However, most states have interpreted the law differently for youth who move to their state after aging out, determining that automatic Medicaid coverage is an…
The availability and marginal costs of dependent employer-sponsored health insurance.
Miller, G Edward; Vistnes, Jessica; Buettgens, Matthew; Dubay, Lisa
2017-01-21
In this study, we examine differences by firm size in the availability of dependent coverage and the incremental cost of such coverage. We use data from the Medical Expenditure Panel Survey - Insurance Component (MEPS-IC) to show that among employees eligible for single coverage, dependent coverage was almost always available for employees in large firms (100 or more employees) but not in smaller firms, particularly those with fewer than 10 employees. In addition, when dependent coverage was available, eligible employees in smaller firms were more likely than employees in large firms to face two situations that represented the extremes of the incremental cost distribution: (1) they paid nothing for single or family coverage or (2) they paid nothing for single coverage but faced a high contribution for family coverage. These results suggest that firm size may be an important factor in policy assessments, such as analyses of the financial implications for families excluded from subsidized Marketplace coverage due to affordable offers of single coverage or of potential rollbacks to public coverage for children.
Development and evaluation of automatic landing control laws for power lift STOL aircraft
NASA Technical Reports Server (NTRS)
Feinreich, B.; Gevaert, G.
1981-01-01
A series of investigations was conducted to generate and verify, through ground-based simulation and flight research, a data base to aid in the design and certification of advanced propulsive-lift short takeoff and landing (STOL) aircraft. Problems impacting the design of powered-lift short-haul aircraft that are to be landed automatically on STOL runways in adverse weather were examined. An understanding of the problems was gained by limited coverage of the important elements that are normally included in the certification process of a CAT 3 automatic landing system.
Paproki, A; Engstrom, C; Chandra, S S; Neubert, A; Fripp, J; Crozier, S
2014-09-01
To validate an automatic scheme for the segmentation and quantitative analysis of the medial meniscus (MM) and lateral meniscus (LM) in magnetic resonance (MR) images of the knee. We analysed sagittal water-excited double-echo steady-state MR images of the knee from a subset of the Osteoarthritis Initiative (OAI) cohort. The MM and LM were automatically segmented in the MR images based on a deformable model approach. Quantitative parameters including volume, subluxation and tibial coverage were automatically calculated for comparison (Wilcoxon tests) between knees with variable radiographic osteoarthritis (rOA), medial and lateral joint space narrowing (mJSN, lJSN) and pain. Automatic segmentations and estimated parameters were evaluated for accuracy using manual delineations of the menisci in 88 pathological knee MR examinations at baseline and 12-month time-points. The median (95% confidence interval (CI)) Dice similarity index (DSI), defined as 2 × |Auto ∩ Manual| / (|Auto| + |Manual|) × 100, between manual and automated segmentations of the MM and LM volumes was 78.3% (75.0-78.7) and 83.9% (82.1-83.9) at baseline, and 75.3% (72.8-76.9) and 83.0% (81.6-83.5) at 12 months. Pearson coefficients between automatic and manual segmentation parameters ranged from r = 0.70 to r = 0.92. The MM in rOA/mJSN knees had significantly greater subluxation and smaller tibial coverage than in no-rOA/no-mJSN knees. The LM in rOA knees had significantly greater volumes and tibial coverage than in no-rOA knees. Our automated method successfully segmented the menisci in normal and osteoarthritic knee MR images and detected meaningful morphological differences with respect to rOA and joint space narrowing (JSN). Our approach will facilitate analyses of the menisci in prospective MR cohorts such as the OAI for investigations into pathophysiological changes occurring in early osteoarthritis (OA) development. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
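The Dice similarity index used for evaluation can be computed directly from binary segmentation masks; the two small toy masks below are invented for illustration:

```python
import numpy as np

def dice(auto_mask, manual_mask):
    """Dice similarity index: 2 * |Auto ∩ Manual| / (|Auto| + |Manual|) * 100."""
    auto_mask = np.asarray(auto_mask, dtype=bool)
    manual_mask = np.asarray(manual_mask, dtype=bool)
    inter = np.logical_and(auto_mask, manual_mask).sum()
    total = auto_mask.sum() + manual_mask.sum()
    return 200.0 * inter / total if total else 100.0

# Two overlapping 4x4 squares: 16 pixels each, 9 in common.
auto = np.zeros((10, 10), bool);   auto[2:6, 2:6] = True
manual = np.zeros((10, 10), bool); manual[3:7, 3:7] = True
score = dice(auto, manual)   # 2*9/(16+16)*100 = 56.25
```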
AFETR Instrumentation Handbook
1971-09-01
of time. From this, vehicle velocity and acceleration can be computed. LOCATION Three Askanias are mobile and may be located at selected universal... Being mobile, these cinetheodolites may be placed for optimum launch coverage. Preprogrammed focusing is provided for automatic focus from 2000 and 8000... console trailer. IR (lead sulfide sensor) Automatic Tracking System with 1 to 20 miles range. Elevation range: -10 deg to +90 deg. Azimuth range: 350
Change-Based Satellite Monitoring Using Broad Coverage and Targetable Sensing
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Tran, Daniel Q.; Doubleday, Joshua R.; Doggett, Thomas
2013-01-01
A generic software framework analyzes data from broad coverage sweeps or general larger areas of interest. Change detection methods are used to extract subsets of directed swath areas that intersect areas of change. These areas are prioritized and allocated to targetable assets. This method is deployed in an automatic fashion, and has operated without human monitoring or intervention for sustained periods of time (months).
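A minimal sketch of the detect-prioritize-allocate flow: the rectangular swath candidates, change threshold and scoring below are assumptions for illustration, not the deployed framework's logic.

```python
import numpy as np

def prioritize_swaths(before, after, swaths, threshold, n_assets):
    """Rank candidate swath rectangles (r0, r1, c0, c1) by the amount of
    detected change and return the top-n for tasking targetable assets."""
    change = np.abs(after - before) > threshold   # simple change detection
    scored = [(change[r0:r1, c0:c1].sum(), (r0, r1, c0, c1))
              for (r0, r1, c0, c1) in swaths]
    scored.sort(key=lambda x: -x[0])
    # Allocate only swaths that actually intersect an area of change.
    return [box for score, box in scored[:n_assets] if score > 0]

before = np.zeros((20, 20))
after = before.copy()
after[5:8, 5:8] = 1.0                      # change concentrated in one area
swaths = [(0, 10, 0, 10), (10, 20, 10, 20)]
tasks = prioritize_swaths(before, after, swaths, 0.5, n_assets=1)
```

Only the first swath intersects the changed region, so it alone is allocated to the single available asset.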
Automatically processed alpha-track radon monitor
Langner, Jr., G. Harold
1993-01-01
An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided.
Hoffman, Steven J; Justicz, Victoria
2016-07-01
To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them to the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared to human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not overly sensationalized. Coverage slightly improved between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can at the very least identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.
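A maximum entropy classifier over word features is equivalent to logistic regression. The toy binary classifier below sketches the idea with invented training snippets and labels; it is not the study's actual model, features or data.

```python
import math

def train_maxent(docs, labels, epochs=200, lr=0.5):
    """Tiny binary maximum-entropy (logistic) classifier over bag-of-words
    features, trained by stochastic gradient ascent on the log-likelihood."""
    vocab = sorted({w for d in docs for w in d.split()})
    idx = {w: i for i, w in enumerate(vocab)}
    w = [0.0] * len(vocab)
    b = 0.0
    feats = [[idx[t] for t in d.split()] for d in docs]
    for _ in range(epochs):
        for f, y in zip(feats, labels):
            z = b + sum(w[i] for i in f)
            p = 1.0 / (1.0 + math.exp(-z))   # P(label = 1 | doc)
            g = y - p                         # gradient of log-likelihood
            b += lr * g
            for i in f:
                w[i] += lr * g
    return vocab, idx, w, b

def predict(model, doc):
    vocab, idx, w, b = model
    z = b + sum(w[idx[t]] for t in doc.split() if t in idx)
    return 1 if z > 0 else 0

# Invented snippets: 1 = high scientific quality, 0 = sensationalizing.
docs = ["rigorous peer reviewed evidence study",
        "shocking outbreak panic deadly fear",
        "careful evidence data study",
        "panic deadly shocking fear outbreak"]
labels = [1, 0, 1, 0]
model = train_maxent(docs, labels)
```

After training, word weights separate the two registers, so unseen combinations of the same vocabulary classify as expected.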
Bernheim, Susannah M; Wang, Yongfei; Bradley, Elizabeth H; Masoudi, Frederick A; Rathore, Saif S; Ross, Joseph S; Drye, Elizabeth; Krumholz, Harlan M
2010-11-01
The Centers for Medicare and Medicaid Services provides public reporting on the quality of hospital care for patients with acute myocardial infarction (AMI). The Centers for Medicare and Medicaid Services Core Measures allow discretion in excluding patients because of relative contraindications to aspirin, β-blockers, and angiotensin-converting enzyme inhibitors. We describe trends in the proportion of patients with AMI with contraindications that could lead to discretionary exclusion from public reporting. We completed cross-sectional analyses of 3 nationally representative data cohorts of AMI admissions among Medicare patients in 1994-1995 (n = 170,928), 1998-1999 (n = 27,432), and 2000-2001 (n = 27,300) from the national Medicare quality improvement projects. Patients were categorized as ineligible (eg, transfer patients), automatically excluded (specified absolute medical contraindications), discretionarily excluded (potentially excluded based on relative contraindications), or "ideal" for treatment for each measure. For 4 of 5 measures, the percentage of discretionarily excluded patients increased over the 3 periods (admission aspirin 15.8% to 16.9%, admission β-blocker 14.3% to 18.3%, discharge aspirin 10.3% to 12.3%, and angiotensin-converting enzyme inhibitors 2.8% to 3.9%; P < .001). Of patients potentially included in measures (those who were not ineligible or automatically excluded), the discretionarily excluded represented 25.5% to 69.2% in 2000-2001. Treatment rates among patients with discretionary exclusions also increased for 4 of 5 measures (all except angiotensin-converting enzyme inhibitors). A sizeable and growing proportion of patients with AMI have relative contraindications to treatments that may result in discretionary exclusion from publicly reported quality measures. These patients represent a large population for which there is insufficient evidence as to whether measure exclusion or inclusion and treatment represents best care. 
Copyright © 2010 Mosby, Inc. All rights reserved.
42 CFR 411.15 - Particular services excluded from coverage.
Code of Federal Regulations, 2014 CFR
2014-10-01
... magnification of images for impaired vision. (2) Exceptions. (i) Post-surgical prosthetic lenses customarily... condition and clinical status. (j) Personal comfort services, except as necessary for the palliation or...
42 CFR 411.15 - Particular services excluded from coverage.
Code of Federal Regulations, 2013 CFR
2013-10-01
... magnification of images for impaired vision. (2) Exceptions. (i) Post-surgical prosthetic lenses customarily... condition and clinical status. (j) Personal comfort services, except as necessary for the palliation or...
42 CFR 411.15 - Particular services excluded from coverage.
Code of Federal Regulations, 2012 CFR
2012-10-01
... magnification of images for impaired vision. (2) Exceptions. (i) Post-surgical prosthetic lenses customarily... condition and clinical status. (j) Personal comfort services, except as necessary for the palliation or...
42 CFR 411.15 - Particular services excluded from coverage.
Code of Federal Regulations, 2011 CFR
2011-10-01
... magnification of images for impaired vision. (2) Exceptions. (i) Post-surgical prosthetic lenses customarily... condition and clinical status. (j) Personal comfort services, except as necessary for the palliation or...
Statewide Cellular Coverage Map
DOT National Transportation Integrated Search
2002-02-01
The role of wireless communications in transportation is becoming increasingly important. Wireless communications are critical for many applications of Intelligent Transportation Systems (ITS) such as Automatic Vehicle Location (AVL) and Automated Co...
Gowda, Charitha; Dong, Shiming; Potter, Rachel C; Dombkowski, Kevin J; Stokley, Shannon; Dempsey, Amanda F
2013-01-01
Immunization information systems (IISs) are valuable surveillance tools; however, population relocation may introduce bias when determining immunization coverage. We explored alternative methods for estimating the vaccine-eligible population when calculating adolescent immunization levels using a statewide IIS. We performed a retrospective analysis of the Michigan State Care Improvement Registry (MCIR) for all adolescents aged 11-18 years registered in the MCIR as of October 2010. We explored four methods for determining denominators: (1) including all adolescents with MCIR records, (2) excluding adolescents with out-of-state residence, (3) further excluding those without MCIR activity ≥ 10 years prior to the evaluation date, and (4) using a denominator based on U.S. Census data. We estimated state- and county-specific coverage levels for four adolescent vaccines. We found a 20% difference in estimated vaccination coverage between the most inclusive and restrictive denominator populations. Although there was some variability among the four methods in vaccination at the state level (2%-11%), greater variation occurred at the county level (up to 21%). This variation was substantial enough to potentially impact public health assessments of immunization programs. Generally, vaccines with higher coverage levels had greater absolute variation, as did counties with smaller populations. At the county level, using the four denominator calculation methods resulted in substantial differences in estimated adolescent immunization rates that were less apparent when aggregated at the state level. Further research is needed to ascertain the most appropriate method for estimating vaccine coverage levels using IIS data.
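The effect of denominator choice on estimated coverage can be sketched with a toy registry; all records and the Census count below are invented for illustration:

```python
def coverage(vaccinated, denominator):
    """Estimated coverage (%) = vaccinated / denominator * 100."""
    return 100.0 * vaccinated / denominator

# Toy IIS extract mirroring the four denominator methods in the abstract.
records = [
    {"id": 1, "vaccinated": True,  "in_state": True,  "recent_activity": True},
    {"id": 2, "vaccinated": False, "in_state": True,  "recent_activity": True},
    {"id": 3, "vaccinated": False, "in_state": False, "recent_activity": True},
    {"id": 4, "vaccinated": False, "in_state": True,  "recent_activity": False},
]
n_vacc = sum(r["vaccinated"] for r in records)
method1 = coverage(n_vacc, len(records))                          # all records
method2 = coverage(n_vacc, sum(r["in_state"] for r in records))   # drop out-of-state
method3 = coverage(n_vacc, sum(r["in_state"] and r["recent_activity"]
                               for r in records))                 # also drop inactive
census_count = 2                                                  # hypothetical Census denominator
method4 = coverage(n_vacc, census_count)
```

Even in this four-record toy, the estimate swings from 25% to 50% purely from the denominator definition, which is the bias the study quantifies at state and county level.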
Excluding Abortion Coverage from Health Reform Act
Sen. Coburn, Tom [R-OK
2010-08-05
Senate - 08/05/2010 Read twice and referred to the Committee on Health, Education, Labor, and Pensions.
Encaoua, J; Abgral, R; Leleu, C; El Kabbaj, O; Caradec, P; Bourhis, D; Pradier, O; Schick, U
2017-06-01
To study the impact on radiotherapy planning of automatically segmented target volume delineation based on (18F)-fluorodeoxy-D-glucose (FDG) hybrid positron emission tomography-computed tomography (PET-CT), compared to manual delineation based on computed tomography (CT), in oesophageal carcinoma patients. Fifty-eight patients diagnosed with oesophageal cancer between September 2009 and November 2014 were included. The majority had squamous cell carcinoma (84.5%) and advanced-stage disease (37.9% were stage IIIA), and 44.8% had a middle oesophageal lesion. Gross tumour volumes were retrospectively defined either manually on CT or automatically on coregistered PET/CT images using three different threshold methods: standardized uptake value (SUV) of 2.5, 40% of maximum intensity, and signal-to-background ratio (SBR). Target volumes were compared in length, volume and index of conformality. Radiotherapy plans to doses of 50 Gy and 66 Gy using intensity-modulated radiotherapy were generated and compared for both data sets. Planning target volume (PTV) coverage and doses delivered to organs at risk (heart, lung and spinal cord) were compared. The gross tumour volume defined manually on CT was significantly longer than that defined automatically with the signal-to-background ratio (6.4 cm versus 5.3 cm; P < 0.008). Doses to the lungs (V20, Dmean), heart (V40) and spinal cord (Dmax) were significantly lower on plans using the SBR-based PTV. The SBR-based PTV coverage was statistically better than the CT-based PTV coverage on both plans (50 Gy: P < 0.0004 and 66 Gy: P < 0.0006). The automatic PET segmentation algorithm based on the signal-to-background ratio method for the delineation of oesophageal tumours is of interest, resulting in better target volume coverage and decreased dose to organs at risk. This may allow dose escalation up to 66 Gy to the gross tumour volume. Copyright © 2017 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
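The three thresholding rules can be sketched on a synthetic SUV volume. The SBR-derived rule below is an illustrative adaptive formula (background plus a fraction of the contrast), not necessarily the exact equation used in the study:

```python
import numpy as np

def segment_pet(suv, method, background=None):
    """Toy auto-segmentation of a PET SUV array by three common thresholds:
    fixed SUV 2.5, 40% of maximum intensity, or an SBR-derived threshold."""
    if method == "suv2.5":
        thr = 2.5
    elif method == "40pct_max":
        thr = 0.4 * suv.max()
    elif method == "sbr":
        # Illustrative adaptive rule: background + 40% of the tumour contrast.
        thr = background + 0.4 * (suv.max() - background)
    else:
        raise ValueError(method)
    return suv >= thr

suv = np.full((5, 5), 1.0)      # background uptake
suv[1:4, 1:4] = 8.0             # hot tumour region (9 voxels)
mask_fixed = segment_pet(suv, "suv2.5")
mask_40 = segment_pet(suv, "40pct_max")             # threshold = 3.2
mask_sbr = segment_pet(suv, "sbr", background=1.0)  # threshold = 3.8
```

On this clean synthetic volume all three rules recover the same 3x3 tumour; on real, noisy uptake distributions they diverge, which is why the delineated lengths and plans differ.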
Exploiting the systematic review protocol for classification of medical abstracts.
Frunza, Oana; Inkpen, Diana; Matwin, Stan; Klement, William; O'Blenis, Peter
2011-01-01
To determine whether the automatic classification of documents can be useful in systematic reviews on medical topics, and specifically if the performance of the automatic classification can be enhanced by using the particular protocol of questions employed by the human reviewers to create multiple classifiers. The test collection is the data used in large-scale systematic review on the topic of the dissemination strategy of health care services for elderly people. From a group of 47,274 abstracts marked by human reviewers to be included in or excluded from further screening, we randomly selected 20,000 as a training set, with the remaining 27,274 becoming a separate test set. As a machine learning algorithm we used complement naïve Bayes. We tested both a global classification method, where a single classifier is trained on instances of abstracts and their classification (i.e., included or excluded), and a novel per-question classification method that trains multiple classifiers for each abstract, exploiting the specific protocol (questions) of the systematic review. For the per-question method we tested four ways of combining the results of the classifiers trained for the individual questions. As evaluation measures, we calculated precision and recall for several settings of the two methods. It is most important not to exclude any relevant documents (i.e., to attain high recall for the class of interest) but also desirable to exclude most of the non-relevant documents (i.e., to attain high precision on the class of interest) in order to reduce human workload. For the global method, the highest recall was 67.8% and the highest precision was 37.9%. For the per-question method, the highest recall was 99.2%, and the highest precision was 63%. The human-machine workflow proposed in this paper achieved a recall value of 99.6%, and a precision value of 17.8%. 
The per-question method that combines classifiers following the specific protocol of the review leads to better results than the global method in terms of recall. Because neither method is efficient enough to classify abstracts reliably by itself, the technology should be applied in a semi-automatic way, with a human expert still involved. When the workflow includes one human expert and the trained automatic classifier, recall improves to an acceptable level, showing that automatic classification techniques can reduce the human workload in the process of building a systematic review. Copyright © 2010 Elsevier B.V. All rights reserved.
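One simple way to realize a recall-oriented per-question combination is to include an abstract whenever any question-specific classifier votes for inclusion. The classifier outputs and gold labels below are invented for illustration; they are not the review's data.

```python
def combine_or(per_question_predictions):
    """Include an abstract if ANY per-question classifier votes 'include' --
    a recall-oriented way of combining protocol-specific classifiers."""
    return [int(any(votes)) for votes in zip(*per_question_predictions)]

def precision_recall(pred, truth):
    """Precision and recall for the 'include' class."""
    tp = sum(p and t for p, t in zip(pred, truth))
    prec = tp / sum(pred) if sum(pred) else 0.0
    rec = tp / sum(truth) if sum(truth) else 0.0
    return prec, rec

# Three hypothetical per-question classifiers over five abstracts (1 = include).
q1 = [1, 0, 0, 1, 0]
q2 = [0, 1, 0, 1, 0]
q3 = [0, 0, 0, 1, 0]
combined = combine_or([q1, q2, q3])
truth = [1, 1, 0, 1, 1]
prec, rec = precision_recall(combined, truth)
```

The OR rule trades precision for recall, matching the paper's priority of not excluding relevant documents; a human reviewer then screens the (smaller) included set.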
Kahende, Jennifer; Malarcher, Ann; England, Lucinda; Zhang, Lei; Mowery, Paul; Xu, Xin; Sevilimedu, Varadan; Rolle, Italia
2017-01-01
To assess state coverage and utilization of Medicaid smoking cessation medication benefits among fee-for-service enrollees who smoked cigarettes. We used the linked National Health Interview Survey (survey years 1995, 1997-2005) and the Medicaid Analytic eXtract files (1999-2008) to assess utilization of smoking cessation medication benefits among 5,982 cigarette smokers aged 18-64 years enrolled in Medicaid fee-for-service whose state Medicaid insurance covered at least one cessation medication. We excluded visits during pregnancy, and those covered by managed care or under dual enrollment (Medicaid and Medicare). Multivariate logistic regression was used to determine correlates of cessation medication benefit utilization among Medicaid fee-for-service enrollees, including measures of drug coverage (comprehensive cessation medication coverage, number of medications in state benefit, varenicline coverage), individual-level demographics at NHIS interview, age at Medicaid enrollment, and state-level cigarette excise taxes, statewide smoke-free laws, and per-capita tobacco control funding. In 1999, the percent of smokers with ≥1 medication claims was 5.7% in the 30 states that covered at least one Food and Drug Administration (FDA)-approved cessation medication; this increased to 9.9% in 2008 in the 44 states that covered at least one FDA-approved medication (p<0.01). Cessation medication utilization was greater among older individuals (≥ 25 years), females, non-Hispanic whites, and those with higher educational attainment. Comprehensive coverage, the number of smoking cessation medications covered and varenicline coverage were all positively associated with utilization; cigarette excise tax and per-capita tobacco control funding were also positively associated with utilization. Utilization of medication benefits among fee-for-service Medicaid enrollees increased from 1999-2008 and varied by individual and state-level characteristics. 
Given that the Affordable Care Act bars state Medicaid programs from excluding any FDA-approved cessation medications from coverage as of January 2014, monitoring Medicaid cessation medication claims may be beneficial for informing efforts to increase utilization and maximize smoking cessation.
Bartlett, D L; Ezzati-Rice, T M; Stokley, S; Zhao, Z
2001-05-01
The National Immunization Survey (NIS) and the National Health Interview Survey (NHIS) produce national coverage estimates for children aged 19 months to 35 months. The NIS is a cost-effective, random-digit-dialing telephone survey that produces national and state-level vaccination coverage estimates. The National Immunization Provider Record Check Study (NIPRCS) is conducted in conjunction with the annual NHIS, which is a face-to-face household survey. As the NIS is a telephone survey, potential coverage bias exists as the survey excludes children living in nontelephone households. To assess the validity of estimates of vaccine coverage from the NIS, we compared 1995 and 1996 NIS national estimates with results from the NHIS/NIPRCS for the same years. Both the NIS and the NHIS/NIPRCS produce similar results. The NHIS/NIPRCS supports the findings of the NIS.
Mathauer, Inke; Behrendt, Thorsten
2017-02-16
Contributory social health insurance for formal sector employees only has proven challenging for moving towards universal health coverage (UHC). This is because the informally employed and the poor usually remain excluded. One way to expand UHC is to fully or partially subsidize health insurance contributions for excluded population groups through government budget transfers. This paper analyses the institutional design features of such government subsidization arrangements in Latin America and assesses their performance with respect to UHC progress. The aim is to identify UHC conducive institutional design features of such arrangements. A literature search provided the information to analyse institutional design features, with a focus on the following aspects: eligibility/enrolment rules, financing and pooling arrangements, and purchasing and benefit package design. Based on secondary data analysis, UHC progress is assessed in terms of improved population coverage, financial protection and access to needed health care services. Such government subsidization arrangements currently exist in eight countries of Latin America (Bolivia, Chile, Colombia, Costa Rica, Dominican Republic, Mexico, Peru, Uruguay). Institutional design features and UHC related performance vary significantly. Notably, countries with a universalist approach or indirect targeting have higher population coverage rates. Separate pools for the subsidized maintain inequitable access. The relatively large scopes of the benefit packages had a positive impact on financial protection and access to care. In the long term, merging different schemes into one integrated health financing system without opt-out options for the better-off is desirable, while equally expanding eligibility to cover those so far excluded. In the short and medium term, the harmonization of benefit packages could be a priority. 
UHC progress also depends on substantial supply side investments to ensure the availability of quality services, particularly in rural areas. Future research should generate more evidence on the implementation process and impact of subsidization arrangements on UHC progress.
J-Adaptive estimation with estimated noise statistics. [for orbit determination
NASA Technical Reports Server (NTRS)
Jazwinski, A. H.; Hipkins, C.
1975-01-01
The J-Adaptive estimator described by Jazwinski and Hipkins (1972) is extended to include the simultaneous estimation of the statistics of the unmodeled system accelerations. With the aid of simulations it is demonstrated that the J-Adaptive estimator with estimated noise statistics can automatically estimate satellite orbits to an accuracy comparable with the data noise levels, when excellent, continuous tracking coverage is available. Such tracking coverage will be available from satellite-to-satellite tracking.
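A heavily simplified analogue of estimating noise statistics alongside the state is a 1-D Kalman filter with an innovation-based process-noise update. This is a generic textbook-style heuristic (Mehra-type), not the J-Adaptive algorithm itself, and all numbers are invented:

```python
def adaptive_kalman_1d(measurements, r, q0=1.0, alpha=0.1):
    """1-D random-walk Kalman filter that also adapts its process-noise
    variance q from the innovation sequence: when squared innovations
    exceed their predicted variance s, q is inflated toward the excess."""
    x = measurements[0]   # state estimate
    p = r                 # estimate variance
    q = q0                # adaptive process-noise variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                    # predict
        innov = z - x                # innovation
        s = p + r                    # predicted innovation variance
        k = p / s                    # Kalman gain
        x = x + k * innov            # measurement update
        p = (1.0 - k) * p
        # Innovation-based adaptation of the process-noise estimate.
        q = max(1e-12, (1 - alpha) * q + alpha * max(0.0, innov ** 2 - s))
        estimates.append(x)
    return estimates, q

# Hypothetical range-like measurements of a nearly constant quantity.
estimates, q_final = adaptive_kalman_1d([4.0, 6.0, 5.0, 5.0, 5.0, 5.0], r=1.0)
```

Each update is a convex combination of the prior estimate and the measurement, so the estimates stay bracketed by the data while q tracks how much unmodeled variation the innovations reveal.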
Gomez, G; Stanford, F C
2018-03-01
Obesity is now the most prevalent chronic disease in the United States, accounting for an estimated $147 billion in health care spending annually. The Affordable Care Act (ACA), enacted in 2010, included provisions for private and public health insurance plans that expanded coverage of lifestyle/behavior modification and bariatric surgery for the treatment of obesity. Pharmacotherapy, however, has not been included, despite its evidence-based efficacy. We set out to investigate the coverage of Food and Drug Administration-approved medications for obesity within Medicare, Medicaid and ACA-established marketplace health insurance plans. We examined coverage for phentermine, diethylpropion, phendimetrazine, benzphetamine, lorcaserin, phentermine/topiramate (Qsymia), liraglutide (Saxenda) and bupropion/naltrexone (Contrave) among Medicare, Medicaid and marketplace insurance plans in 34 states. Among 136 marketplace health insurance plans, 11% had some coverage for the specified drugs, in only nine states. Medicare policy strictly excludes drug therapy for obesity. Only seven state Medicaid programs have drug coverage. Obesity requires an integrated approach to combat its public health threat. Broader coverage of pharmacotherapy could make a significant contribution to fighting this complex and chronic disease.
Olfactory-Induced Synesthesias: A Review and Model
ERIC Educational Resources Information Center
Stevenson, Richard J.; Tomiczek, Caroline
2007-01-01
Recent reviews of synesthesia concentrate upon rare neurodevelopmental examples and exclude common olfactory-induced experiences with which they may profitably be compared. Like the neurodevelopmental synesthesias, odor-induced experiences involve different sensory modalities; are reliable, asymmetric (concurrents cannot induce), and automatic;…
Code of Federal Regulations, 2010 CFR
2010-01-01
...) The Government Printing Office. (d) Agencies excluded. This part does not apply to: (1) A Government corporation; (2) The Central Intelligence Agency; (3) The Defense Intelligence Agency; (4) The National... principal function of which is the conduct of foreign intelligence or counterintelligence activities; (6...
Fungal mycelia in soils - a new method for quantification of their biomass
NASA Astrophysics Data System (ADS)
Drabløs Eldhuset, Toril; Lange, Holger; Svetlik, Jan; Børja, Isabella
2013-04-01
All plant-bearing soils are interwoven with fungal hyphae, whose structure and function are affected by environmental factors like drought, a stress factor of increasing importance in many world regions due to climate change. The fungal mycelium in soil is important both for mycorrhizal symbiosis with plant roots and for litter decomposition, and thereby also for carbon turnover in soils. However, mycelium biomass has been difficult to assess. Here we describe a simple and feasible method to quantify the biomass of fungal mycelium. We report on a manipulation study in the field where drought stress was induced. The experiment was performed in a 20-year-old Norway spruce (Picea abies) stand planted on former agricultural land, with a control plot and a roofed plot from which precipitation was excluded. To investigate the fungal mycelium, nylon nets (mesh size 1 mm, width 7 cm, length 25 cm) were inserted vertically into the soil down to 20 cm depth. The nets were left in the soil from October to June, then removed and replaced by new nets that were left in the soil from June to October. After removal by cutting a block of soil around each net, the nets were cleaned of residual soil and scanned using the CanoScan 9000F image scanner. The resulting images were analyzed using the image processing software ImageJ. The image analysis was based on the distribution of grey values in the individual pixels, which characterize the different components in the image (voids, hyphae, the nylon net, and soil). Based on repeated visual evaluation of hyphal coverage in the net segments, we obtained an exponential equation allowing us to determine automatically the percentage coverage of net windows by hyphae for each net scanned. In this way we can compare the hyphal coverage in the control and drought-exposed plots.
Based on the hyphal coverage scans together with hyphal dry weight on clean nets, we account for the soil particles adhering to the nets. Using this analysis method, the hyphal mat coverage in mm² on any net is quantified and the hyphal biomass on the net can be calculated and compared between treatments. Also, the hyphal biomass per cm³ of soil at the spot where the net has been inserted can be assessed. In addition, DNA from net-bound hyphae may be extracted to determine the identity of fungal species at different soil depths for the individual treatments.
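A minimal sketch of the grey-value-based coverage estimation described above. The grey-value class boundaries and the exponential calibration coefficients below are invented placeholders for illustration; the paper's fitted values are not given here.

```python
import math

def hyphal_coverage(pixels, hyphae_range=(60, 140), net_range=(141, 200)):
    """Fraction of non-net pixels classified as hyphae.

    pixels: iterable of 8-bit grey values from one scanned net window.
    Pixels in net_range (the nylon net itself) are excluded from the total.
    The class boundaries are hypothetical, not the study's values.
    """
    hyphae = total = 0
    for g in pixels:
        if net_range[0] <= g <= net_range[1]:
            continue  # ignore the net itself
        total += 1
        if hyphae_range[0] <= g <= hyphae_range[1]:
            hyphae += 1
    return hyphae / total if total else 0.0

def calibrated_percent(raw_fraction, a=2.0, b=3.5):
    """Hypothetical exponential calibration against repeated visual scoring;
    coefficients a, b are placeholders, not the paper's fitted equation."""
    return a * (math.exp(b * raw_fraction) - 1.0)
```

A real pipeline would read the scanner output into a pixel array (e.g. via ImageJ, as in the study) before applying a classification like this.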
Luman, Elizabeth T; Sablan, Mariana; Stokley, Shannon; McCauley, Mary M; Shaw, Kate M
2008-01-01
Background Lack of methodological rigor can cause survey error, leading to biased results and suboptimal public health response. This study focused on the potential impact of 3 methodological "shortcuts" pertaining to field surveys: relying on a single source for critical data, failing to repeatedly visit households to improve response rates, and excluding remote areas. Methods In a vaccination coverage survey of young children conducted in the Commonwealth of the Northern Mariana Islands in July 2005, 3 sources of vaccination information were used, multiple follow-up visits were made, and all inhabited areas were included in the sampling frame. Results were calculated with and without these strategies. Results Most children had at least 2 sources of data; vaccination coverage estimated from any single source was substantially lower than from all sources combined. Eligibility was ascertained for 79% of households after the initial visit and for 94% of households after follow-up visits; vaccination coverage rates were similar with and without follow-up. Coverage among children on remote islands differed substantially from that of their counterparts on the main island, indicating a programmatic need for locality-specific information; excluding remote islands from the survey would have had little effect on overall estimates due to small populations and divergent results. Conclusion Strategies to reduce sources of survey error should be maximized in public health surveys. The impact of the 3 strategies illustrated here will vary depending on the primary outcomes of interest and local situations. Survey limitations such as potential for error should be well-documented, and the likely direction and magnitude of bias should be considered. PMID:18371195
Application of Semantic Tagging to Generate Superimposed Information on a Digital Encyclopedia
NASA Astrophysics Data System (ADS)
Garrido, Piedad; Tramullas, Jesus; Martinez, Francisco J.
We can find in the literature several works regarding the automatic or semi-automatic processing of textual documents with historic information using free software technologies. However, more research is needed to integrate the analysis of context and to cover the peculiarities of the Spanish language from a semantic point of view. This research work proposes a novel knowledge-based strategy combining subject-centric computing, a topic-oriented approach, and superimposed information. Its subsequent combination with artificial intelligence techniques led to an automatic analysis after implementing a made-to-measure interpreted algorithm which, in turn, produced a large number of associations and events with 90% reliability.
Wilson, Elizabeth Ruth; Kyle, Theodore K; Nadglowski, Joseph F; Stanford, Fatima Cody
2017-02-01
Evidence-based obesity treatments, such as bariatric surgery, are not considered essential health benefits under the Affordable Care Act. Employer-sponsored wellness programs with incentives based on biometric outcomes are allowed and often used despite mixed evidence regarding their effectiveness. This study examines consumers' perceptions of their coverage for obesity treatments and exposure to workplace wellness programs. A total of 7,378 participants completed an online survey during 2015-2016. Respondents answered questions regarding their health coverage for seven medical services and exposure to employer wellness programs that target weight or body mass index (BMI). Using χ2 tests, associations between perceptions of exposure to employer wellness programs and coverage for medical services were examined. Differences between survey years were also assessed. Most respondents reported they did not have health coverage for obesity treatments, but more of the respondents with employer wellness programs reported having coverage. Neither the perception of coverage for obesity treatments nor exposure to wellness programs increased between 2015 and 2016. Even when consumers have exposure to employer wellness programs that target BMI, their health insurance often excludes obesity treatments. Given the clinical and cost-effectiveness of such treatments, reducing that coverage gap may mitigate obesity's individual- and population-level effects. © 2017 The Obesity Society.
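The reported χ2 analysis of association between wellness-program exposure and perceived coverage can be sketched as below; the 2×2 counts are invented for illustration and are not the study's data.

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (1 df) for a 2x2 table via the shortcut formula.
    Rows: wellness program yes/no; columns: reports coverage yes/no.
    Cell layout: [[a, b], [c, d]].
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 120/500 with a wellness program report coverage,
# versus 60/500 without one.
chi2 = chi_square_2x2(120, 380, 60, 440)
significant = chi2 > 3.841  # 5% critical value, 1 degree of freedom
```

In practice one would use a statistics package (e.g. `scipy.stats.chi2_contingency`) rather than the hand-rolled formula; the sketch only shows the shape of the test.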
Extending Medicare coverage to medically necessary dental care.
Patton, L L; White, B A; Field, M J
2001-09-01
Periodically, Congress considers expanding Medicare coverage to include some currently excluded health care services. In 1999 and 2000, an Institute of Medicine committee studied the issues related to coverage for certain services, including "medically necessary dental services." The committee conducted a literature search for dental care studies in five areas: head and neck cancer, leukemia, lymphoma, organ transplantation, and heart valve repair or replacement. The committee examined evidence to support Medicare coverage for dental services related to these conditions and estimated the cost to Medicare of such coverage. Evidence supported Medicare coverage for preventive dental care before jaw radiation therapy for head or neck cancer and coverage for treatment to prevent or eliminate acute oral infections for patients with leukemia before chemotherapy. Insufficient evidence supported dental coverage for patients with lymphoma or organ transplants and for patients who had undergone heart valve repair or replacement. The committee suggested that Congress update statutory language to permit Medicare coverage of effective dental services needed in conjunction with surgery, chemotherapy, radiation therapy or pharmacological treatment for life-threatening medical conditions. Dental care is important for members of all age groups. More direct, research-based evidence on the efficacy of medically necessary dental care is needed both to guide treatment and to support Medicare payment policy.
Genetic testing and the future of disability insurance: ethics, law & policy.
Wolf, Susan M; Kahn, Jeffrey P
2007-01-01
Predictive genetic testing poses fundamental questions for disability insurance, a crucial resource funding basic needs when disability prevents income from work. This article, from an NIH-funded project, presents the first indepth analysis of the challenging issues: Should disability insurers be permitted to consider genetics and exclude predicted disability? May disabilities with a recognized genetic basis be excluded from coverage as pre-existing conditions? How can we assure that private insurers writing individual and group policies, employers, and public insurers deal competently and appropriately with genetic testing?
An automatic markerless registration method for neurosurgical robotics based on an optical camera.
Meng, Fanle; Zhai, Fangwen; Zeng, Bowei; Ding, Hui; Wang, Guangzhi
2018-02-01
Current markerless registration methods for neurosurgical robotics use the facial surface to match the robot space with the image space, and acquisition of the facial surface usually requires manual interaction and constrains the patient to a supine position. To overcome these drawbacks, we propose a registration method that is automatic and does not constrain patient position. An optical camera attached to the robot end effector captures images around the patient's head from multiple views. Then, high coverage of the head surface is reconstructed from the images through multi-view stereo vision. Since the acquired head surface point cloud contains color information, a specific mark that is manually drawn on the patient's head prior to the capture procedure can be extracted to automatically accomplish coarse registration rather than using facial anatomic landmarks. Then, fine registration is achieved by registering the high coverage of the head surface without relying solely on the facial region, thus eliminating patient position constraints. The head surface was acquired by the camera with good repeatability. The average target registration error of 8 different patient positions measured with targets inside a head phantom was [Formula: see text], while the mean surface registration error was [Formula: see text]. The method proposed in this paper achieves automatic markerless registration in multiple patient positions and guarantees registration accuracy inside the head. This method provides a new approach for establishing the spatial relationship between the image space and the robot space.
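The fine-registration step is a rigid point-set alignment problem. As a simplified, stdlib-only illustration (2D, with known correspondences, which real surface registration such as ICP must itself estimate), the least-squares rotation and translation can be recovered as:

```python
import math

def rigid_register_2d(src, dst):
    """Least-squares rotation + translation mapping point list src onto dst.

    Assumes point correspondences are known (a simplification: surface
    registration methods like ICP alternate between estimating
    correspondences and solving this alignment step).
    """
    n = len(src)
    # centroids of both point sets
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # accumulate cross-covariance terms for the optimal 2D rotation
    s_cos = s_sin = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)
    # translation maps the rotated source centroid onto the target centroid
    tx = cdx - (math.cos(theta) * csx - math.sin(theta) * csy)
    ty = cdy - (math.sin(theta) * csx + math.cos(theta) * csy)
    return theta, (tx, ty)
```

The 3D case in the paper uses the same least-squares principle (typically solved via SVD, the Kabsch algorithm) over the reconstructed head surface.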
A machine learning approach for classification of anatomical coverage in CT
NASA Astrophysics Data System (ADS)
Wang, Xiaoyong; Lo, Pechin; Ramakrishna, Bharath; Goldin, Johnathan; Brown, Matthew
2016-03-01
Automatic classification of anatomical coverage of medical images is critical for big data mining and as a pre-processing step to automatically trigger specific computer aided diagnosis systems. The traditional way to identify scans through DICOM headers has various limitations due to manual entry of series descriptions and non-standardized naming conventions. In this study, we present a machine learning approach where multiple binary classifiers were used to classify different anatomical coverages of CT scans. A one-vs-rest strategy was applied. For a given training set, a template scan was selected from the positive samples and all other scans were registered to it. Each registered scan was then evenly split into k × k × k non-overlapping blocks and for each block the mean intensity was computed. This resulted in a 1 × k³ feature vector for each scan. The feature vectors were then used to train an SVM-based classifier. In this feasibility study, four classifiers were built to identify anatomic coverages of brain, chest, abdomen-pelvis, and chest-abdomen-pelvis CT scans. Each classifier was trained and tested using a set of 300 scans from different subjects, composed of 150 positive samples and 150 negative samples. Area under the ROC curve (AUC) of the testing set was measured to evaluate the performance in a two-fold cross validation setting. Our results showed good classification performance with an average AUC of 0.96.
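The block-averaging feature extraction described above can be sketched as follows, assuming a registered scan stored as a nested list volume[z][y][x] (a simplification: real CT volumes would be loaded from DICOM into an array library, and dimensions not divisible by k would need cropping or padding).

```python
def block_mean_features(volume, k):
    """Split a registered volume into k x k x k non-overlapping blocks and
    return the mean intensity of each block as a 1 x k^3 feature vector.

    volume: nested list indexed as volume[z][y][x], with each dimension
    assumed divisible by k for simplicity.
    """
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    dz, dy, dx = nz // k, ny // k, nx // k
    feats = []
    for bz in range(k):
        for by in range(k):
            for bx in range(k):
                total = count = 0
                for z in range(bz * dz, (bz + 1) * dz):
                    for y in range(by * dy, (by + 1) * dy):
                        for x in range(bx * dx, (bx + 1) * dx):
                            total += volume[z][y][x]
                            count += 1
                feats.append(total / count)
    return feats  # fed to the one-vs-rest SVM classifiers
```

The resulting k³-dimensional vectors would then be passed to an SVM trainer (e.g. a standard library such as LIBSVM or scikit-learn), which the sketch omits.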
Cheng, Ji; Pullenayegum, Eleanor; Marshall, John K; Thabane, Lehana
2016-01-01
Objectives There is no consensus on whether studies with no observed events in the treatment and control arms, the so-called both-armed zero-event studies, should be included in a meta-analysis of randomised controlled trials (RCTs). Current analytic approaches handled them differently depending on the choice of effect measures and authors' discretion. Our objective is to evaluate the impact of including or excluding both-armed zero-event (BA0E) studies in meta-analysis of RCTs with rare outcome events through a simulation study. Method We simulated 2500 data sets for different scenarios varying the parameters of baseline event rate, treatment effect and number of patients in each trial, and between-study variance. We evaluated the performance of commonly used pooling methods in classical meta-analysis—namely, Peto, Mantel-Haenszel with fixed-effects and random-effects models, and inverse variance method with fixed-effects and random-effects models—using bias, root mean square error, length of 95% CI and coverage. Results The overall performance of the approaches of including or excluding BA0E studies in meta-analysis varied according to the magnitude of true treatment effect. Including BA0E studies introduced very little bias, decreased mean square error, narrowed the 95% CI and increased the coverage when no true treatment effect existed. However, when a true treatment effect existed, the estimates from the approach of excluding BA0E studies led to smaller bias than including them. Among all evaluated methods, the Peto method excluding BA0E studies gave the least biased results when a true treatment effect existed. Conclusions We recommend including BA0E studies when treatment effects are unlikely, but excluding them when there is a decisive treatment effect. 
Providing results of including and excluding BA0E studies to assess the robustness of the pooled estimated effect is a sensible way to communicate the results of a meta-analysis when the treatment effects are unclear. PMID:27531725
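Of the pooling methods compared above, the Peto approach combines studies via the statistic Σ(O−E)/ΣV. A minimal sketch (note, as the code's guard makes explicit, that a both-armed zero-event study contributes O−E = 0 and V = 0, i.e. no information to this particular statistic):

```python
import math

def peto_pooled_or(studies):
    """Peto pooled odds ratio for a list of 2-arm trials.

    studies: list of (events_trt, n_trt, events_ctl, n_ctl) tuples.
    """
    sum_oe = sum_v = 0.0
    for a, n1, c, n2 in studies:
        N = n1 + n2
        m = a + c  # total observed events
        if m == 0 or m == N:
            continue  # zero/all-event studies carry no information (V = 0)
        E = n1 * m / N  # expected treatment-arm events under the null
        # hypergeometric variance
        V = n1 * n2 * m * (N - m) / (N * N * (N - 1))
        sum_oe += a - E
        sum_v += V
    return math.exp(sum_oe / sum_v)
```

This is a sketch of the standard Peto estimator only; the simulation in the paper also evaluates Mantel-Haenszel and inverse-variance pooling under fixed- and random-effects models, which are not shown here.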
NASA Astrophysics Data System (ADS)
Saulskiy, V. K.
2005-01-01
Multisatellite systems with linear structure (SLS) are defined, and their application to continuous global or zonal coverage of the Earth's surface is justified. It is demonstrated that in some cases these systems turn out to be better than the usually recommended kinematically regular systems of G.V. Mozhaev, the delta systems of J.G. Walker, and the polar systems suggested by F.W. Gobets, L. Rider, and W.S. Adams. When the comparison is made using the criterion of a minimum radius of the one-satellite coverage circle, the SLS beat the other systems for the majority of satellite numbers in the range from 20 to 63, if continuous global single coverage of the Earth is required. In the case of continuous zonal single coverage of the latitude belt ±65°, the SLS are preferable at almost all numbers of satellites from 38 to 100, and further at any values up to 200, excluding 144.
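The comparison criterion above is the radius of the one-satellite coverage circle. Under standard spherical-Earth geometry (textbook orbital mechanics, not a formula taken from this paper), that radius follows from the satellite altitude and the minimum elevation angle:

```python
import math

RE_KM = 6371.0  # mean Earth radius, km

def coverage_half_angle(alt_km, min_elev_deg):
    """Earth-central half-angle (degrees) of a single satellite's coverage
    circle, for a spherical Earth and a minimum-elevation constraint:
        lambda = acos(Re/(Re+h) * cos(eps)) - eps
    """
    eps = math.radians(min_elev_deg)
    lam = math.acos(RE_KM / (RE_KM + alt_km) * math.cos(eps)) - eps
    return math.degrees(lam)
```

Constellation-design studies like the one above seek the arrangement that covers the required region with the smallest such circle (equivalently, the lowest feasible altitude) for a given number of satellites.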
42 CFR 411.15 - Particular services excluded from coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... aneurysms (AAA), cardiovascular disease screening tests, diabetes screening tests, a screening... conditions and limitation specified in § 410.19 of this chapter. (13) In the case of cardiovascular disease screening tests for the early detection of cardiovascular disease or abnormalities associated with an...
40 CFR 725.8 - Coverage of this part.
Code of Federal Regulations, 2010 CFR
2010-07-01
... ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS General Provisions and.... (1) Any microorganism which would be excluded from the definition of “chemical substance” in section... exclusion applies only to a microbial mixture as a whole and not to any microorganisms and other chemical...
40 CFR 725.8 - Coverage of this part.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS General Provisions and.... (1) Any microorganism which would be excluded from the definition of “chemical substance” in section... exclusion applies only to a microbial mixture as a whole and not to any microorganisms and other chemical...
40 CFR 725.8 - Coverage of this part.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS General Provisions and.... (1) Any microorganism which would be excluded from the definition of “chemical substance” in section... exclusion applies only to a microbial mixture as a whole and not to any microorganisms and other chemical...
40 CFR 725.8 - Coverage of this part.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS General Provisions and.... (1) Any microorganism which would be excluded from the definition of “chemical substance” in section... exclusion applies only to a microbial mixture as a whole and not to any microorganisms and other chemical...
40 CFR 725.8 - Coverage of this part.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS General Provisions and.... (1) Any microorganism which would be excluded from the definition of “chemical substance” in section... exclusion applies only to a microbial mixture as a whole and not to any microorganisms and other chemical...
Kahende, Jennifer; England, Lucinda; Zhang, Lei; Mowery, Paul; Xu, Xin; Sevilimedu, Varadan; Rolle, Italia
2017-01-01
Objective To assess state coverage and utilization of Medicaid smoking cessation medication benefits among fee-for-service enrollees who smoked cigarettes. Methods We used the linked National Health Interview Survey (survey years 1995, 1997–2005) and the Medicaid Analytic eXtract files (1999–2008) to assess utilization of smoking cessation medication benefits among 5,982 cigarette smokers aged 18–64 years enrolled in Medicaid fee-for-service whose state Medicaid insurance covered at least one cessation medication. We excluded visits during pregnancy, and those covered by managed care or under dual enrollment (Medicaid and Medicare). Multivariate logistic regression was used to determine correlates of cessation medication benefit utilization among Medicaid fee-for-service enrollees, including measures of drug coverage (comprehensive cessation medication coverage, number of medications in state benefit, varenicline coverage), individual-level demographics at NHIS interview, age at Medicaid enrollment, and state-level cigarette excise taxes, statewide smoke-free laws, and per-capita tobacco control funding. Results In 1999, the percent of smokers with ≥1 medication claims was 5.7% in the 30 states that covered at least one Food and Drug Administration (FDA)-approved cessation medication; this increased to 9.9% in 2008 in the 44 states that covered at least one FDA-approved medication (p<0.01). Cessation medication utilization was greater among older individuals (≥ 25 years), females, non-Hispanic whites, and those with higher educational attainment. Comprehensive coverage, the number of smoking cessation medications covered and varenicline coverage were all positively associated with utilization; cigarette excise tax and per-capita tobacco control funding were also positively associated with utilization. 
Conclusions Utilization of medication benefits among fee-for-service Medicaid enrollees increased from 1999–2008 and varied by individual and state-level characteristics. Given that the Affordable Care Act bars state Medicaid programs from excluding any FDA-approved cessation medications from coverage as of January 2014, monitoring Medicaid cessation medication claims may be beneficial for informing efforts to increase utilization and maximize smoking cessation. PMID:28207744
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Y; Huang, Z; Lo, S
2015-06-15
Purpose: To improve Gamma Knife SRS treatment efficiency for brain metastases and compare the differences of treatment time and radiobiological effects between two different planning methods of automatic filling and manual placement of shots with inverse planning. Methods: T1-weighted MRI images with gadolinium contrast from five patients with a single brain metastatic-lesion were used in this retrospective study. Among them, two were from primary breast cancer, two from primary melanoma cancer and one from primary prostate cancer. For each patient, two plans were generated in Leksell GammaPlan 10.1.1 for radiosurgical treatment with a Leksell GammaKnife Perfexion machine: one with automatic filling, automatic sector configuration and inverse optimization (Method1); and the other with manual placement of shots, manual setup of collimator sizes, manual setup of sector blocking and inverse optimization (Method2). Dosimetric quality of the plans was evaluated with parameters of Coverage, Selectivity, Gradient-Index and DVH. Beam-on Time, Number-of-Shots and Tumor Control Probability (TCP) were compared for the two plans while keeping their dosimetric quality very similar. Relative reductions of Beam-on Time and Number-of-Shots were calculated as the ratios between the two plans and used for quantitative analysis. Results: With very similar dosimetric and radiobiological plan quality, plans created with Method2 had significantly reduced treatment time. Relative reduction of Beam-on Time ranged from 20% to 51% (median: 29%, p=0.001), and reduction of Number-of-Shots ranged from 5% to 67% (median: 40%, p=0.0002), respectively. Time of plan creation for Method1 and Method2 was similar, approximately 20 minutes, excluding the time for tumor delineation. TCP calculated for the tumors from differential DVHs did not show a significant difference between the two plans (p=0.35).
Conclusion: The method of manual setup combined with inverse optimization in LGP for treatment of brain metastatic lesions with the Perfexion can achieve significantly higher time efficiency without degrading treatment quality.
Automated Generation and Assessment of Autonomous Systems Test Cases
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Friberg, Kenneth H.; Horvath, Gregory A.
2008-01-01
This slide presentation reviews issues in verification and validation testing of autonomous spacecraft, which routinely culminates in the exploration of anomalous or faulted mission-like scenarios, using the Dawn mission's tests as examples. Prioritizing which scenarios to develop usually comes down to focusing on the most vulnerable areas and ensuring the best return on investment of test time. Rules-of-thumb strategies often come into play, such as injecting applicable anomalies prior to, during, and after system state changes; or, creating cases that ensure good safety-net algorithm coverage. Although experience and judgment in test selection can lead to high levels of confidence about the majority of a system's autonomy, it's likely that important test cases are overlooked. One method to fill in potential test coverage gaps is to automatically generate and execute test cases using algorithms that ensure desirable properties about the coverage. For example, generate cases for all possible fault monitors, and across all state change boundaries. Of course, the scope of coverage is determined by the test environment capabilities, where a faster-than-real-time, high-fidelity, software-only simulation would allow the broadest coverage. Even real-time systems that can be replicated and run in parallel, and that have reliable set-up and operations features, provide an excellent resource for automated testing. Making detailed predictions for the outcome of such tests can be difficult, and when algorithmic means are employed to produce hundreds or even thousands of cases, generating predictions individually is impractical, and generating predictions with tools requires executable models of the design and environment that themselves require a complete test program. Therefore, evaluating the results of a large number of mission scenario tests poses special challenges.
A good approach to address this problem is to automatically score the results based on a range of metrics. Although the specific means of scoring depends heavily on the application, the use of formal scoring metrics has high value in identifying and prioritizing anomalies, and in presenting an overall picture of the state of the test program. In this paper we present a case study based on automatic generation and assessment of faulted test runs for the Dawn mission, and discuss its role in optimizing the allocation of resources for completing the test program.
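The generate-then-score loop described above can be sketched as a Cartesian product over fault monitors, state-change boundaries, and injection phases, followed by a weighted anomaly score per run. All monitor, boundary, and metric names below are invented for illustration; Dawn's actual fault set and scoring metrics are not given here.

```python
from itertools import product

# Hypothetical test-space dimensions (not Dawn's actual monitors/boundaries).
MONITORS = ["thruster_stuck", "star_tracker_loss", "battery_low"]
BOUNDARIES = ["cruise->approach", "approach->orbit"]
PHASES = ["before", "during", "after"]  # injection relative to state change

def generate_cases():
    """Enumerate every (monitor, boundary, phase) combination, guaranteeing
    coverage of all fault monitors across all state-change boundaries."""
    return [{"monitor": m, "boundary": b, "phase": p}
            for m, b, p in product(MONITORS, BOUNDARIES, PHASES)]

def score_run(telemetry):
    """Weighted anomaly score for one executed run (illustrative weights).
    telemetry: dict of observed outcomes; higher score = higher priority."""
    score = 0
    if not telemetry.get("safed_ok", True):
        score += 10  # safety-net algorithm failed to engage
    score += 2 * telemetry.get("unexpected_mode_changes", 0)
    score += telemetry.get("unexplained_alarms", 0)
    return score

cases = generate_cases()
```

Scoring every run this way lets a test team sort thousands of algorithmically generated cases by anomaly severity instead of writing an individual prediction for each.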
24 CFR 901.15 - Indicator #2, modernization.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Indicator #2, modernization. 901.15... DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.15 Indicator #2, modernization. This indicator is automatically excluded if a PHA does not have a modernization program. This indicator examines the...
Application of the SRI cloud-tracking technique to rapid-scan GOES observations
NASA Technical Reports Server (NTRS)
Wolf, D. E.; Endlich, R. M.
1980-01-01
An automatic cloud tracking system was applied to multilayer clouds associated with severe storms. The method was tested using rapid-scan observations of Hurricane Eloise obtained by the GOES satellite on 22 September 1975. Cloud tracking was performed using clustering based either on visible or infrared data. The clusters were tracked using two different techniques. At both 4 km and 8 km resolution, the results of the automatic system were comparable in accuracy and coverage to those obtained by NASA analysts using the Atmospheric and Oceanographic Information Processing System.
NASA Astrophysics Data System (ADS)
Fustich, C. D.
1980-03-01
A series of transformer room fire tests is reported to demonstrate the shock hazard present when automatic sprinklers operate over energized electrical equipment. Fire protection was provided by standard 0.5 inch pendent automatic sprinklers, temperature rated at 135°F and installed to give approximately 150 sq ft of coverage per head. A 480 V dry transformer was used in the room to provide a three-phase, four-wire distribution system. It is shown that the induced currents in the test room during the various tests are relatively small and pose no appreciable personnel shock hazard.
Integrated testing system FiTest for diagnosis of PCBA
NASA Astrophysics Data System (ADS)
Bogdan, Arkadiusz; Lesniak, Adam
2016-12-01
This article presents the innovative integrated testing system FiTest for automatic, quick inspection of printed circuit board assemblies (PCBA) manufactured in Surface Mount Technology (SMT). The integration of Automatic Optical Inspection (AOI), In-Circuit Tests (ICT) and Functional Circuit Tests (FCT) resulted in a universal hardware platform for testing a variety of electronic circuits. The platform provides increased test coverage, a decreased level of false calls and optimized test duration. The platform is equipped with powerful algorithms that perform tests in a stable and repeatable way and provide effective management of diagnosis.
29 CFR 1952.380 - Description of the plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Justice that the Act is consistent with the State's Law and Constitution. Federal procedural regulations... Commonwealth's Act for the private sector are essentially identical to those in the Federal Act, and Puerto Rico intends to adopt all Federal standards. The Commonwealth will exclude from coverage all industries...
Lower Mississippi River Ports and Waterways Safety System (PAWSS) RF coverage test results
DOT National Transportation Integrated Search
1999-11-01
The Coast Guard plans to operate an Automatic Identification System (AIS) Digital Selective Calling (DSC) based transponder system as part of the Ports and Waterways Safety System (PAWSS) in the lower Mississippi River. The AIS uses two duplex channe...
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
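As a toy illustration of structural coverage measurement (not the report's actual metrics or tooling), the sketch below computes decision and condition coverage for a single decision; MC/DC would additionally require showing that each condition independently affects the outcome.

```python
def decision(A, B, C):
    """An example decision with three conditions (hypothetical logic)."""
    return A and (B or C)

def measure(tests):
    """Return (decision coverage, condition coverage) achieved by a suite.

    tests: list of (A, B, C) boolean tuples.
    Decision coverage: both outcomes of the decision observed.
    Condition coverage: each of the 3 conditions observed both True and False.
    """
    decisions, conditions = set(), set()
    for A, B, C in tests:
        decisions.add(decision(A, B, C))  # outcome of the whole decision
        conditions.update([("A", A), ("B", B), ("C", C)])
    dec_cov = len(decisions) / 2    # 2 possible outcomes
    cond_cov = len(conditions) / 6  # 3 conditions x 2 values
    return dec_cov, cond_cov

# Two tests: full decision coverage, but condition "C" is never True.
dec_cov, cond_cov = measure([(True, True, False), (False, False, False)])
```

The gap between the two numbers illustrates why the choice of structural coverage metric changes the fault-finding capability of an automatically generated suite, the question this report studies empirically.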
19 CFR 212.03 - Proceedings covered.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 19 Customs Duties 3 2012-04-01 2012-04-01 false Proceedings covered. 212.03 Section 212.03 Customs... proceeding brought by the Commission upon its own complaint. (c) If a proceeding includes both matters covered by the Act and matters specifically excluded from coverage, any award made will include only fees...
19 CFR 212.03 - Proceedings covered.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 19 Customs Duties 3 2013-04-01 2013-04-01 false Proceedings covered. 212.03 Section 212.03 Customs... proceeding brought by the Commission upon its own complaint. (c) If a proceeding includes both matters covered by the Act and matters specifically excluded from coverage, any award made will include only fees...
19 CFR 212.03 - Proceedings covered.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 19 Customs Duties 3 2014-04-01 2014-04-01 false Proceedings covered. 212.03 Section 212.03 Customs... proceeding brought by the Commission upon its own complaint. (c) If a proceeding includes both matters covered by the Act and matters specifically excluded from coverage, any award made will include only fees...
19 CFR 212.03 - Proceedings covered.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 3 2011-04-01 2011-04-01 false Proceedings covered. 212.03 Section 212.03 Customs... proceeding brought by the Commission upon its own complaint. (c) If a proceeding includes both matters covered by the Act and matters specifically excluded from coverage, any award made will include only fees...
Tudor and Stuart Drama. Goldentree Bibliographies.
ERIC Educational Resources Information Center
Ribner, Irving, Comp.
This selective bibliography, a guide to scholarship in Tudor and Stuart drama, attempts to provide ample coverage of the major topics and authors, with emphasis on work published since 1920. References excluded are most non-English studies, studies devoted exclusively to anonymous plays or those of minor authors, and unpublished dissertations.…
29 CFR 2.15 - Protection of witnesses.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 1 2010-07-01 2010-07-01 true Protection of witnesses. 2.15 Section 2.15 Labor Office of the Secretary of Labor GENERAL REGULATIONS Audiovisual Coverage of Administrative Hearings § 2.15 Protection of witnesses. A witness has the right, prior to or during his testimony, to exclude audiovisual...
20 CFR 404.1018b - Medicare qualified government employment.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Medicare qualified government employment. 404... Excluded from Employment § 404.1018b Medicare qualified government employment. (a) General. The work of a Federal, State, or local government employee not otherwise subject to Social Security coverage may...
War Coverage: The Case of the Falklands.
ERIC Educational Resources Information Center
Bellando, Edourado
The Falkland-Malvinas conflict is a classic example of how a government can manage news in wartime. The rules of the game as evinced by the British government and Ministry of Defense were simple and effective. They controlled access to the fighting, controlled all communications facilities, excluded all neutral correspondents and carefully…
Notable licensing deals in the biopharma industry in the third quarter of 2017.
D'Souza, P
2017-10-01
During the third quarter of 2017, Cortellis Competitive Intelligence registered 949 new deals (excluding mergers and acquisitions) added as part of its ongoing coverage of pharmaceutical licensing activity compared to 1,007 in Q2 this year and 1,023 in Q3 the previous year.
42 CFR 426.518 - NCD record furnished to the aggrieved party.
Code of Federal Regulations, 2010 CFR
2010-10-01
... HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM REVIEW OF NATIONAL COVERAGE DETERMINATIONS AND LOCAL... section, the NCD record consists of any document or material that CMS considered during the development of... decision memoranda. (5) An index of documents considered that are excluded under paragraph (b) of this...
42 CFR 426.518 - NCD record furnished to the aggrieved party.
Code of Federal Regulations, 2011 CFR
2011-10-01
... HUMAN SERVICES (CONTINUED) MEDICARE PROGRAM REVIEW OF NATIONAL COVERAGE DETERMINATIONS AND LOCAL... section, the NCD record consists of any document or material that CMS considered during the development of... decision memoranda. (5) An index of documents considered that are excluded under paragraph (b) of this...
20 CFR 404.1210 - Optionally excluded services.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Section 404.1210 Employees' Benefits SOCIAL SECURITY ADMINISTRATION FEDERAL OLD-AGE, SURVIVORS AND... selectively by coverage groups. They are: (a) Services in any class or classes of elective positions; (b) Services in any class or classes of part-time positions; (c) Services in any class or classes of positions...
5 CFR 870.206 - Accidental death and dismemberment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Accidental death and dismemberment. 870....206 Accidental death and dismemberment. (a) (1) Accidental death and dismemberment coverage is an automatic part of Basic and Option A insurance for employees. (2) There is no accidental death and...
Code of Federal Regulations, 2013 CFR
2013-04-01
... authority to assist a Federal law enforcement authority in the protection of the President of the United... will be extended. (b) Coverage for officers of the U.S. Park Police and those officers of the Uniformed...
Code of Federal Regulations, 2011 CFR
2011-04-01
... authority to assist a Federal law enforcement authority in the protection of the President of the United... will be extended. (b) Coverage for officers of the U.S. Park Police and those officers of the Uniformed...
Code of Federal Regulations, 2014 CFR
2014-04-01
... authority to assist a Federal law enforcement authority in the protection of the President of the United... will be extended. (b) Coverage for officers of the U.S. Park Police and those officers of the Uniformed...
Code of Federal Regulations, 2010 CFR
2010-04-01
... authority to assist a Federal law enforcement authority in the protection of the President of the United... will be extended. (b) Coverage for officers of the U.S. Park Police and those officers of the Uniformed...
Code of Federal Regulations, 2012 CFR
2012-04-01
... authority to assist a Federal law enforcement authority in the protection of the President of the United... will be extended. (b) Coverage for officers of the U.S. Park Police and those officers of the Uniformed...
5 CFR 870.206 - Accidental death and dismemberment.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Accidental death and dismemberment. 870....206 Accidental death and dismemberment. (a)(1) Accidental death and dismemberment coverage is an automatic part of Basic and Option A insurance for employees. (2) There is no accidental death and...
5 CFR 870.206 - Accidental death and dismemberment.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Accidental death and dismemberment. 870....206 Accidental death and dismemberment. (a)(1) Accidental death and dismemberment coverage is an automatic part of Basic and Option A insurance for employees. (2) There is no accidental death and...
5 CFR 870.206 - Accidental death and dismemberment.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Accidental death and dismemberment. 870....206 Accidental death and dismemberment. (a)(1) Accidental death and dismemberment coverage is an automatic part of Basic and Option A insurance for employees. (2) There is no accidental death and...
5 CFR 870.206 - Accidental death and dismemberment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Accidental death and dismemberment. 870....206 Accidental death and dismemberment. (a)(1) Accidental death and dismemberment coverage is an automatic part of Basic and Option A insurance for employees. (2) There is no accidental death and...
7 CFR 1806.2 - Companies and policies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... during the period a building is under construction if the policy otherwise meets the requirements of this Instruction. If such a policy or endorsement does not automatically convert to full coverage when the building... first be cleared with the National Office. (8) Loss or damage covered. Buildings must be insured against...
Automatic Testcase Generation for Flight Software
NASA Technical Reports Server (NTRS)
Bushnell, David Henry; Pasareanu, Corina; Mackey, Ryan M.
2008-01-01
The TacSat3 project is applying Integrated Systems Health Management (ISHM) technologies to an Air Force spacecraft for operational evaluation in space. The experiment will demonstrate the effectiveness and cost of ISHM and vehicle systems management (VSM) technologies through onboard operation for extended periods. We present two approaches to automatic testcase generation for ISHM: 1) A blackbox approach that treats the system as a black box and uses a grammar-based specification of the system's inputs to automatically generate *all* inputs that satisfy the specifications (up to prespecified limits); these inputs are then used to exercise the system. 2) A whitebox approach that performs analysis and testcase generation directly on a representation of the internal behaviour of the system under test. The enabling technologies for both approaches are model checking and symbolic execution, as implemented in Ames' Java PathFinder (JPF) tool suite. Model checking is an automated technique for software verification. Unlike simulation and testing, which check only some of the system executions and therefore may miss errors, model checking exhaustively explores all possible executions. Symbolic execution evaluates programs with symbolic rather than concrete values and represents variable values as symbolic expressions. We are applying the blackbox approach to generating input scripts for the Spacecraft Command Language (SCL) from Interface and Control Systems. SCL is an embedded interpreter for controlling spacecraft systems. TacSat3 will be using SCL as the controller for its ISHM systems. We translated the SCL grammar into a program that outputs scripts conforming to the grammar. Running JPF on this program generates all legal input scripts up to a prespecified size. Script generation can also be targeted to specific parts of the grammar of interest to the developers. These scripts are then fed to the SCL Executive.
ICS's in-house coverage tools will be run to measure code coverage. Because the scripts exercise all parts of the grammar, we expect them to provide high code coverage. This blackbox approach is suitable for systems for which we do not have access to the source code. We are applying whitebox test generation to the Spacecraft Health INference Engine (SHINE) that is part of the ISHM system. In TacSat3, SHINE will execute an on-board knowledge base for fault detection and diagnosis. SHINE converts its knowledge base into optimized C code which runs onboard TacSat3. SHINE can translate its rules into an intermediate representation (Java) suitable for analysis with JPF. JPF will analyze SHINE's Java output using symbolic execution, producing testcases that can provide either complete or directed coverage of the code. Automatically generated test suites can provide full code coverage and be quickly regenerated when code changes. Because our tools analyze executable code, they fully cover the delivered code, not just models of the code. This approach also provides a way to generate tests that exercise specific sections of code under specific preconditions. This capability gives us more focused testing of specific sections of code.
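The blackbox idea described above, enumerating every legal input a grammar can produce up to a size bound, can be sketched in a few lines without JPF. The toy command grammar and depth bound below are invented for illustration; they are not the actual SCL grammar:

```python
# Illustrative sketch (not the JPF/SCL implementation): enumerate all strings
# a small, hypothetical command grammar can derive within a depth bound.

def expand(symbol, grammar, depth):
    """Yield all strings derivable from `symbol` within `depth` expansions."""
    if symbol not in grammar:          # terminal symbol: emit it as-is
        yield symbol
        return
    if depth == 0:                     # bound reached: prune this branch
        return
    for production in grammar[symbol]:
        # Expand each part of the production left to right,
        # taking the cross product of the alternatives.
        results = ['']
        for part in production:
            results = [r + s for r in results
                       for s in expand(part, grammar, depth - 1)]
        yield from results

# Made-up script grammar: a script is one command, or two joined by ';'.
GRAMMAR = {
    'script': [['cmd'], ['cmd', ';', 'cmd']],
    'cmd': [['set ', 'var'], ['get ', 'var']],
    'var': [['x'], ['y']],
}

scripts = sorted(set(expand('script', GRAMMAR, depth=4)))
```

Each generated string would then play the role of one input script fed to the system under test; targeting a sub-grammar corresponds to starting `expand` from a different nonterminal.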
Recent technical advances in general purpose mobile Satcom aviation terminals
NASA Technical Reports Server (NTRS)
Sydor, John T.
1990-01-01
A second general aviation amplitude companded single sideband (ACSSB) aeronautical terminal was developed for use with the Ontario Air Ambulance Service (OAAS). This terminal is designed to have automatic call set up and take down and to interface with the Public Service Telephone Network (PSTN) through a ground earth station hub controller. The terminal has integrated RF and microprocessor hardware which allows such functions as beam steering and automatic frequency control to be software controlled. The terminal uses a conformal patch array system to provide almost full azimuthal coverage. Antenna beam steering is executed without relying on aircraft supplied orientation information.
Code of Federal Regulations, 2013 CFR
2013-07-01
... declassification of Formerly Restricted Data (FRD) (as defined in 10 CFR 1045.3) may only be performed after... the FRD marking may be removed. Declassification of Transclassified Foreign Nuclear Information (TFNI... record that contains RD, FRD, or TFNI shall be excluded from automatic declassification and referred by...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-17
... recreational vessels shorter than sixty-five feet were excluded from the definition of ``employee.'' The... (expanding coverage to land-based workers who met the situs and status tests) took effect thirty days after... general parlance, ``repair'' means to restore or mend. See, e.g., The New Shorter Oxford English...
77 FR 67743 - Federal Employees Health Benefits Program Coverage for Certain Intermittent Employees
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... firefighters and fire protection personnel. 77 FR 42417. In addition, in recognition of the fact that there may... agencies to attract and bring emergency workers on board quickly and in recognition of the hazardous conditions those employees often face, OPM has concluded that its current policy of categorically excluding...
14 CFR 205.5 - Minimum coverage.
Code of Federal Regulations, 2011 CFR
2011-01-01
... commuter air carriers but excluding U.S. air taxi operators and Canadian charter air taxi operators, shall...,000 times 75 percent of the number of passenger seats installed in the aircraft. (c) U.S. air taxi... each occurrence for loss of or damage to property. (2) U.S. air taxi operators carrying passengers in...
14 CFR 205.5 - Minimum coverage.
Code of Federal Regulations, 2010 CFR
2010-01-01
... commuter air carriers but excluding U.S. air taxi operators and Canadian charter air taxi operators, shall...,000 times 75 percent of the number of passenger seats installed in the aircraft. (c) U.S. air taxi... each occurrence for loss of or damage to property. (2) U.S. air taxi operators carrying passengers in...
29 CFR 2590.715-2704 - Prohibition of preexisting condition exclusions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... offered by Issuer N. N's policy excludes benefits for oral surgery required as a result of a traumatic injury if the injury occurred before the effective date of coverage under the policy. (ii) Conclusion. In this Example 1, the exclusion of benefits for oral surgery required as a result of a traumatic injury...
29 CFR 2590.715-2704 - Prohibition of preexisting condition exclusions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... offered by Issuer N. N's policy excludes benefits for oral surgery required as a result of a traumatic injury if the injury occurred before the effective date of coverage under the policy. (ii) Conclusion. In this Example 1, the exclusion of benefits for oral surgery required as a result of a traumatic injury...
29 CFR 2590.715-2704 - Prohibition of preexisting condition exclusions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... offered by Issuer N. N's policy excludes benefits for oral surgery required as a result of a traumatic injury if the injury occurred before the effective date of coverage under the policy. (ii) Conclusion. In this Example 1, the exclusion of benefits for oral surgery required as a result of a traumatic injury...
29 CFR 2590.715-2704 - Prohibition of preexisting condition exclusions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... offered by Issuer N. N's policy excludes benefits for oral surgery required as a result of a traumatic injury if the injury occurred before the effective date of coverage under the policy. (ii) Conclusion. In this Example 1, the exclusion of benefits for oral surgery required as a result of a traumatic injury...
Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.
Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L
2010-07-01
The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty liver disease, to test this hypothesis. Five transcutaneous and five intraoperative US liver images were acquired in each animal and a liver biopsy was taken. In liver tissue samples, triacylglycerol (TAG) was measured by biochemical analysis and hepatic diseases other than hepatic lipidosis were excluded by histopathologic examination. Ultrasonic tissue characterization (UTC) parameters--mean echo level, standard deviation (SD) of echo level, signal-to-noise ratio (SNR), residual attenuation coefficient (ResAtt), and axial and lateral speckle size--were derived using a computer-aided US (CAUS) protocol and software package. First, the liver tissue was interactively segmented by two observers. With increasing fat content, fewer hepatic vessels were visible in the ultrasound images and, therefore, a smaller proportion of the liver needed to be excluded from these images. Automatic-segmentation algorithms were implemented and it was investigated whether better results could be achieved than with the subjective and time-consuming interactive-segmentation procedure. The automatic-segmentation algorithms were based on both fixed and adaptive thresholding techniques in combination with a 'speckle'-shaped moving-window exclusion technique. All data were analyzed with and without postprocessing as contained in CAUS and with different automated-segmentation techniques. This enabled us to study the effect of the applied postprocessing steps on single and multiple linear regressions of the various UTC parameters with TAG. Improved correlations for all US parameters were found by using automatic-segmentation techniques.
Stepwise multiple linear-regression formulas were derived and used to predict TAG levels in the liver. Receiver-operating-characteristic (ROC) analysis was applied to assess the performance and area under the curve (AUC) of predicting TAG and to compare the sensitivity and specificity of the methods. The best speckle-size estimates and overall performance (R2 = 0.71, AUC = 0.94) were achieved by using an SNR-based adaptive automatic-segmentation method (TAG threshold: 50 mg/g liver wet weight). Automatic segmentation is thus feasible and profitable.
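The adaptive-thresholding idea behind the automatic segmentation, excluding low-echo vessel pixels relative to the image's own statistics, can be illustrated with a minimal sketch. The specific threshold rule (mean minus k standard deviations) and the constant k below are assumptions chosen for demonstration, not the authors' SNR-based method:

```python
import numpy as np

def adaptive_exclusion_mask(image, k=1.5):
    """Return a boolean mask keeping pixels at or above an adaptive
    threshold (image mean minus k standard deviations). Pixels below it,
    standing in for dark vessel lumina, are excluded from the parenchyma
    before texture statistics are computed. Threshold rule is illustrative."""
    threshold = image.mean() - k * image.std()
    return image >= threshold

# Synthetic example: a bright "parenchyma" with one dark "vessel" pixel.
image = np.full((10, 10), 100.0)
image[0, 0] = 0.0
mask = adaptive_exclusion_mask(image)
```

Because the threshold adapts to each image's mean and spread, fattier (more uniformly bright) livers automatically exclude fewer pixels, mirroring the observation in the abstract.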
Relationship of Hospital Staff Coverage and Delivery Room Resuscitation Practices to Birth Asphyxia.
Tu, Joanna H; Profit, Jochen; Melsop, Kathryn; Brown, Taylor; Davis, Alexis; Main, Elliot; Lee, Henry C
2017-02-01
Objective The objective of this study was to assess utilization of specialist coverage and checklists in perinatal settings and to examine utilization by birth asphyxia rates. Design This is a survey study of California maternity hospitals concerning checklist use to prepare for delivery room resuscitation and 24-hour in-house specialist coverage (pediatrician/neonatologist, obstetrician, and obstetric anesthesiologist), with results linked to hospital birth asphyxia rates (preterm and low weight births were excluded). Results Of 253 maternity hospitals, 138 responded (55%); 59 (43%) indicated checklist use, and in-house specialist coverage ranged from 38% (pediatrician/neonatologist) to 54% (anesthesiology). In-house coverage was more common in urban versus rural hospitals for all specialties (p < 0.0001), but checklist use was not significantly different (p = 0.88). Higher birth volume hospitals had more specialist coverage (p < 0.0001), whereas checklist use did not differ (p = 0.3). In-house obstetric coverage was associated with lower asphyxia rates (odds ratio: 0.34; 95% confidence interval [CI]: 0.20, 0.58) in a regression model accounting for other providers. Checklist use was not associated with birth asphyxia (odds ratio: 1.12; 95% CI: 0.75, 1.68). Conclusion Higher birth volume and urban hospitals demonstrated greater in-house specialist coverage, but checklist use was similar across all hospitals. Current data suggest that in-house obstetric coverage has greater impact on asphyxia than other specialist coverage or checklist use.
Automatic generation of stop word lists for information retrieval and analysis
Rose, Stuart J
2013-01-08
Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
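The procedure the abstract describes (count how often each term occurs adjacent to keywords versus inside keywords, exclude terms below a ratio threshold, then truncate) can be sketched as follows. This is an illustrative reading of the abstract, not the patented implementation; the tokenization, default thresholds, and truncation criterion are invented for demonstration:

```python
from collections import Counter

def generate_stop_words(documents, keywords, min_ratio=1.0, max_size=50):
    """Sketch of the described approach: terms that occur next to keywords
    far more often (relative to their occurrences within keywords) carry
    little keyword content and become stop-word candidates. Terms whose
    adjacency/keyword-frequency ratio falls below `min_ratio` are excluded;
    the surviving list is truncated to `max_size` by adjacency frequency."""
    keyword_tokens = [kw.lower().split() for kw in keywords]
    keyword_freq = Counter(t for kw in keyword_tokens for t in kw)

    adjacency_freq = Counter()
    for doc in documents:
        tokens = doc.lower().split()
        for i, tok in enumerate(tokens):
            # A term is "keyword-adjacent" when the token immediately
            # before or after it belongs to some keyword.
            neighbors = tokens[max(i - 1, 0):i] + tokens[i + 1:i + 2]
            if any(n in keyword_freq for n in neighbors):
                adjacency_freq[tok] += 1

    candidates = [t for t in adjacency_freq
                  if t not in keyword_freq
                  and adjacency_freq[t] / max(keyword_freq[t], 1) >= min_ratio]
    candidates.sort(key=lambda t: -adjacency_freq[t])
    return candidates[:max_size]
```

On a toy corpus with keywords like "model checker", connective words such as "the" surface first, which is the intended behavior of a corpus-derived stop list.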
A President's Formula: Involving the Entire Faculty in Community Services
ERIC Educational Resources Information Center
Vaughan, George B.
1975-01-01
All community college faculty members should automatically have 20 percent of their time assigned to community service courses. This will bring community services into the mainstream of the educational program and will assure financial coverage of community service courses. The resulting slack in the regular curriculum should be filled by…
Lodha, A; Brown, N; Soraisham, A; Amin, H; Tang, S; Singhal, N
2017-08-01
To compare short- and long-term neurodevelopmental outcomes at 3 years of corrected age of preterm infants cared for by 24-hour in-house staff neonatologists and those cared for by staff neonatologists during daytime only. Retrospective analysis of prospectively collected follow-up data on all nonanomalous preterm infants from 1998 to 2004, excluding 2001 as a washout period. Infants were divided into two groups based on care provided by staff neonatologists: 24-hour in-house coverage (24-hour coverage, 1998-2000) and daytime coverage (day coverage, 2002-2004). Short- and long-term outcomes were compared. A total of 387 (78%) of the screened infants were included. The 24-hour coverage (n=179) and day coverage (n=208) groups had median birth weights of 875 g (range 470-1250) and 922 g (480-1530; P=0.028), respectively, and both had a median gestational age of 27 weeks. In the day coverage group, a smaller proportion of mothers had chorioamnionitis (20% vs. 30%; P=0.025), fewer received antibiotics (62% vs. 73%; P=0.023), and infants had fewer cases of confirmed sepsis (14% vs. 23%; P=0.022). In the day coverage group, a larger number of infants had respiratory distress syndrome (87% vs. 77%; P=0.011) and required prolonged mechanical ventilation (median 31 vs. 21 days; P=0.002). The incidence of major neurodevelopmental impairment was not significantly different between the two groups (odds ratio 0.76; 95% confidence interval 0.34-1.65). Duration of mechanical ventilation was reduced with 24-hour in-house coverage by staff neonatologists. However, 24-hour coverage was not associated with any difference in neurodevelopmental outcomes at 3-year corrected age.
Combining Space-Based and In-Situ Measurements to Track Flooding in Thailand
NASA Technical Reports Server (NTRS)
Chien, Steve; Doubleday, Joshua; Mclaren, David; Tran, Daniel; Tanpipat, Veerachai; Chitradon, Royal; Boonya-aaroonnet, Surajate; Thanapakpawin, Porranee; Khunboa, Chatchai; Leelapatra, Watis;
2011-01-01
We describe efforts to integrate in-situ sensing, space-borne sensing, hydrological modeling, active control of sensing, and automatic data product generation to enhance monitoring and management of flooding. In our approach, broad coverage sensors and missions such as MODIS, TRMM, and weather satellite information and in-situ weather and river gauging information are all inputs to track flooding via river basin and sub-basin hydrological models. While these inputs can provide significant information as to the major flooding, targetable space measurements can provide better spatial resolution measurements of flooding extent. In order to leverage such assets we automatically task observations in response to automated analysis indications of major flooding. These new measurements are automatically processed and assimilated with the other flooding data. We describe our ongoing efforts to deploy this system to track major flooding events in Thailand.
ERIC Educational Resources Information Center
Feiner, Susan F.
1993-01-01
Reports on a study of the quality and quantity of coverage of the economic status of women and minorities in introductory college economics textbooks. Finds that textbooks still have a tendency to exclude women and minorities from the general discussion and to disguise the multiplicity of explanations for observed differences. (CFR)
34 CFR 21.33 - Allowable fees and expenses.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE How Does One Apply for an Award? § 21.33 Allowable fees and expenses. (a) A prevailing party may apply for an award of fees and... covered by the Act and issues excluded from coverage, the applicant may apply only for an award of fees...
34 CFR 21.33 - Allowable fees and expenses.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Education Office of the Secretary, Department of Education EQUAL ACCESS TO JUSTICE How Does One Apply for an Award? § 21.33 Allowable fees and expenses. (a) A prevailing party may apply for an award of fees and... covered by the Act and issues excluded from coverage, the applicant may apply only for an award of fees...
29 CFR 779.262 - Excise taxes at the retail level.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Coverage Excise Taxes § 779.262 Excise taxes at the retail level. (a) Federal excise taxes are imposed at.... Such excise taxes are levied at the retail level on any liquid fuel sold for use, or used in a diesel... levied at the retail level, and thus excludable when separately stated, depends, of course, upon the law...
Life sciences licensing deals in the fourth quarter of 2017: updates and trends.
D'Souza, P
2018-02-01
During the fourth quarter of 2017, Cortellis Competitive Intelligence registered 1,107 new deals (excluding mergers & acquisitions) as part of its ongoing coverage of licensing activity in the life sciences sector compared to 1,043 in the third quarter and 1,035 in the fourth quarter of 2016. Copyright 2018 Clarivate Analytics.
Thirapatarapong, Wilawan; Thomas, Randal J; Pack, Quinn; Sharma, Saurabh; Squires, Ray W
2014-01-01
Although cardiac rehabilitation (CR) improves outcomes in patients with heart failure (HF), studies suggest variable uptake by patients with HF, as well as variable coverage by insurance carriers. The purpose of this study was to determine the percentage of large commercial health insurance companies that provide coverage for outpatient CR for patients with HF. We identified a sample of the largest US commercial health care providers and analyzed their CR coverage policies for patients with HF. We surveyed 44 large private health care insurance companies, reviewed company Web sites, and, when unclear, contacted companies by e-mail or telephone. We excluded insurance clearinghouses because they did not directly provide health care insurance. Of 44 eligible insurance companies, 29 (66%) reported that they provide coverage for outpatient CR in patients with HF. The majority of companies (83%) covered CR for patients with any type of HF. A minority (10%) did not cover CR for patients with HF if it was considered a preexisting condition. A significant percentage of commercial health care insurance companies in the United States report that they currently cover outpatient CR for patients with HF. Because health insurance coverage is associated with patient participation in CR, it is anticipated that patients with HF will increasingly participate in CR in coming years.
Fronstin, Paul
2009-01-01
HEALTH CARE TAX CAP: With health reform a major priority of the new 111th Congress and President Barack Obama, this Issue Brief examines the administrative and implementation issues that arise from one of the major reform proposals: Capping the exclusion of employment-based health coverage from workers' taxable income. The amount that employers contribute toward workers' health coverage is generally excluded, without limit, from workers' taxable income. In addition, workers whose employers sponsor flexible spending accounts are able to pay out-of-pocket expenses with pretax dollars. Employers can also make available a premium conversion arrangement, which allows workers to pay their share of the premium for employment-based coverage with pretax dollars. In 2005, a presidential advisory board concluded that limiting the amount of tax-preferred health coverage could lower overall private-sector health spending. The panel recommended a cap on the amount of employment-based health coverage individuals can exclude from their income tax, as a way to reduce health spending. In his 2008 "Call to Action" for health care reform, Sen. Max Baucus (D-MT), chairman of the Senate Finance Committee, states that "Congress should explore ways to restructure the current tax incentives to encourage more efficient spending on health and to target our tax dollars more effectively and fairly." While a tax cap on health coverage sounds simple, for many employers, it could be difficult to administer and results would vary by employer based on the type of health benefit plan, the size and demographics of their work force, and even where the workers live. The change would be especially difficult for self-insured employers that do not pay insurance premiums, since they would have to set the "premium equivalent" for each worker. This would not only be costly for employers, depending upon the requirements set out by law, but could also create fairness and tax issues for many affected workers.
For self-insured employers, calculating insurance premium costs under a tax cap could be done fairly easily using the COBRA premium. However, whether self-insured employers would be able to use the least costly method to determine the value of coverage would have to be determined by law and/or regulations. THE SEC. 89 EXPERIENCE: Sec. 89 of the Tax Reform Act of 1986, which attempted to make employee benefits more standard and fair, became so controversial that it was repealed by Congress in 1989--in part because the regulations created regulatory burdens that were so complicated and costly as to be unworkable. Similarly, valuation calculations under a health coverage tax cap could become overly burdensome if the lessons from Sec. 89 are not heeded.
Childhood vaccination coverage rates among military dependents in the United States.
Dunn, Angela C; Black, Carla L; Arnold, John; Brodine, Stephanie; Waalen, Jill; Binkin, Nancy
2015-05-01
The Military Health System provides universal coverage of all recommended childhood vaccinations. Few studies have examined the effect that being insured by the Military Health System has on childhood vaccination coverage. The purpose of this study was to compare the coverage of the universally recommended vaccines among military dependents versus other insured and uninsured children using a nationwide sample of children. The National Immunization Survey is a multistage, random-digit dialing survey designed to measure vaccination coverage estimates of US children aged 19 to 35 months old. Data from 2007 through 2012 were combined to permit comparison of vaccination coverage among military dependent and all other children. Among military dependents, 28.0% of children aged 19 to 35 months were not up to date on the 4:3:1:3:3:1 vaccination series excluding Haemophilus influenzae type b vaccine compared with 21.1% of all other children (odds ratio: 1.4; 95% confidence interval: 1.2-1.6). After controlling for sociodemographic characteristics, compared with all other US children, military dependent children were more likely to be incompletely vaccinated (odds ratio: 1.3; 95% confidence interval: 1.1-1.5). Lower vaccination coverage rates among US military dependent children might be due to this population being highly mobile. However, the lack of a military-wide childhood immunization registry and incomplete documentation of vaccinations could contribute to the lower vaccination coverage rates seen in this study. These results suggest the need for further investigation to evaluate vaccination coverage of children with complete ascertainment of vaccination history, and if lower immunization rates are verified, assessment of reasons for lower vaccination coverage rates among military dependent children. Copyright © 2015 by the American Academy of Pediatrics.
Indexing Aids at Corporate Websites: The Use of Robots.txt and META Tags.
ERIC Educational Resources Information Center
Drott, M. Carl
2002-01-01
This study examined 60 corporate Web sites to see if they provided support for automatic indexing, particularly the use of robots.txt and META tags for keywords and descriptions. Discusses the use of Java and cookies and suggests that an increase in indexing aids would improve overall index coverage of the Web. (Author/LRW)
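The robots.txt mechanism the study counted can be demonstrated with Python's standard-library parser. The rules below are a made-up example, not taken from any site in the study:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt of the kind the study looked for: it tells crawlers
# which paths to index and which to exclude. Paths here are hypothetical.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

allowed = parser.can_fetch("*", "/products/index.html")   # path not disallowed
blocked = parser.can_fetch("*", "/private/report.html")   # excluded path
```

A well-behaved indexing robot consults exactly this check before fetching a page, which is why the presence of a robots.txt file counts as support for automatic indexing.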
Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme
2016-01-01
Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians.
Sensor-driven area coverage for an autonomous fixed-wing unmanned aerial vehicle.
Paull, Liam; Thibault, Carl; Nagaty, Amr; Seto, Mae; Li, Howard
2014-09-01
Area coverage with an onboard sensor is an important task for an unmanned aerial vehicle (UAV) with many applications. Autonomous fixed-wing UAVs are more appropriate for larger scale area surveying since they can cover ground more quickly. However, their non-holonomic dynamics and susceptibility to disturbances make sensor coverage a challenging task. Most previous approaches to area coverage planning are offline and assume that the UAV can follow the planned trajectory exactly. In this paper, this restriction is removed as the aircraft maintains a coverage map based on its actual pose trajectory and makes control decisions based on that map. The aircraft is able to plan paths in situ based on sensor data and an accurate model of the on-board camera used for coverage. An information theoretic approach is used that selects desired headings that maximize the expected information gain over the coverage map. In addition, the branch entropy concept previously developed for autonomous underwater vehicles is extended to UAVs and ensures that the vehicle is able to achieve its global coverage mission. The coverage map over the workspace uses the projective camera model and compares the expected area of the target on the ground and the actual area covered on the ground by each pixel in the image. The camera is mounted on a two-axis gimbal and can either be stabilized or optimized for maximal coverage. Hardware-in-the-loop simulation results and real hardware implementation on a fixed-wing UAV show the effectiveness of the approach. By including the already developed automatic takeoff and landing capabilities, we now have a fully automated and robust platform for performing aerial imagery surveys.
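The information-theoretic selection step described above, choosing the heading that maximizes expected information gain over a probabilistic coverage map, can be sketched concisely. The cell-wise binary-entropy objective and the `predict_coverage` model interface below are simplified assumptions for illustration, not the paper's implementation:

```python
import math

def map_entropy(coverage_probs):
    """Total binary entropy (bits) over grid cells, where each entry is the
    probability that the cell has been adequately covered by the sensor."""
    def h(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return sum(h(p) for p in coverage_probs)

def best_heading(candidate_headings, predict_coverage, current_map):
    """Pick the heading whose predicted post-observation map leaves the
    least entropy, i.e. yields the largest expected information gain.
    `predict_coverage(heading, current_map)` is a hypothetical model of how
    the camera footprint would update the map along that heading."""
    return min(candidate_headings,
               key=lambda hdg: map_entropy(predict_coverage(hdg, current_map)))
```

In the paper's setting the prediction step would come from the projective camera model and the aircraft dynamics; here any callable with that shape can be plugged in.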
Williams, Gemma A; Parmar, Divya; Dkhimi, Fahdi; Asante, Felix; Arhinful, Daniel; Mladovsky, Philipa
2017-08-01
To help reduce child mortality and reach universal health coverage, Ghana extended free membership of the National Health Insurance Scheme (NHIS) to children (under-18s) in 2008. However, despite the introduction of premium waivers, a substantial proportion of children remain uninsured. Thus far, few studies have explored why enrolment of children in NHIS may remain low, despite the absence of significant financial barriers to membership. In this paper we therefore look beyond economic explanations of access to health insurance to explore additional wider determinants of enrolment in the NHIS. In particular, we investigate whether social exclusion, as measured through a sociocultural, political and economic lens, can explain poor enrolment rates of children. Data were collected from a cross-sectional survey of 4050 representative households conducted in Ghana in 2012. Household indices were created to measure sociocultural, political and economic exclusion, and logistic regressions were conducted to study determinants of enrolment at the individual and household levels. Our results indicate that socioculturally, economically and politically excluded children are less likely to enrol in the NHIS. Furthermore, households excluded in all dimensions were more likely to be non-enrolled or partially-enrolled (i.e. not all children enrolled within the household) than fully-enrolled. These results suggest that equity in access for socially excluded children has not yet been achieved. Efforts should be taken to improve coverage by removing the remaining small, annually renewable registration fee, implementing and publicising the new clause that de-links premium waivers from parental membership, establishing additional scheme administrative offices in remote areas, holding regular registration sessions in schools and conducting outreach sessions and providing registration support to female guardians of children. 
Ensuring equitable access to NHIS will contribute substantially to improving child health and reducing child mortality in Ghana. Copyright © 2017 Elsevier Ltd. All rights reserved.
CNNdel: Calling Structural Variations on Low Coverage Data Based on Convolutional Neural Networks
2017-01-01
Many structural variation (SV) detection methods have been proposed due to the popularization of next-generation sequencing (NGS). These SV calling methods use different SV-property-dependent features; however, they all suffer from poor accuracy when running on low coverage sequences. The union of results from these tools achieves fairly high sensitivity but still produces low accuracy on low coverage sequence data; that is, it contains many false positives. In this paper, we present CNNdel, an approach for calling deletions from paired-end reads. CNNdel gathers SV candidates reported by multiple tools and then extracts features from aligned BAM files at the positions of candidates. With labeled feature-expressed candidates as a training set, CNNdel trains convolutional neural networks (CNNs) to distinguish true unlabeled candidates from false ones. Results show that CNNdel works well with NGS reads from 26 low coverage genomes of the 1000 Genomes Project. The paper demonstrates that convolutional neural networks can automatically assign the priority of SV features and reduce the false positives efficaciously. PMID:28630866
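A minimal sketch of the candidate-union and feature-extraction steps that precede the CNN classifier; the merge distance, the depth-based features, and the threshold rule standing in for the trained network are all illustrative assumptions:

```python
def union_candidates(calls_per_tool, merge_dist=50):
    """Merge deletion candidates (chrom, start, end) reported by several tools."""
    merged = []
    for cand in sorted(c for calls in calls_per_tool for c in calls):
        if merged and cand[0] == merged[-1][0] and cand[1] - merged[-1][2] <= merge_dist:
            # overlapping or nearby call on the same chromosome: extend it
            merged[-1] = (cand[0], merged[-1][1], max(merged[-1][2], cand[2]))
        else:
            merged.append(cand)
    return merged

def candidate_features(depth, start, end, flank=100):
    """Toy features: mean read depth inside the call vs. in the flanks."""
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    inside = depth[start:end]
    flanks = depth[max(0, start - flank):start] + depth[end:end + flank]
    return mean(inside), mean(flanks)

def is_deletion(depth, start, end, drop_ratio=0.5):
    """Threshold stand-in for the CNN: flag a clear depth drop vs. the flanks."""
    inside, flank = candidate_features(depth, start, end)
    return flank > 0 and inside < drop_ratio * flank
```

A real pipeline would feed per-position feature vectors from the BAM files into the trained CNN rather than this single-ratio rule.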
Automatic learning-based beam angle selection for thoracic IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amit, Guy; Marshall, Andrea; Purdie, Thomas G., E-mail: tom.purdie@rmp.uhn.ca
Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner's clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2).
The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume coverage and organ at risk sparing and were superior over plans produced with fixed sets of common beam angles. The great majority of the automatic plans (93%) were approved as clinically acceptable by three radiation therapy specialists. Conclusions: The results demonstrated the feasibility of utilizing a learning-based approach for automatic selection of beam angles in thoracic IMRT planning. The proposed method may assist in reducing the manual planning workload, while sustaining plan quality.
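The score-then-select scheme can be illustrated with a toy greedy selector; the per-angle scores would come from the trained random forest, and the fixed minimum angular separation used here is a crude stand-in for the learned interbeam dependencies:

```python
def angular_dist(a, b):
    """Shortest angular distance between two beam angles in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_beams(scores, n_beams, min_sep=20):
    """Greedily take the highest-scoring angles subject to a minimum
    pairwise angular separation (stand-in for interbeam dependencies)."""
    chosen = []
    for angle, _ in sorted(scores.items(), key=lambda kv: -kv[1]):
        if all(angular_dist(angle, c) >= min_sep for c in chosen):
            chosen.append(angle)
        if len(chosen) == n_beams:
            break
    return sorted(chosen)
```

In the paper's framework the selected angles are additionally adjusted by an optimization scheme; this sketch stops at the initial greedy pick.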
Self-calibrating models for dynamic monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1994-01-01
The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.
Infrared-enhanced TV for fire detection
NASA Technical Reports Server (NTRS)
Hall, J. R.
1978-01-01
Closed-circuit television is superior to conventional smoke or heat sensors for detecting fires in large open spaces. Single TV camera scans entire area, whereas many conventional sensors and maze of interconnecting wiring might be required to get same coverage. Camera is monitored by person who would trip alarm if fire were detected, or electronic circuitry could process camera signal for fully-automatic alarm system.
Resolution Enhanced Magnetic Sensing System for Wide Coverage Real Time UXO Detection
NASA Astrophysics Data System (ADS)
Zalevsky, Zeev; Bregman, Yuri; Salomonski, Nizan; Zafrir, Hovav
2012-09-01
In this paper we present a new high resolution automatic detection algorithm based upon a Wavelet transform and then validate it in marine experiments. The proposed approach allows automatic detection at very low signal-to-noise ratios. The amount of calculation is reduced, the magnetic trend is suppressed and the probability of detection/false alarm rate can easily be controlled. Moreover, the algorithm can distinguish between close targets. In the algorithm we use the physical dependence of the magnetic field of a magnetic dipole to define a Wavelet mother function that can then detect magnetic targets modeled as dipoles embedded in noisy surroundings, at improved resolution. The proposed algorithm was first applied to synthesized targets and then validated in field experiments involving a marine surface-floating system for wide coverage real time unexploded ordnance (UXO) detection and mapping. The detection probability achieved in the marine experiment was above 90%. The horizontal radial error of most of the detected targets was only 16 m, and two baseline targets immersed about 20 m from one another could easily be distinguished.
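The core idea of building a mother function from the dipole field can be sketched as a one-dimensional matched filter; the template formula, window sizes, and threshold below are illustrative, not the paper's exact Wavelet construction:

```python
def dipole_template(half_width, depth=1.0):
    """Along-track field of a vertical magnetic dipole at depth d:
    B(x) proportional to (2d^2 - x^2) / (x^2 + d^2)^(5/2)."""
    xs = range(-half_width, half_width + 1)
    t = [(2 * depth**2 - x**2) / (x**2 + depth**2) ** 2.5 for x in xs]
    m = sum(t) / len(t)
    return [v - m for v in t]  # zero mean, so a slow magnetic trend is suppressed

def matched_filter(signal, template):
    """Correlate the survey line with the dipole template (valid region only)."""
    n, k = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(k))
            for i in range(n - k + 1)]

def detect_peaks(response, threshold):
    """Local maxima of the filter response above a detection threshold."""
    return [i for i in range(1, len(response) - 1)
            if response[i] > threshold
            and response[i] >= response[i - 1] and response[i] >= response[i + 1]]
```

Because the template is zero-mean, a linear background trend contributes little to the response, which is one reason trend suppression falls out of this construction.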
ERIC Educational Resources Information Center
Storm, Lance; Tressoldi, Patrizio E.; Utts, Jessica
2013-01-01
Rouder, Morey, and Province (2013) stated that (a) the evidence-based case for psi in Storm, Tressoldi, and Di Risio's (2010) meta-analysis is supported only by a number of studies that used manual randomization, and (b) when these studies are excluded so that only investigations using automatic randomization are evaluated (and some additional…
Long-term drought sensitivity of trees in second-growth forests in a humid region
Neil Pederson; Kacie Tackett; Ryan W. McEwan; Stacy Clark; Adrienne Cooper; Glade Brosi; Ray Eaton; R. Drew Stockwell
2012-01-01
Classical field methods of reconstructing drought using tree rings in humid, temperate regions typically target old trees from drought-prone sites. This approach limits investigators to a handful of species and excludes large amounts of data that might be useful, especially for coverage gaps in large-scale networks. By sampling in more 'typical' forests, network...
Joseph, Tiffany D
2017-10-01
Recent policy debates have centered on health reform and who should benefit from such policy. Most immigrants are excluded from the 2010 Affordable Care Act (ACA) due to federal restrictions on public benefits for certain immigrants. But, some subnational jurisdictions have extended coverage options to federally ineligible immigrants. Yet, less is known about the effectiveness of such inclusive reforms for providing coverage and care to immigrants in those jurisdictions. This article examines the relationship between coverage and health care access for immigrants under comprehensive health reform in the Boston metropolitan area. The article uses data from interviews conducted with a total of 153 immigrants, health care professionals, and immigrant and health advocacy organization employees under the Massachusetts and ACA health reforms. Findings indicate that respondents across the various stakeholder groups perceive that immigrants' documentation status minimizes their ability to access health care even when they have health coverage. Specifically, respondents expressed that intersecting public policies, concerns that using health services would jeopardize future legalization proceedings, and immigrants' increased likelihood of deportation en route to medical appointments negatively influenced immigrants' health care access. Thus, restrictive federal policies and national-level anti-immigrant sentiment can undermine inclusive subnational policies in socially progressive places. Copyright © 2017 by Duke University Press.
Health-financing reforms in southeast Asia: challenges in achieving universal coverage.
Tangcharoensathien, Viroj; Patcharanarumol, Walaiporn; Ir, Por; Aljunid, Syed Mohamed; Mukti, Ali Ghufron; Akkhavong, Kongsap; Banzon, Eduardo; Huong, Dang Boi; Thabrany, Hasbullah; Mills, Anne
2011-03-05
In this sixth paper of the Series, we review health-financing reforms in seven countries in southeast Asia that have sought to reduce dependence on out-of-pocket payments, increase pooled health finance, and expand service use as steps towards universal coverage. Laos and Cambodia, both resource-poor countries, have mostly relied on donor-supported health equity funds to reach the poor, and reliable funding and appropriate identification of the eligible poor are two major challenges for nationwide expansion. For Thailand, the Philippines, Indonesia, and Vietnam, social health insurance financed by payroll tax is commonly used for formal sector employees (excluding Malaysia), with varying outcomes in terms of financial protection. Alternative payment methods have different implications for provider behaviour and financial protection. Two alternative approaches for financial protection of the non-poor outside the formal sector have emerged (contributory arrangements and tax-financed schemes), with different abilities to achieve high population coverage rapidly. Fiscal space and mobilisation of payroll contributions are both important in accelerating financial protection. Expanding coverage of good-quality services and ensuring adequate human resources are also important to achieve universal coverage. As health-financing reform is complex, institutional capacity to generate evidence and inform policy is essential and should be strengthened. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sokolov, Leonid V.
2010-08-01
There is a need to measure distributed pressure on the aircraft engine inlet with high precision over a wide operating temperature range in a severe environment to improve the efficiency of aircraft engine control. The basic solutions and principles of designing high-temperature (to 523 K) microelectromechanical pressure sensors based on a membrane-type SOI heterostructure with a monolithic integral tensoframe (MEMS-SOIMT) are proposed in accordance with the developed concept, which excludes the use of electric p-n junctions in semiconductor microelectromechanical sensors. The MEMS-SOIMT technology relies on the group processes of microelectronics and micromechanics for high-precision microprofiling of a three-dimensional micromechanical structure, which excludes high-temperature silicon doping processes.
Error and Error Mitigation in Low-Coverage Genome Assemblies
Hubisz, Melissa J.; Lin, Michael F.; Kellis, Manolis; Siepel, Adam
2011-01-01
The recent release of twenty-two new genome sequences has dramatically increased the data available for mammalian comparative genomics, but twenty of these new sequences are currently limited to ∼2× coverage. Here we examine the extent of sequencing error in these 2× assemblies, and its potential impact in downstream analyses. By comparing 2× assemblies with high-quality sequences from the ENCODE regions, we estimate the rate of sequencing error to be 1–4 errors per kilobase. While this error rate is fairly modest, sequencing error can still have surprising effects. For example, an apparent lineage-specific insertion in a coding region is more likely to reflect sequencing error than a true biological event, and the length distribution of coding indels is strongly distorted by error. We find that most errors are contributed by a small fraction of bases with low quality scores, in particular, by the ends of reads in regions of single-read coverage in the assembly. We explore several approaches for automatic sequencing error mitigation (SEM), making use of the localized nature of sequencing error, the fact that it is well predicted by quality scores, and information about errors that comes from comparisons across species. Our automatic methods for error mitigation cannot replace the need for additional sequencing, but they do allow substantial fractions of errors to be masked or eliminated at the cost of modest amounts of over-correction, and they can reduce the impact of error in downstream phylogenomic analyses. Our error-mitigated alignments are available for download. PMID:21340033
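A minimal sketch of quality-score-based error masking in the spirit of the SEM approaches described above; the thresholds and the stricter treatment of read ends are illustrative assumptions:

```python
def mask_low_quality(seq, quals, min_q=20, end_trim=5):
    """Mask (to 'N') bases with low quality scores, holding the ends of the
    read to a stricter bar, since errors concentrate at read ends in
    regions of single-read coverage."""
    out = []
    for i, (base, q) in enumerate(zip(seq, quals)):
        near_end = i < end_trim or i >= len(seq) - end_trim
        out.append('N' if q < min_q or (near_end and q < 2 * min_q) else base)
    return ''.join(out)
```

Masking sacrifices a little correct sequence (over-correction) in exchange for removing a large fraction of errors before downstream phylogenomic analysis.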
Loomis, C Keanin
2002-12-01
Under the Pregnancy Discrimination Act (PDA), employers are prohibited from discriminating against women by treating pregnancy and childbirth different from other medical conditions. Employers who offer medical benefits to their employees have thus been required to cover pregnancy-related medical costs on the same terms as other medical coverage. The cost of prescription contraception, however, has generally not been covered by employer-sponsored medical plans, even while other prescription drugs were. This Note examines the recent case of Erickson v. Bartell Drug Co., which challenged this practice of excluding prescription contraception coverage as discriminatory under the PDA, and argues that further federal legislation is necessary to ensure the equal treatment of women in the workplace.
Torres, Viviana; Cerda, Mauricio; Knaup, Petra; Löpprich, Martin
2016-01-01
An important part of the electronic information available in a Hospital Information System (HIS) has the potential to be automatically exported to Electronic Data Capture (EDC) platforms to improve clinical research. This automation has the advantage of reducing manual data transcription, a time-consuming and error-prone process. However, quantitative evaluations of the process of exporting data from a HIS to an EDC system have not been reported extensively, in particular in comparison with manual transcription. In this work we present an assessment of the quality of an automatic export process, focused on laboratory data from a HIS. Quality of the laboratory data was assessed for two types of processes: (1) a manual process of data transcription, and (2) an automatic process of data transference. The automatic transference was implemented as an Extract, Transform and Load (ETL) process. A comparison was then carried out between the manual and automatic data collection methods. The criteria to measure data quality were correctness and completeness. The manual process had a general error rate of 2.6% to 7.1%, with the lowest error rate obtained when data fields without a clear definition were removed from the analysis (p < 10E-3). For the automatic process, the general error rate was 1.9% to 12.1%, where the lowest error rate is obtained when excluding information missing in the HIS but transcribed to the EDC from other physical sources. The automatic ETL process can be used to collect laboratory data for clinical research if data in the HIS, as well as physical documentation not included in the HIS, are identified previously and follow a standardized data collection protocol.
ISS Progress 68 Docking Coverage
2017-10-16
The unpiloted Russian ISS Progress 68 cargo craft arrived at the International Space Station Oct. 16 on a resupply mission, following a two-day journey after its launch from the Baikonur Cosmodrome in Kazakhstan. The Progress delivered almost three tons of food, fuel and supplies for the Expedition 53 crew. The Progress automatically linked up to the Pirs Docking Compartment, where it will remain until next March.
NASA Astrophysics Data System (ADS)
Xiao, Jianyong; Bai, Xiaoyong; Zhou, Dequan; Qian, Qinghuan; Zeng, Cheng; Chen, Fei
2018-01-01
Vegetation coverage dynamics are affected by climate, topography and human activities, and are an important indicator of the regional ecological environment. Revealing the spatial-temporal characteristics of vegetation coverage is of great significance to the protection and management of the ecological environment. Based on MODIS NDVI data and the Maximum Value Composite (MVC) method, we excluded soil spectrum interference to calculate Fractional Vegetation Coverage (FVC). The long-term FVC series was then used to calculate the spatial pattern and temporal variation of vegetation in the Wujiang River Basin from 2000 to 2016 using trend analysis and the Hurst index. The relationship between topography and the spatial distribution of FVC was also analyzed. The main conclusions are as follows: (1) The multi-annual mean vegetation coverage reveals a spatial distribution of low values in the midstream and high values in the other parts of the basin, with a mean of 0.6567. (2) From 2000 to 2016, the FVC of the Wujiang River Basin fluctuated between 0.6110 and 0.7380, and the overall growth rate of FVC was 0.0074/a. (3) The area of vegetation coverage tending to improve is larger than that tending to degrade in the future. Grass land, Arable land and Others improved significantly; the karst rocky desertification comprehensive management project led to persistent vegetation coverage improvement of Grass land, Arable land and Others. Residential land shows obviously degraded vegetation, a result of urban sprawl. (4) The spatial distribution of FVC is positively correlated with TNI. Research on the spatial-temporal evolution of vegetation coverage is of significant value for the ecological environment protection and management of the Wujiang River Basin.
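The FVC calculation from NDVI via the pixel dichotomy model, preceded by an MVC step, can be sketched as follows; the soil and full-vegetation NDVI endpoints below are illustrative values, not those used in the study:

```python
def max_value_composite(ndvi_series):
    """MVC: per-pixel maximum over the compositing period, which suppresses
    clouds and atmospheric noise that bias NDVI low."""
    return [max(vals) for vals in zip(*ndvi_series)]

def fractional_vegetation_coverage(ndvi, ndvi_soil=0.05, ndvi_veg=0.85):
    """Pixel dichotomy model:
    FVC = (NDVI - NDVI_soil) / (NDVI_veg - NDVI_soil), clamped to [0, 1].
    The endpoint NDVI values for bare soil and full vegetation are assumptions."""
    fvc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return max(0.0, min(1.0, fvc))
```

Subtracting the bare-soil endpoint is what "excludes soil spectrum interference" in the abstract's phrasing: a pixel at the soil NDVI maps to FVC 0 regardless of the soil's own reflectance signal.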
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nam, Hyeong Soo; Kim, Chang-Soo; Yoo, Hongki, E-mail: kjwmm@korea.ac.kr, E-mail: hyoo@hanyang.ac.kr
Purpose: Intravascular optical coherence tomography (IV-OCT) is a high-resolution imaging method used to visualize the microstructure of arterial walls in vivo. IV-OCT enables the clinician to clearly observe and accurately measure stent apposition and neointimal coverage of coronary stents, which are associated with side effects such as in-stent thrombosis. In this study, the authors present an algorithm for quantifying stent apposition and neointimal coverage by automatically detecting lumen contours and stent struts in IV-OCT images. Methods: The algorithm utilizes OCT intensity images and their first and second gradient images along the axial direction to detect lumen contours and stent strut candidates. These stent strut candidates are classified into true and false stent struts based on their features, using an artificial neural network with one hidden layer and ten nodes. After segmentation, either the protrusion distance (PD) or neointimal thickness (NT) for each strut is measured automatically. In randomly selected image sets covering a large variety of clinical scenarios, the results of the algorithm were compared to those of manual segmentation by IV-OCT readers. Results: Stent strut detection showed a 96.5% positive predictive value and a 92.9% true positive rate. In addition, case-by-case validation also showed comparable accuracy for most cases. High correlation coefficients (R > 0.99) were observed for PD and NT between the algorithmic and the manual results, showing little bias (0.20 and 0.46 μm, respectively) and a narrow range of limits of agreement (36 and 54 μm, respectively). In addition, the algorithm worked well in various clinical scenarios and even in cases with a low level of stent malapposition and neointimal coverage. Conclusions: The presented automatic algorithm enables robust and fast detection of lumen contours and stent struts and provides quantitative measurements of PD and NT.
In addition, the algorithm was validated using various clinical cases to demonstrate its reliability. Therefore, this technique can be effectively utilized for clinical trials on stent-related side effects, including in-stent thrombosis and in-stent restenosis.
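The use of axial intensity gradients to propose strut candidates can be sketched in one dimension; the thresholds are illustrative assumptions, and the feature-based neural-network classification stage is omitted:

```python
def axial_gradients(aline):
    """First and second differences of one A-line (axial intensity profile)."""
    g1 = [aline[i + 1] - aline[i] for i in range(len(aline) - 1)]
    g2 = [g1[i + 1] - g1[i] for i in range(len(g1) - 1)]
    return g1, g2

def strut_candidates(aline, i_min=0.8, g2_max=-0.5):
    """Indices that look like a metallic strut reflection: a bright, sharply
    peaked sample (high intensity, strongly negative second difference)."""
    _, g2 = axial_gradients(aline)
    return [i + 1 for i, v in enumerate(g2)
            if aline[i + 1] >= i_min and v <= g2_max]
```

In the full algorithm these per-A-line candidates would then be classified as true or false struts from richer features (e.g. the shadow cast behind a metallic strut) before PD/NT measurement.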
Annual immunisation coverage report, 2010.
Hull, Brynley; Dey, Aditi; Menzies, Rob; McIntyre, Peter
2013-03-31
This, the fourth annual immunisation coverage report, documents trends during 2010 for a range of standard measures derived from Australian Childhood Immunisation Register (ACIR) data. These include coverage at standard age milestones and for individual vaccines included on the National Immunisation Program (NIP). For the first time, coverage from other sources for adolescents and the elderly are included. The proportion of children 'fully vaccinated' at 12, 24 and 60 months of age was 91.6%, 92.1% and 89.1% respectively. For vaccines available on the NIP but not currently assessed for 'fully immunised' status or for eligibility for incentive payments (rotavirus and pneumococcal at 12 months and meningococcal C and varicella at 24 months) coverage varied. Although pneumococcal vaccine had similar coverage at 12 months to other vaccines, coverage was lower for rotavirus at 12 months (84.7%) and varicella at 24 months (83.0%). Overall coverage at 24 months of age exceeded that at 12 months of age nationally and for most jurisdictions, but as receipt of varicella vaccine at 18 months is excluded from calculations, this represents delayed immunisation, with some contribution from immunisation incentives. The 'fully immunised' coverage estimates for immunisations due by 60 months increased substantially in 2009, reaching almost 90% in 2010, probably related to completed immunisation by 60 months of age being introduced in 2009 as a requirement for GP incentive payments. As previously documented, vaccines recommended for Indigenous children only (hepatitis A and pneumococcal polysaccharide vaccine) had suboptimal coverage at around 57%. Delayed receipt of vaccines by Indigenous children at the 60-month milestone age improved from 56% to 62% but the disparity in on-time vaccination between Indigenous and non-Indigenous children at earlier age milestones did not improve. 
Coverage data for human papillomavirus (HPV) from the national HPV register are consistent with high coverage in the school-based program (73%) but lower coverage in the catch-up program for women outside school (30-38%). Coverage estimates for vaccines on the NIP from 65 years of age were comparable with other developed countries.
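A 'fully vaccinated at a milestone age' calculation of the kind underlying these figures can be sketched as follows; the record layout and field names are hypothetical, not the ACIR schema:

```python
from datetime import date, timedelta

def coverage_at_milestone(children, required, milestone_days):
    """Percent of children with all `required` vaccines recorded by the
    milestone age; doses given after the milestone count as delayed,
    not covered, which is why delayed immunisation inflates coverage
    measured at later milestones."""
    covered = 0
    for dob, doses in children:  # doses: {vaccine_name: date_given}
        cutoff = dob + timedelta(days=milestone_days)
        if all(v in doses and doses[v] <= cutoff for v in required):
            covered += 1
    return 100.0 * covered / len(children)
```

Assessing the same child at 12 and 24 months with this rule reproduces the report's observation that 24-month coverage can exceed 12-month coverage when doses arrive late.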
SHIELD: FITGALAXY -- A Software Package for Automatic Aperture Photometry of Extended Sources
NASA Astrophysics Data System (ADS)
Marshall, Melissa
2013-01-01
Determining the parameters of extended sources, such as galaxies, is a common but time-consuming task. Finding a photometric aperture that encompasses the majority of the flux of a source and identifying and excluding contaminating objects is often done by hand, a lengthy and difficult-to-reproduce process. To make extracting information from large data sets both quick and repeatable, I have developed a program called FITGALAXY, written in IDL. This program uses minimal user input to automatically fit an aperture to, and perform aperture and surface photometry on, an extended source. FITGALAXY also automatically traces the outlines of surface brightness thresholds and creates surface brightness profiles, which can then be used to determine the radial properties of a source. Finally, the program performs automatic masking of contaminating sources. Masks and apertures can be applied to multiple images (regardless of the WCS solution or plate scale) in order to accurately measure the same source at different wavelengths. I present the fluxes, as measured by the program, of a selection of galaxies from the Local Volume Legacy Survey. I then compare these results with the fluxes given by Dale et al. (2009) in order to assess the accuracy of FITGALAXY.
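Aperture photometry with masking of contaminating sources, the core measurement FITGALAXY automates, can be sketched as follows (a plain-Python stand-in, not the IDL implementation):

```python
def aperture_flux(image, mask, cx, cy, r):
    """Sum pixel values inside a circular aperture centered at (cx, cy),
    skipping masked pixels (mask True = contaminating source to exclude)."""
    total = 0.0
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r and not mask[y][x]:
                total += val
    return total
```

Because the mask is applied inside the aperture sum, the same mask (and aperture) can be reused on images of the source at other wavelengths, which is the repeatability the abstract emphasizes.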
Federal Parity and Access to Behavioral Health Care in Private Health Plans.
Hodgkin, Dominic; Horgan, Constance M; Stewart, Maureen T; Quinn, Amity E; Creedon, Timothy B; Reif, Sharon; Garnick, Deborah W
2018-04-01
The 2008 Mental Health Parity and Addiction Equity Act (MHPAEA) sought to improve access to behavioral health care by regulating health plans' coverage and management of services. Health plans have some discretion in how to achieve compliance with MHPAEA, leaving questions about its likely effects on health plan policies. In this study, the authors' objective was to determine how private health plans' coverage and management of behavioral health treatment changed after the federal parity law's full implementation. A nationally representative survey of commercial health plans was conducted in 60 market areas across the continental United States, achieving response rates of 89% in 2010 (weighted N=8,431) and 80% in 2014 (weighted N=6,974). Senior executives at responding plans were interviewed regarding behavioral health services in each year and (in 2014) regarding changes. Student's t tests were used to examine changes in services covered, cost-sharing, and prior authorization requirements for both behavioral health and general medical care. In 2014, 68% of insurance products reported having expanded behavioral health coverage since 2010. Exclusion of eating disorder coverage was eliminated between 2010 (23%) and 2014 (0%). However, more products reported excluding autism treatment in 2014 (24%) than 2010 (8%). Most plans reported no change to prior-authorization requirements between 2010 and 2014. Implementation of federal parity legislation appears to have been accompanied by continuing improvement in behavioral health coverage. The authors did not find evidence of widespread noncompliance or of unintended effects, such as dropping coverage of behavioral health care altogether.
Wide coverage biomedical event extraction using multiple partially overlapping corpora
2013-01-01
Background Biomedical events are key to understanding physiological processes and disease, and wide coverage extraction is required for comprehensive automatic analysis of statements describing biomedical systems in the literature. In turn, the training and evaluation of extraction methods requires manually annotated corpora. However, as manual annotation is time-consuming and expensive, any single event-annotated corpus can only cover a limited number of semantic types. Although combined use of several such corpora could potentially allow an extraction system to achieve broad semantic coverage, there has been little research into learning from multiple corpora with partially overlapping semantic annotation scopes. Results We propose a method for learning from multiple corpora with partial semantic annotation overlap, and implement this method to improve our existing event extraction system, EventMine. An evaluation using seven event annotated corpora, including 65 event types in total, shows that learning from overlapping corpora can produce a single, corpus-independent, wide coverage extraction system that outperforms systems trained on single corpora and exceeds previously reported results on two established event extraction tasks from the BioNLP Shared Task 2011. Conclusions The proposed method allows the training of a wide-coverage, state-of-the-art event extraction system from multiple corpora with partial semantic annotation overlap. The resulting single model makes broad-coverage extraction straightforward in practice by removing the need to either select a subset of compatible corpora or semantic types, or to merge results from several models trained on different individual corpora. Multi-corpus learning also allows annotation efforts to focus on covering additional semantic types, rather than aiming for exhaustive coverage in any single annotation effort, or extending the coverage of semantic types annotated in existing corpora. PMID:23731785
Kim, Heejun; Bian, Jiantao; Mostafa, Javed; Jonnalagadda, Siddhartha; Del Fiol, Guilherme
2016-01-01
Motivation: Clinicians need up-to-date evidence from high quality clinical trials to support clinical decisions. However, applying evidence from the primary literature requires significant effort. Objective: To examine the feasibility of automatically extracting key clinical trial information from ClinicalTrials.gov. Methods: We assessed the coverage of ClinicalTrials.gov for high quality clinical studies that are indexed in PubMed. Using 140 random ClinicalTrials.gov records, we developed and tested rules for the automatic extraction of key information. Results: The rate of high quality clinical trial registration in ClinicalTrials.gov increased from 0.2% in 2005 to 17% in 2015. Trials reporting results increased from 3% in 2005 to 19% in 2015. The accuracy of the automatic extraction algorithm for 10 trial attributes was 90% on average. Future research is needed to improve the algorithm accuracy and to design information displays to optimally present trial information to clinicians. PMID:28269867
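Rule-based extraction of trial attributes from a flat record can be sketched with a few regular expressions; the field labels and patterns here are illustrative, not the rules developed in the study:

```python
import re

# Hypothetical extraction rules for a few trial attributes; the labels echo
# ClinicalTrials.gov record fields, but the patterns are illustrative only.
RULES = {
    "enrollment": re.compile(r"Enrollment:\s*(\d+)"),
    "phase": re.compile(r"Phase:\s*(Phase \d(?:/Phase \d)?)"),
    "status": re.compile(r"Overall Status:\s*([A-Za-z, ]+)"),
}

def extract_trial_info(record_text):
    """Apply each rule to the flat record text; missing fields yield None."""
    out = {}
    for field, pattern in RULES.items():
        m = pattern.search(record_text)
        out[field] = m.group(1).strip() if m else None
    return out
```

Accuracy for such rules is then measured exactly as in the abstract: compare the extracted values against manually read values over a sample of records.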
An image-based approach for automatic detecting five true-leaves stage of cotton
NASA Astrophysics Data System (ADS)
Li, Yanan; Cao, Zhiguo; Wu, Xi; Yu, Zhenghong; Wang, Yu; Bai, Xiaodong
2013-10-01
Cotton, as one of the four major economic crops, is of great significance to the development of the national economy. Monitoring cotton growth status through automatic image-based detection is attractive because of its low cost, low labor requirements, and capability for continuous observation. However, little research has been done on close observation of the different growth stages of field crops using digital cameras. We therefore developed algorithms to detect growth information and automatically predict the starting dates of cotton growth stages. In this paper, we introduce an approach for automatically detecting the five-true-leaves stage, a critical growth stage of cotton. Because of the drawbacks caused by illumination and the complex background, global coverage cannot be used as the sole criterion of judgment. Consequently, we propose a new method that determines the five-true-leaves stage by detecting the number of nodes between the main stem and the side stems, based on the agricultural meteorological observation specification. The error between the starting date predicted by the proposed algorithm and manual observations is no more than one day.
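For context on the coverage criterion the abstract argues is insufficient on its own, here is a minimal sketch of image-based canopy-coverage estimation using the excess-green index, a common crop/soil segmentation baseline. The threshold and toy image are assumptions; the paper's node-counting method is not reproduced here:

```python
import numpy as np

# Excess-green index: ExG = 2G - R - B. Pixels above a threshold are
# classified as plant; the returned fraction is the "global coverage".
def canopy_coverage(rgb, threshold=20):
    """rgb: HxWx3 uint8 image; returns fraction of pixels classified as plant."""
    r, g, b = (rgb[..., i].astype(np.int32) for i in range(3))
    exg = 2 * g - r - b
    return float((exg > threshold).mean())

# Toy image: left half green "plant", right half grey "soil".
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:, :2] = (30, 180, 40)    # green
img[:, 2:] = (120, 120, 120)  # grey
print(canopy_coverage(img))
```

On the toy image exactly half the pixels are green, so the coverage is 0.5; under field illumination this single number is noisy, which motivates the paper's structural (node-counting) criterion.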
Automatic River Network Extraction from LIDAR Data
NASA Astrophysics Data System (ADS)
Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.
2016-06-01
The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) hydrography theme. The goal is an accurate and up-to-date river network, extracted as automatically as possible. For this purpose, IGN-ES has full LiDAR coverage of the whole Spanish territory at a density of 0.5 points per square meter. To implement this work, the technical feasibility was first validated, and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow-accumulation river network); production was then launched. The key challenges were managing a big-data environment of more than 160,000 LiDAR data files, along with the infrastructure to store (up to 40 Tb of results and intermediate files) and process them, using local virtualization and Amazon Web Services (AWS), which made this automatic production possible within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and human resources management were also important. The result of this production is an accurate, automatically extracted river network for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction, and its advantages over traditional vector extraction systems.
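The "hydrological criterion" (flow-accumulation river network) can be sketched with a toy D8 flow-accumulation routine: each cell drains to its steepest-descent neighbour, and channel cells are those whose accumulation exceeds a threshold. The tiny DEM below is an illustrative assumption, not IGN-ES production code, which works on 2 m LiDAR-derived grids:

```python
import numpy as np

def d8_flow_accumulation(dem):
    """D8 flow accumulation: process cells from high to low elevation,
    passing each cell's accumulated area to its steepest-descent neighbour."""
    h, w = dem.shape
    acc = np.ones_like(dem, dtype=np.int64)   # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]  # high to low: donors first
    for idx in order:
        i, j = divmod(int(idx), w)
        best_drop, target = 0.0, None
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < h and 0 <= nj < w:
                    drop = dem[i, j] - dem[ni, nj]
                    if drop > best_drop:
                        best_drop, target = drop, (ni, nj)
        if target is not None:
            acc[target] += acc[i, j]
    return acc

dem = np.array([[3., 3., 3.],
                [3., 2., 3.],
                [3., 1., 0.]])
print(d8_flow_accumulation(dem))
```

On this DEM all flow converges on the corner outlet, whose accumulation equals the grid's 9 cells; thresholding the accumulation grid yields the channel network.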
Automatic definition of the oncologic EHR data elements from NCIT in OWL.
Cuggia, Marc; Bourdé, Annabel; Turlin, Bruno; Vincendeau, Sebastien; Bertaud, Valerie; Bohec, Catherine; Duvauferrier, Régis
2011-01-01
Semantic interoperability based on ontologies allows systems to combine their information and process it automatically. The ability to extract meaningful fragments from an ontology is key to ontology re-use, and constructing a subset helps to structure clinical data entry. The aim of this work is to provide a method for extracting a set of concepts for a specific domain, in order to help define the data elements of an oncologic EHR. A generic extraction algorithm was developed to extract from the NCIT, for a specific disease (i.e., prostate neoplasm), all concepts of interest into a sub-ontology. We compared the extracted concepts with the manually encoded concepts contained in the multi-disciplinary meeting report form (MDMRF). We extracted two sub-ontologies: sub-ontology 1 using a single key concept, and sub-ontology 2 using 5 additional keywords. The coverage of the MDMRF concepts by sub-ontology 2 was 51%. This low coverage is due to the lack of definition or misclassification of NCIT concepts. By providing a subset of concepts focused on a particular domain, this extraction method helps to optimize the binding of data elements and to maintain and enrich a domain ontology.
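A sub-ontology extraction of this kind can be sketched as a breadth-first traversal of an is-a hierarchy from seed concepts, followed by a coverage computation against a manually encoded concept list. The toy hierarchy below is invented for illustration and is not actual NCIT content:

```python
from collections import deque

def extract_subontology(children, seeds):
    """children: concept -> list of narrower concepts; returns the set of
    concepts reachable from the seed concepts (the sub-ontology)."""
    seen, queue = set(seeds), deque(seeds)
    while queue:
        concept = queue.popleft()
        for child in children.get(concept, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

# Invented toy is-a hierarchy (not NCIT): concept -> narrower concepts.
children = {
    "Prostate Neoplasm": ["Prostate Carcinoma"],
    "Prostate Carcinoma": ["Prostate Adenocarcinoma"],
    "Gleason Score": [],
}
sub = extract_subontology(children, ["Prostate Neoplasm"])
manual = {"Prostate Carcinoma", "Gleason Score"}  # stand-in for MDMRF concepts
coverage = len(sub & manual) / len(manual)
print(sorted(sub), coverage)
```

Here "Gleason Score" is not reachable from the seed, so coverage is only 50%, mirroring how missing definitions or misclassified concepts depress the coverage figure reported in the study.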
An accurate method of extracting fat droplets in liver images for quantitative evaluation
NASA Astrophysics Data System (ADS)
Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie
2015-03-01
Steatosis in liver pathology tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and of the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring automatic and accurate classification of HCC images, because the presence of many fat droplets is likely to introduce errors when quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets using feature values of color, shape, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.
NASA Astrophysics Data System (ADS)
Azzoni, Roberto Sergio; Senese, Antonella; Zerboni, Andrea; Maugeri, Maurizio; Smiraglia, Claudio; Diolaiuti, Guglielmina Adele
2016-03-01
In spite of the quite abundant literature focusing on fine debris deposition over glacier accumulation areas, less attention has been paid to the glacier melting surface. Accordingly, we propose a novel method based on semi-automatic image analysis to estimate ice albedo from fine debris coverage (d). Our procedure was tested on the surface of a wide Alpine valley glacier (the Forni Glacier, Italy) in summer 2011, 2012 and 2013, acquiring parallel data sets of in situ measurements of ice albedo and high-resolution surface images. Analysis of 51 images yielded d values ranging from 0.01 to 0.63, and albedo was found to vary from 0.06 to 0.32. The estimated d values are in a linear relation with the natural logarithm of measured ice albedo (R = -0.84). The robustness of our approach in evaluating d was analyzed through five sensitivity tests, which showed that it is largely replicable. On the Forni Glacier, we also quantified a mean debris coverage rate (Cr) of 6 g m-2 per day during the ablation season of 2013, supporting previous studies that describe ongoing darkening at the surface of debris-free Alpine glaciers. In addition to debris coverage, we also considered the impact of water (both from melt and rainfall) as a factor that tunes albedo: meltwater occurs during the central hours of the day, decreasing the albedo due to its lower reflectivity; rainfall instead causes a subsequent mean daily albedo increase of slightly more than 20%, although it is short-lived (1 to 4 days).
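The reported linear relation between d and ln(albedo) suggests a simple workflow: fit the regression on paired measurements, then invert it to predict albedo from an estimated debris fraction. The numbers below are synthetic stand-ins for the paper's 51 image/albedo pairs:

```python
import numpy as np

# Synthetic (d, albedo) pairs chosen to mimic the reported ranges
# (d in 0.01-0.63, albedo in 0.06-0.32); NOT the paper's data.
d = np.array([0.02, 0.10, 0.25, 0.40, 0.60])
albedo = np.array([0.30, 0.22, 0.15, 0.10, 0.07])

# Fit ln(albedo) = slope * d + intercept, as in the reported relation.
slope, intercept = np.polyfit(d, np.log(albedo), 1)
r = np.corrcoef(d, np.log(albedo))[0, 1]  # correlation of d with ln(albedo)

def estimate_albedo(debris_fraction):
    """Invert the fitted relation to predict albedo from debris coverage."""
    return float(np.exp(slope * debris_fraction + intercept))

print(round(r, 3), round(estimate_albedo(0.25), 3))
```

With such a fit in hand, the semi-automatic image analysis only needs to deliver d, and albedo follows from the regression.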
Shenolikar, Rahul; Bruno, Amanda Schofield; Eaddy, Michael; Cantrell, Christopher
2011-01-01
Background Several studies have examined the impact of formulary management strategies on medication use in the elderly, but little has been done to synthesize the findings to determine whether the results show consistent trends. Objective To summarize the effects of formulary controls (ie, tiered copays, step edits, prior authorization, and generic substitution) on medication use in the Medicare population to inform future Medicare Part D and other coverage decisions. Methods This systematic review included research articles (found via PubMed, Google Scholar, and specific scientific journals) that evaluated the impact of drug coverage or cost-sharing on medication use in elderly (aged ≥65 years) Medicare beneficiaries. The impact of drug coverage was assessed by comparing patients with some drug coverage to those with no drug coverage or by comparing varying levels of drug coverage (eg, full coverage vs $1000 coverage or capped benefits vs noncapped benefits). Articles that were published before 1995, were not original empirical research, were published in languages other than English, or focused on populations other than Medicare beneficiaries were excluded. All studies selected were classified as positive, negative, or neutral based on the significance of the relationship (P <.05 or as otherwise specified) between the formulary control mechanism and the medication use, and on the direction of that relationship. Results Included were a total of 47 research articles (published between 1995 and 2009) that evaluated the impact of drug coverage or cost-sharing on medication use in Medicare beneficiaries. 
Overall, 24 studies examined the impact of the level of drug coverage on medication use; of these, 96% (N = 23) supported the association between better drug coverage (ie, branded and generic vs generic-only coverage, capped benefit vs noncapped benefit, supplemental drug insurance vs no supplemental drug insurance) or having some drug coverage and enhanced medication use. Furthermore, 84% (N = 16) of the 19 studies that examined the effect of cost-sharing on medication use demonstrated that decreased cost-sharing was significantly associated with improved medication use. Conclusion Current evidence from the literature suggests that restricting drug coverage or increasing out-of-pocket expenses for Medicare beneficiaries may lead to decreased medication use in the elderly, with all its potential implications. PMID:25126370
Expanding health insurance for children: examining the alternatives.
Fronstin, P; Pierron, B
1997-07-01
This Issue Brief examines the issue of uninsured children. The budget reconciliation legislation currently under congressional consideration earmarks $16 billion for new initiatives to provide health insurance coverage to approximately 5 million of the 10 million uninsured children during the next five years. Proposals to expand coverage among children include the use of tax credits, subsidies, vouchers, Medicaid program expansion, and expansion of state programs. However, these proposals do not address the decline in employment-based health insurance coverage--the underlying cause of the lack of coverage, to the extent that a cause can be identified. What is worse, some proposals to expand health insurance among children may discourage employers from offering coverage. Between 1987 and 1995, the percentage of children with employment-based health insurance declined from 66.7 percent to 58.6 percent. Despite this trend, the percentage of children without any form of health insurance coverage barely increased. In 1987, 13.1 percent were uninsured, compared with 13.8 percent in 1995. Medicaid program expansions helped to alleviate the effects of the decline in employment-based health insurance coverage among children and the potential increase in the number of uninsured children. Between 1987 and 1995, the percentage of children enrolled in the Medicaid program increased from 15.5 percent to 23.2 percent. Some questions to consider in assessing approaches to improving children's health insurance coverage include the following: If the government intervenes, should it do so through a compulsory mechanism or a voluntary system? Is the employment-based system "worth saving" for children? In other words, are the market interventions necessary to keep this system functioning for children too regulatory, too intrusive, and too cumbersome to be practical? 
In addition to reforming the employment-based system, what reforms are necessary in order to reach those families who have no coverage through the work place? Which approaches are both efficient and politically acceptable? Employment-based coverage of children will likely continue. The challenge for lawmakers is to find a way to cover more uninsured children without eroding employment-based coverage. Several current legislative proposals attempt to avoid this problem by excluding children who have access to employment-based coverage. Without such a requirement, the opportunity to purchase coverage at a discount would create incentives for some low-income employees to drop dependent/family coverage, which in turn could lead some employers to drop their health plans.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voet, Peter W. J.; Dirkx, Maarten L. P.; Breedveld, Sebastiaan
2013-07-15
Purpose: To compare IMRT planning strategies for prostate cancer patients with metal hip prostheses. Methods: All plans were generated fully automatically (i.e., no human trial-and-error interactions) using iCycle, the authors' in-house developed algorithm for multicriterial selection of beam angles and optimization of fluence profiles, allowing objective comparison of planning strategies. For 18 prostate cancer patients (eight with bilateral hip prostheses, ten with a right-sided unilateral prosthesis), two planning strategies were evaluated: (i) full exclusion of beams containing beamlets that would deliver dose to the target after passing through a prosthesis (IMRT_remove) and (ii) exclusion of those beamlets only (IMRT_cut). Plans with optimized coplanar and noncoplanar beam arrangements were generated. Differences in PTV coverage and sparing of organs at risk (OARs) were quantified. The impact of beam number on plan quality was evaluated. Results: Especially for patients with bilateral hip prostheses, IMRT_cut significantly improved rectum and bladder sparing compared to IMRT_remove. For 9-beam coplanar plans, rectum V_60Gy was reduced by 17.5% ± 15.0% (maximum 37.4%, p = 0.036) and rectum D_mean by 9.4% ± 7.8% (maximum 19.8%, p = 0.036). Further improvements in OAR sparing were achievable with noncoplanar beam setups, reducing rectum V_60Gy by another 4.6% ± 4.9% (p = 0.012) for noncoplanar 9-beam IMRT_cut plans. Large reductions in rectum dose were also observed when increasing the number of beam directions in the plans: for bilateral implants, rectum V_60Gy was 37.3% ± 12.1% for coplanar 7-beam plans and was reduced on average by 13.5% (maximum 30.1%, p = 0.012) for 15 directions. Conclusions: iCycle was able to automatically generate high quality plans for prostate cancer patients with prostheses. Excluding only the beamlets that passed through the prostheses (the IMRT_cut strategy) significantly improved OAR sparing. Noncoplanar beam arrangements and, to a larger extent, increasing the number of treatment beams further improved plan quality.
Terminologies for text-mining; an experiment in the lipoprotein metabolism domain
Alexopoulou, Dimitra; Wächter, Thomas; Pickersgill, Laura; Eyre, Cecilia; Schroeder, Michael
2008-01-01
Background The engineering of ontologies, especially with a view to text-mining use, is still a new research field. There does not yet exist a well-defined theory and technology for ontology construction. Many of the ontology design steps remain manual and are based on personal experience and intuition. However, there have been a few efforts toward automatic construction of ontologies, in the form of extracted lists of terms and relations between them. Results We share experience acquired during the manual development of a lipoprotein metabolism ontology (LMO) to be used for text-mining. We compare the manually created ontology terms with the terminology automatically derived by four different automatic term recognition (ATR) methods. The top 50 predicted terms contain up to 89% relevant terms. For the top 1000 terms the best method still generates 51% relevant terms. A corpus of 3066 documents contains 53% of the LMO terms, and 38% can be generated with one of the methods. Conclusions Given their high precision, automatic methods can help decrease development time and provide significant support for the identification of domain-specific vocabulary. The coverage of the domain vocabulary depends strongly on the underlying documents. Ontology development for text mining should be performed in a semi-automatic way, taking ATR results as input and following the guidelines we describe. Availability The TFIDF term recognition is available as a Web Service, described at PMID:18460175
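One of the ATR baselines compared, TF-IDF term ranking, can be sketched over a toy corpus. Tokenization is simplified to whitespace splitting and the documents are invented; real ATR pipelines first extract candidate noun phrases:

```python
import math
from collections import Counter

# Invented toy corpus standing in for lipoprotein-metabolism abstracts.
docs = [
    "ldl receptor binds ldl particles",
    "hdl cholesterol transport",
    "ldl receptor mutation causes hypercholesterolemia",
]

def tfidf_scores(documents):
    """Score each term by summed tf * idf across the corpus."""
    tokenized = [doc.split() for doc in documents]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(tokenized)
    scores = Counter()
    for toks in tokenized:
        tf = Counter(toks)
        for term, count in tf.items():
            scores[term] += count * math.log(n / df[term])
    return scores

top = [term for term, _ in tfidf_scores(docs).most_common(3)]
print(top)
```

Ranked lists like `top` are what the study compares, per method, against the manually created LMO terms to obtain the relevance percentages.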
Visual content highlighting via automatic extraction of embedded captions on MPEG compressed video
NASA Astrophysics Data System (ADS)
Yeo, Boon-Lock; Liu, Bede
1996-03-01
Embedded captions in TV programs such as news broadcasts, documentaries and coverage of sports events provide important information on the underlying events. In digital video libraries, such captions represent a highly condensed form of key information on the contents of the video. In this paper we propose a scheme to automatically detect the presence of captions embedded in video frames. The proposed method operates on reduced image sequences which are efficiently reconstructed from compressed MPEG video and thus does not require full frame decompression. The detection, extraction and analysis of embedded captions help to capture the highlights of visual contents in video documents for better organization of video, to present succinctly the important messages embedded in the images, and to facilitate browsing, searching and retrieval of relevant clips.
The role of ERTS in the establishment of a nationwide land cover information system
NASA Technical Reports Server (NTRS)
Abram, P.; Tullos, J.
1974-01-01
The economic potential of utilizing an ERTS-type satellite in the development, updating, and maintenance of a nationwide land cover information system in the post-1977 time frame was examined. Several alternative acquisition systems were evaluated for land cover data acquisition, processing, and interpretation costs in order to determine, on a total life-cycle cost basis, under which conditions of user demand (i.e., area of coverage, frequency of coverage, timeliness of information, and level of information detail) an ERTS-type satellite would be cost effective, and what the annual cost savings would be. It was concluded that a three-satellite system, combined with high- and low-altitude aircraft and ground survey teams and utilizing automatic interpretation and classification techniques, is an economically sound proposal.
Buti, Jacopo; Baccini, Michela; Nieri, Michele; La Marca, Michele; Pini-Prato, Giovan P
2013-04-01
The aim of this work was to conduct a Bayesian network meta-analysis (NM) of randomized controlled trials (RCTs) to establish a ranking in efficacy and the best technique for coronally advanced flap (CAF)-based root coverage procedures. A literature search on PubMed, Cochrane libraries, EMBASE, and hand-searched journals until June 2012 was conducted to identify RCTs on treatments of Miller Class I and II gingival recessions with at least 6 months of follow-up. The treatment outcomes were recession reduction (RecRed), clinical attachment gain (CALgain), keratinized tissue gain (KTgain), and complete root coverage (CRC). Twenty-nine studies met the inclusion criteria, 20 of which were classified as at high risk of bias. The CAF+connective tissue graft (CTG) combination ranked highest in effectiveness for RecRed (Probability of being the best = 40%) and CALgain (Pr = 33%); CAF+enamel matrix derivative (EMD) was slightly better for CRC; CAF+Collagen Matrix (CM) appeared effective for KTgain (Pr = 69%). Network inconsistency was low for all outcomes excluding CALgain. CAF+CTG might be considered the gold standard in root coverage procedures. The low amount of inconsistency gives support to the reliability of the present findings. © 2012 John Wiley & Sons A/S.
Gunja, Munira Z; Collins, Sara R; Doty, Michelle M; Beautel, Sophie
2017-08-01
ISSUE: Prior to the Affordable Care Act (ACA), one-third of women who tried to buy a health plan on their own were either turned down, charged a higher premium because of their health, or had specific health problems excluded from their plans. Beginning in 2010, ACA consumer protections, particularly coverage for preventive care screenings with no cost-sharing and a ban on plan benefit limits, improved the quality of health insurance for women. In 2014, the law’s major insurance reforms helped millions of women who did not have employer insurance to gain coverage through the ACA’s marketplaces or through Medicaid. GOALS: To examine the effects of ACA health reforms on women’s coverage and access to care. METHOD: Analysis of the Commonwealth Fund Biennial Health Insurance Surveys, 2001–2016. FINDINGS AND CONCLUSIONS: Women ages 19 to 64 who shopped for new coverage on their own found it significantly easier to find affordable plans in 2016 compared to 2010. The percentage of women who reported delaying or skipping needed care because of costs fell to an all-time low. Insured women were more likely than uninsured women to receive preventive screenings, including Pap tests and mammograms.
A Fast Approach for Stitching of Aerial Images
NASA Astrophysics Data System (ADS)
Moussa, A.; El-Sheimy, N.
2016-06-01
The last few years have witnessed an increasing volume of aerial image data because of the extensive improvements in Unmanned Aerial Vehicles (UAVs). These newly developed UAVs have led to a wide variety of applications. A fast assessment of the achieved coverage and overlap of the images acquired during a UAV flight mission is of great help in saving the time and cost of the subsequent steps, and fast automatic stitching of the acquired images helps to visually assess coverage and overlap during the mission. This paper proposes an automatic image stitching approach that creates a single overview stitched image from the images acquired during a UAV flight mission, along with a coverage image that represents the count of overlaps between the acquired images. The main challenge of such a task is the huge number of images typically involved: a short flight mission capturing one image per second can yield hundreds to thousands of images. The main focus of the proposed approach is to reduce the processing time of the image stitching procedure by exploiting the initial knowledge of the image positions provided by the navigation sensors. The approach also avoids solving for the transformation parameters of all photos simultaneously, which would otherwise require a long computation time. After extracting the points of interest of all involved images using the Scale-Invariant Feature Transform (SIFT) algorithm, the proposed approach uses the images' initial coordinates to build an incremental constrained Delaunay triangulation that represents the neighborhood of each image. This triangulation restricts matching to neighboring images and therefore reduces the time-consuming feature-matching step. The estimated relative orientation between the matched images is used to find a candidate seed image for the stitching process.
The pre-estimated transformation parameters of the images are employed successively in a growing fashion to create the stitched image and the coverage image. The proposed approach is implemented and tested using the images acquired through a UAV flight mission and the achieved results are presented and discussed.
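The neighbourhood-pruning idea, matching only images that are spatially adjacent according to navigation-sensor positions, can be sketched as follows. For brevity this stand-in links each image to its k nearest neighbours rather than building the incremental constrained Delaunay triangulation the paper uses; both serve the same purpose of avoiding all-pairs feature matching:

```python
import numpy as np

def neighbour_pairs(positions, k=2):
    """positions: Nx2 image centres from navigation sensors;
    returns the set of (i, j) candidate match pairs with i < j."""
    pos = np.asarray(positions, dtype=float)
    dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)  # an image is not its own neighbour
    pairs = set()
    for i in range(len(pos)):
        for j in np.argsort(dists[i])[:k]:
            pairs.add((min(i, int(j)), max(i, int(j))))
    return pairs

# Four camera positions along a flight line: each image is linked only to
# nearby images, so distant pairs are never sent to feature matching.
pts = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(sorted(neighbour_pairs(pts, k=1)))
```

With N images and k neighbours, the number of SIFT matching calls drops from O(N²) to O(kN), which is the source of the speed-up the paper targets.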
Rodwin, Victor G.
2003-01-01
The French health system combines universal coverage with a public–private mix of hospital and ambulatory care and a higher volume of service provision than in the United States. Although the system is far from perfect, its indicators of health status and consumer satisfaction are high; its expenditures, as a share of gross domestic product, are far lower than in the United States; and patients have an extraordinary degree of choice among providers. Lessons for the United States include the importance of government’s role in providing a statutory framework for universal health insurance; recognition that piecemeal reform can broaden a partial program (like Medicare) to cover, eventually, the entire population; and understanding that universal coverage can be achieved without excluding private insurers from the supplementary insurance market. PMID:12511380
Can Skin Allograft Occasionally Act as a Permanent Coverage in Deep Burns? A Pilot Study
Rezaei, Ezzatollah; Beiraghi-Toosi, Arash; Ahmadabadi, Ali; Tavousi, Seyed Hassan; Alipour Tabrizi, Arash; Fotuhi, Kazem; Jabbari Nooghabi, Mehdi; Manafi, Amir; Ahmadi Moghadam, Shokoofeh
2017-01-01
BACKGROUND Skin allograft is the gold standard of wound coverage in patients with extensive burns; however, it is considered a temporary wound coverage, and rejection of the skin allograft is considered inevitable. In our study, skin allograft as a permanent coverage in deep burns is evaluated. METHODS Skin allograft survival was assessed retrospectively in 38 patients from March 2009 to March 2014. Because of the lack of tissue specimens from the skin donors, patients with long skin allograft survival in whom the donor and recipient were of the same gender were excluded. Seven cases with long allograft survival and opposite-gender donor and recipient were finally enrolled. Polymerase chain reaction (PCR) tests on biopsy specimens from recipients and donors were undertaken. RESULTS PCR on the biopsy specimens from recipients confirmed that those specimens belonged to the donors. All patients received allografts from the opposite sex. Two (28.57%) patients received allografts from their first-degree blood relatives, and in one (14.29%) case, the allograft was harvested from a living individual with no blood relation. The rest were harvested from multiorgan donors. In eight months of follow-up, no clinical evidence of graft rejection was noted. CONCLUSION Long-term persistence of skin allografts in patients is worthy of more attention. Further studies increasing knowledge of the factors influencing this longevity could realize the dream of burn surgeons: a permanent coverage other than autograft for major burn patients. PMID:28289620
NASA Astrophysics Data System (ADS)
Kortström, Jari; Tiira, Timo; Kaisko, Outi
2016-03-01
The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the locations announced by environmental authorities and companies and the automatic locations, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be detected automatically. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.
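A common building block of automatic seismic detection systems like the one described is the STA/LTA trigger, which flags samples where short-term average energy jumps relative to the long-term background. The sketch below is a generic illustration on synthetic data, not the OBF system's actual detector, and its window lengths and threshold are assumptions:

```python
import numpy as np

def sta_lta(signal, n_sta=5, n_lta=50, threshold=8.0):
    """Return sample indices where the ratio of the short-term average (STA)
    energy just after i to the long-term average (LTA) before i exceeds
    the threshold."""
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    triggers = []
    for i in range(n_lta, len(energy) - n_sta + 1):
        sta = (csum[i + n_sta] - csum[i]) / n_sta   # short window at i
        lta = (csum[i] - csum[i - n_lta]) / n_lta   # background window
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 200)
trace[120:130] += 12.0  # synthetic "event" riding on the noise
print(sta_lta(trace)[:3])  # triggers should cluster near the inserted event
```

Detections from several stations are then associated and located, which is where the reported 2.9 km average location error is measured.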
Pipeline Reduction of Binary Light Curves from Large-Scale Surveys
NASA Astrophysics Data System (ADS)
Prša, Andrej; Zwitter, Tomaž
2007-08-01
One of the most important changes in observational astronomy of the 21st century is a rapid shift from classical object-by-object observations to extensive automatic surveys. As CCD detectors are getting better and their prices are getting lower, more and more small and medium-size observatories are refocusing their attention to detection of stellar variability through systematic sky-scanning missions. This trend is additionally powered by the success of pioneering surveys such as ASAS, DENIS, OGLE, TASS, their space counterpart Hipparcos and others. Such surveys produce massive amounts of data and it is not at all clear how these data are to be reduced and analysed. This is especially striking in the eclipsing binary (EB) field, where most frequently used tools are optimized for object-by-object analysis. A clear need for thorough, reliable and fully automated approaches to modeling and analysis of EB data is thus obvious. This task is very difficult because of limited data quality, non-uniform phase coverage and parameter degeneracy. The talk will review recent advancements in putting together semi-automatic and fully automatic pipelines for EB data processing. Automatic procedures have already been used to process the Hipparcos data, LMC/SMC observations, OGLE and ASAS catalogs etc. We shall discuss the advantages and shortcomings of these procedures and overview the current status of automatic EB modeling pipelines for the upcoming missions such as CoRoT, Kepler, Gaia and others.
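One elementary, fully automatable step in any EB pipeline is phase-folding the observations on a candidate orbital period, which converts sparse, non-uniform time coverage into phase coverage of a single cycle. A minimal sketch (the times and period below are invented):

```python
import numpy as np

def phase_fold(times, period, t0=0.0):
    """Map observation times to orbital phase in [0, 1)."""
    return np.mod((np.asarray(times, dtype=float) - t0) / period, 1.0)

# Invented observation times (days) folded on an assumed 2.6-day period.
times = np.array([0.0, 1.3, 2.6, 10.4])
print(phase_fold(times, period=2.6))
```

Points separated by whole multiples of the period land on the same phase, so eclipses recorded nights apart line up in the folded light curve, ready for automated model fitting.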
Maintaining and Enhancing Diversity of Sampled Protein Conformations in Robotics-Inspired Methods.
Abella, Jayvee R; Moll, Mark; Kavraki, Lydia E
2018-01-01
The ability to efficiently sample structurally diverse protein conformations allows one to gain a high-level view of a protein's energy landscape. Algorithms from robot motion planning have been used for conformational sampling, and several of these algorithms promote diversity by keeping track of "coverage" in conformational space based on the local sampling density. However, large proteins present special challenges. In particular, larger systems require running many concurrent instances of these algorithms, but these algorithms can quickly become memory intensive because they typically keep previously sampled conformations in memory to maintain coverage estimates. In addition, robotics-inspired algorithms depend on defining useful perturbation strategies for exploring the conformational space, which is a difficult task for large proteins because such systems are typically more constrained and exhibit complex motions. In this article, we introduce two methodologies for maintaining and enhancing diversity in robotics-inspired conformational sampling. The first method addresses algorithms based on coverage estimates and leverages the use of a low-dimensional projection to define a global coverage grid that maintains coverage across concurrent runs of sampling. The second method is an automatic definition of a perturbation strategy through readily available flexibility information derived from B-factors, secondary structure, and rigidity analysis. Our results show a significant increase in the diversity of the conformations sampled for proteins consisting of up to 500 residues when applied to a specific robotics-inspired algorithm for conformational sampling. The methodologies presented in this article may be vital components for the scalability of robotics-inspired approaches.
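The first methodology, a global coverage grid over a low-dimensional projection, can be sketched as follows. The fixed random projection here stands in for whatever projection an application would choose, and all names are illustrative; the key property is that memory grows with the number of visited grid cells, not with the number of sampled conformations:

```python
import numpy as np

class CoverageGrid:
    """Hash projected conformations into 2-D grid cells and count visits,
    so concurrent sampling runs can share one compact coverage estimate."""

    def __init__(self, dim, cell=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.proj = rng.normal(size=(dim, 2)) / np.sqrt(dim)  # stand-in projection
        self.cell = cell
        self.counts = {}  # cell key -> visit count (no conformations stored)

    def add(self, conformation):
        x, y = np.asarray(conformation, dtype=float) @ self.proj
        key = (int(np.floor(x / self.cell)), int(np.floor(y / self.cell)))
        self.counts[key] = self.counts.get(key, 0) + 1
        return key

    def coverage(self):
        """Number of distinct cells visited so far."""
        return len(self.counts)

grid = CoverageGrid(dim=30)
rng = np.random.default_rng(1)
for _ in range(100):          # simulate 100 sampled "conformations"
    grid.add(rng.normal(size=30))
print(grid.coverage())
```

A sampler biased toward low-count cells would then preferentially expand under-explored regions, which is the diversity-promoting behaviour the article describes.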
NASA Technical Reports Server (NTRS)
1976-01-01
A set of planning guidelines is presented to help law enforcement agencies and vehicle fleet operators decide which automatic vehicle monitoring (AVM) system could best meet their performance requirements. Improvements in emergency response times and the resulting cost benefits obtainable with various operational and planned AVM systems may be synthesized and simulated by means of special computer programs for model-city parameters applicable to small, medium and large urban areas. Design characteristics of various AVM systems and their implementation requirements are illustrated, and costs are estimated for the vehicles, the fixed sites and the base equipment. Vehicle location accuracies for different RF links and polling intervals are analyzed. Actual applications and coverage data are tabulated for seven cities whose police departments actively cooperated in the study.
Summarizing an Ontology: A "Big Knowledge" Coverage Approach.
Zheng, Ling; Perl, Yehoshua; Elhanan, Gai; Ochs, Christopher; Geller, James; Halper, Michael
2017-01-01
Maintenance and use of a large ontology, consisting of thousands of knowledge assertions, are hampered by its scope and complexity. It is important to provide tools for summarization of ontology content in order to facilitate user "big picture" comprehension. We present a parameterized methodology for the semi-automatic summarization of major topics in an ontology, based on a compact summary of the ontology, called an "aggregate partial-area taxonomy", followed by manual enhancement. An experiment is presented to test the effectiveness of such summarization, measured by coverage of a given list of major topics of the corresponding application domain. SNOMED CT's Specimen hierarchy is the test-bed. A domain expert provided a list of topics that serves as a gold standard. The enhanced results show that the aggregate taxonomy covers most of the domain's main topics.
Repliscan: a tool for classifying replication timing regions.
Zynda, Gregory J; Song, Jawon; Concia, Lorenzo; Wear, Emily E; Hanley-Bowdoin, Linda; Thompson, William F; Vaughn, Matthew W
2017-08-07
Replication timing experiments that use label incorporation and high throughput sequencing produce peaked data similar to ChIP-Seq experiments. However, the differences in experimental design, coverage density, and possible results make traditional ChIP-Seq analysis methods inappropriate for use with replication timing. To accurately detect and classify regions of replication across the genome, we present Repliscan. Repliscan robustly normalizes, automatically removes outlying and uninformative data points, and classifies Repli-seq signals into discrete combinations of replication signatures. The quality control steps and self-fitting methods make Repliscan generally applicable and more robust than previous methods that classify regions based on thresholds. Repliscan is simple and effective to use on organisms with different genome sizes. Even with analysis window sizes as small as 1 kilobase, reliable profiles can be generated with as little as 2.4x coverage.
Host Genes and Resistance/Sensitivity to Military Priority Pathogens
2011-06-01
tularensis (FT Schu S4) that yields a significantly different outcome to infection in B6 and D2 mice. Both strains succumb to infection at essentially the...Figure 2). Some of the group sizes are too small to yield statistically relevant findings, and additional studies will be performed with these strains as...generated approximately 100-fold coverage of the DBA/2J genome (Table 2) and sequenced 99.96% of the DBA/2J genome (excluding gaps in the reference
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element, mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, because the devices and systems of an automatic rotor line can fail, there is always a real probability that defective (incomplete) products enter the output process stream. Continuous sampling control of product completeness, based on statistical methods, therefore remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A distinctive feature of continuous sampling control of multi-element product completeness during assembly is that the inspection is destructive, which excludes returning component parts to the process stream after control and lowers the actual productivity of the assembly equipment. The use of statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines therefore requires sampling plans that ensure a minimum control-sample size. Comparison of the limits of the average outgoing defect level for the continuous sampling plan (CSP) and the automated continuous sampling plan (ACSP) shows that ACSP-1 can provide a lower limit on the average outgoing defect level. The average sample size under the ACSP-1 plan is also smaller than under the CSP-1 plan. Thus, applying the proposed plans and methods for continuous sampling control to the quality management of multi-element product assembly on automatic rotor lines makes it possible to automate sampling-control procedures and maintain the required quality level of assembled products while minimizing sample size.
Burns, Rachel M.; Pacula, Rosalie L.; Bauhoff, Sebastian; Gordon, Adam J.; Hendrikson, Hollie; Leslie, Douglas L.; Stein, Bradley D.
2015-01-01
Background State Medicaid policies play an important role in Medicaid-enrollees' access to and use of opioid agonists, such as methadone and buprenorphine, in the treatment of opioid use disorders. Little information is available, however, regarding the evolution of state policies facilitating or hindering access to opioid agonists among Medicaid-enrollees. Methods During 2013-14, we surveyed state Medicaid officials and other designated state substance abuse treatment specialists about their state's recent history of Medicaid coverage and policies pertaining to methadone and buprenorphine. We describe the evolution of such coverage and policies and present an overview of the Medicaid policy environment with respect to opioid agonist therapy from 2004 to 2013. Results Among our sample of 45 states with information on buprenorphine and methadone coverage, we found a gradual trend toward adoption of coverage for opioid agonist therapies in state Medicaid agencies. In 2013, only 11% of states in our sample (n=5) had Medicaid policies that excluded coverage for methadone and buprenorphine, while 71% (n=32) had adopted or maintained policies to cover both buprenorphine and methadone among Medicaid-enrollees. We also noted an increase in policies over the time period that may have hindered access to buprenorphine and/or methadone. Conclusions There appears to be a trend for states to enact policies increasing Medicaid coverage of opioid agonist therapies, while in recent years also enacting policies, such as prior authorization requirements, that potentially serve as barriers to opioid agonist therapy utilization. Greater empirical information about the potential benefits and potential unintended consequences of such policies can provide policymakers and others with a more informed understanding of their policy decisions. PMID:26566761
Bhatia, Amiya; Ferreira, Leonardo Zanini; Barros, Aluísio J D; Victora, Cesar Gomes
2017-08-18
Birth registration, and the possession of a birth certificate as proof of registration, has long been recognized as a fundamental human right. Data from a functioning civil registration and vital statistics (CRVS) system allows governments to benefit from accurate and universal data on birth and death rates. However, access to birth certificates remains challenging and unequal in many low and middle-income countries. This paper examines wealth, urban/rural and gender inequalities in birth certificate coverage. We analyzed nationally representative household surveys from 94 countries between 2000 and 2014 using Demographic Health Surveys and Multiple Indicator Cluster Surveys. Birth certificate coverage among children under five was examined at the national and regional level. Absolute measures of inequality were used to measure inequalities in birth certificate coverage by wealth quintile, urban/rural residence and sex of the child. Over four million children were included in the analysis. Birth certificate coverage was over 90% in 29 countries and below 50% in 36 countries, indicating that more than half the children under five surveyed in these countries did not have a birth certificate. Eastern & Southern Africa had the lowest average birth certificate coverage (26.9%) with important variability among countries. Significant wealth inequalities in birth certificate coverage were observed in 74 countries and in most UNICEF regions, and urban/rural inequalities were present in 60 countries. Differences in birth certificate coverage between girls and boys tended to be small. We show that wealth and urban/rural inequalities in birth certificate coverage persist in most low and middle income countries, including countries where national birth certificate coverage is between 60 and 80%. 
Weak CRVS systems, particularly in South Asia and Africa, leave rural and poor children systematically excluded from the benefits tied to a birth certificate and prevent these children from being counted in national health data. Greater funding and attention are needed to strengthen CRVS systems, and equity analyses should inform such efforts, especially as data needs for the Sustainable Development Goals expand. Monitoring disaggregated data on birth certificate coverage is essential to reducing inequalities in who is counted and registered. Strengthening CRVS systems can enable a child's right to identity, improve health data and promote equity.
Policy Choices for Progressive Realization of Universal Health Coverage
Tangcharoensathien, Viroj; Patcharanarumol, Walaiporn; Panichkriangkrai, Warisa; Sommanustweechai, Angkana
2017-01-01
In response to Norheim’s editorial, this commentary offers reflections from Thailand on how the five unacceptable trade-offs were applied to the universal health coverage (UHC) reforms between 1975 and 2002, when the whole population of 64 million people came to be covered by one of three public health insurance systems. This commentary aims to generate global discussion on how UHC can best be achieved gradually. Beyond the five proposed discrete trade-offs within each dimension, there are also trade-offs between the three dimensions of UHC: population coverage, service coverage and cost coverage. Findings from Thai UHC show that equity guided the extension of population coverage: low-income households and the informal sector were the priority population groups for coverage extension by different prepayment schemes in 1975 and 1984, respectively. The exception was public sector employees, who were historically covered as part of their fringe benefits, well before the poor. Private sector employees were covered last, in 1990. Historically, Thailand applied a comprehensive benefit package from which a few items were excluded using a negative list, until improved capacity for health technology assessment allowed cost-effectiveness to be used for the inclusion of new interventions in the benefit package. Not only cost-effectiveness but also long-term budget impact, equity and ethical considerations are taken into account. Cost coverage is mostly determined by fiscal capacity. A close-ended budget with a mix of provider payment methods is used as a tool to trade off service coverage against financial risk protection. Introducing copayment in the context of fee-for-service can harm beneficiaries through supplier-induced demand, inefficiency and unpredictable out-of-pocket payments by households.
UHC achieved favorable outcomes because it was implemented when there was full geographical coverage of primary healthcare in all districts and sub-districts, after three decades of health infrastructure investment and health workforce development since the 1980s. The legacy of targeting population groups through different prepayment mechanisms, which led to fragmentation, discrepancies and inequity across schemes, could have been rectified by harmonization at the early phase when these schemes were introduced. Robust public accountability and participation mechanisms are recommended when deciding the UHC strategy. PMID:28812786
Assessing a computerized routine health information system in Mali using LQAS.
Stewart, J C; Schroeder, D G; Marsh, D R; Allhasane, S; Kone, D
2001-09-01
Between 1987 and 1998 Save the Children conducted a child survival programme in Mali with the goal of reducing maternal and child morbidity and mortality. An integral part of this programme was a computerized demographic surveillance and health information system (HIS) that gathered data on individuals on an ongoing basis. The objectives were to assess the overall coverage and quality of the data in the HIS, to identify specific health districts that needed improvements in data collection methods, and to determine particular areas of weakness in data collection. Random samples of 20 mothers with children <5 years were selected in each of 14 health districts. Mothers were interviewed about pregnancies, live births, deaths of children <5, and children's growth monitoring and immunization status. Lot Quality Assurance Sampling (LQAS) was used to identify districts in which records and interview results did not meet predetermined levels of acceptability. Data collected in the interviews were combined to estimate overall coverage and quality. When all variables were analyzed, all 14 lots were rejected, and it was estimated that 52% of all events occurring in the community were registered in ProMIS. Much of this poor performance was due to immunization and growth monitoring data, which were not updated due to printer problems. Coverage of events increased (92%) when immunizations and growth monitoring were excluded, and no lots were rejected. When all variables were analyzed for quality of data recorded, six lots were rejected and the overall estimation was 83%. With immunizations and growth monitoring excluded, overall quality was 86% and no lots were rejected. The comprehensive computerized HIS did not meet expectations. This may be due, in part, to the ambitious objective of complete and intensive monitoring of a large population without adequate staff and equipment. 
Future efforts should consider employing a more targeted and streamlined HIS so that data can be more complete and useful.
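The LQAS decision rule the study applies can be sketched as follows. The sample size of 20 per district matches the study, but the decision threshold shown is an illustrative assumption, not the value used in Mali.

```python
def lqas_reject(sample_results, decision_threshold):
    """Reject the lot (here, a health district) if the number of failures
    (events missing from the HIS) in the sample exceeds the threshold."""
    failures = sum(1 for ok in sample_results if not ok)
    return failures > decision_threshold

# Example: a district sample of 20 records, 7 of which were missing
district = [True] * 13 + [False] * 7
print(lqas_reject(district, decision_threshold=5))  # True -> district rejected
```

In practice the threshold is chosen from binomial tables to balance the risks of wrongly accepting a poorly performing district and wrongly rejecting an adequate one.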
Decision-level fusion of SAR and IR sensor information for automatic target detection
NASA Astrophysics Data System (ADS)
Cho, Young-Rae; Yim, Sung-Hyuk; Cho, Hyun-Woong; Won, Jin-Ju; Song, Woo-Jin; Kim, So-Hyeon
2017-05-01
We propose a decision-level architecture that combines synthetic aperture radar (SAR) and an infrared (IR) sensor for automatic target detection. We present a new size-based feature, called the target silhouette, to reduce the number of false alarms produced by the conventional target-detection algorithm. Boolean Map Visual Theory is used to combine a pair of SAR and IR images to generate a target-enhanced map, and basic belief assignment is then used to transform this map into a belief map. The detection results of the sensors are combined to build the target-silhouette map. We integrate the fusion mass and the target-silhouette map at the decision level to exclude false alarms. The proposed algorithm is evaluated on a synthetic SAR and IR database generated by the SE-WORKBENCH simulator and compared with conventional algorithms. The proposed fusion scheme achieves a higher detection rate and a lower false-alarm rate than the conventional algorithms.
An automatic approach to exclude interlopers from asteroid families
NASA Astrophysics Data System (ADS)
Radović, Viktor; Novaković, Bojan; Carruba, Valerio; Marčeta, Dušan
2017-09-01
Asteroid families are a valuable source of information for many asteroid-related studies, assuming a reliable list of their members can be obtained. However, as the number of known asteroids increases rapidly, it becomes more and more difficult to obtain a robust list of members of an asteroid family. Here, we propose a new approach to the problem, based on the well-known hierarchical clustering method. An additional step is introduced into the procedure in order to reduce the so-called chaining effect: the main idea is to prevent chaining through an already identified interloper. We show that in this way the number of potential interlopers among family members is significantly reduced. Moreover, we developed an automatic online portal to apply this procedure, i.e. to generate a list of family members as well as a list of potential interlopers. The Asteroid Families Portal is freely available to all interested researchers.
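The modified linking step can be sketched as follows. The distance-graph representation and the rule that identified interlopers are never added (and hence never chained through) are illustrative assumptions, not the authors' exact implementation.

```python
def hcm_members(distances, seed, d_cut, interlopers=frozenset()):
    """Hierarchical-clustering-style membership: grow the family from a seed,
    linking any body within d_cut of a current member, but never through a
    body flagged as an interloper (so chains cannot pass through it)."""
    members, frontier = {seed}, [seed]
    while frontier:
        current = frontier.pop()
        for other, d in distances.get(current, {}).items():
            if other in interlopers or other in members:
                continue
            if d <= d_cut:
                members.add(other)
                frontier.append(other)
    return members

# Toy metric: 'c' is only reachable from 'a' by chaining through 'b'
dist = {'a': {'b': 1.0, 'c': 5.0},
        'b': {'a': 1.0, 'c': 1.0},
        'c': {'b': 1.0, 'a': 5.0}}
print(hcm_members(dist, 'a', 2.0))                      # {'a', 'b', 'c'}
print(hcm_members(dist, 'a', 2.0, interlopers={'b'}))   # {'a'}
```

The second call shows the intended effect: once 'b' is flagged as an interloper, 'c' can no longer be chained into the family through it.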
2013-01-01
Malaria vectors which predominantly feed indoors upon humans have been locally eliminated from several settings with insecticide treated nets (ITNs), indoor residual spraying or larval source management. Recent dramatic declines of An. gambiae in east Africa with imperfect ITN coverage suggest mosquito populations can rapidly collapse when forced below realistically achievable, non-zero thresholds of density and supporting resource availability. Here we explain why insecticide-based mosquito elimination strategies are feasible, desirable and can be extended to a wider variety of species by expanding the vector control arsenal to cover a broader spectrum of the resources they need to survive. The greatest advantage of eliminating mosquitoes, rather than merely controlling them, is that this precludes local selection for behavioural or physiological resistance traits. The greatest challenges are therefore to achieve high biological coverage of targeted resources rapidly enough to prevent local emergence of resistance and to then continually exclude, monitor for and respond to re-invasion from external populations. PMID:23758937
Smith, Peter M; Mustard, Cameron A; Payne, Jennifer I
2004-01-01
This paper presents a methodology for estimating the size and composition of the Ontario labour force eligible for coverage under the Ontario Workplace Safety & Insurance Act (WSIA). Using customized tabulations from Statistics Canada's Labour Force Survey (LFS), we made adjustments for self-employment, unemployment, part-time employment and employment in specific industrial sectors excluded from insurance coverage under the WSIA. Each adjustment to the LFS reduced the estimates of the insured labour force relative to the total Ontario labour force. These estimates were then developed for major occupational and industrial groups stratified by gender. Additional estimates created to test assumptions used in the methodology produced similar results. The methods described in this paper advance those previously used to estimate the insured labour force, providing researchers with a useful tool to describe trends in the rate of injury across differing occupational, industrial and gender groups in Ontario.
Cheong, Jadeera Phaik Geok; Khoo, Selina; Razman, Rizal
2016-01-01
This study analyzed newspaper coverage of the 2012 London Paralympic Games by 8 Malaysian newspapers. Articles and photographs from 4 English-language and 4 Malay-language newspapers were examined from August 28 (1 day before the Games) to September 10, 2012 (1 day after the Games closing). Tables, graphs, letters, fact boxes, and lists of events were excluded from analysis. A total of 132 articles and 131 photographs were analyzed. Content analysis of the newspaper articles revealed that most (62.8%) of the articles contained positive reference to the athletes with a disability. There were equal numbers (39.1%) of action and static shots of athletes. More articles and photographs of Malaysian (58%) than non-Malaysian (42%) athletes with a disability were identified. Only 14.9% of the articles and photographs were related to female athletes with a disability.
Influence of Media on Seasonal Influenza Epidemic Curves.
Saito, Satoshi; Saito, Norihiro; Itoga, Masamichi; Ozaki, Hiromi; Kimura, Toshiyuki; Okamura, Yuji; Murakami, Hiroshi; Kayaba, Hiroyuki
2016-09-01
Theoretical investigations predicting the epidemic curves of seasonal influenza have been reported; however, there is little empirical research using the epidemic curves accumulated to date. The effects of vaccine coverage and information distribution on influenza epidemics were evaluated. Four indices for epidemics (i.e., onset-peak duration, onset-end duration, ratio of the onset-peak duration to the onset-end duration, and steepness of the epidemic curve) were defined, and the correlations between these indices and the anti-flu drug prescription dose, vaccine coverage, the volume of media coverage, and internet search trends on influenza were analyzed. Epidemiological data on seasonal influenza epidemics from 2002/2003 to 2013/2014, excluding the 2009/2010 season, were collected from the National Institute of Infectious Diseases of Japan. The onset-peak duration and its ratio to the onset-end duration correlated inversely with the volume of anti-flu drug prescription. The onset-peak duration correlated positively with media information volume on influenza. The steepness of the epidemic curve and the anti-flu drug prescription dose correlated inversely with the volume of media information. Pre-epidemic search trends and media volume on influenza correlated with vaccine coverage in the season. Vaccine coverage had no strong effect on the epidemic curve. Education through media has an effect on the epidemic curve of seasonal influenza. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
Rosinski, Alexander Anthony; Narine, Steven; Yamey, Gavin
2013-01-01
Background In 2010, diarrhea caused 0.75 million child deaths, accounting for nearly 12% of all under-five mortality worldwide. Many evidence-based interventions can reduce diarrhea mortality, including oral rehydration solution (ORS), zinc, and improved sanitation. Yet global coverage levels of such interventions remain low. A new scorecard of diarrhea control, showing how different countries are performing in their control efforts, could draw greater attention to the low coverage levels of proven interventions. Methods We conducted in-depth qualitative interviews with 21 experts, purposively sampled for their relevant academic or implementation expertise, to explore their views on (a) the value of a scorecard of global diarrhea control and (b) which indicators should be included in such a scorecard. We then conducted a ranking exercise in which we compiled a list of all 49 indicators suggested by the experts, sent the list to the 21 experts, and asked them to choose 10 indicators that they would include and 10 that they would exclude from such a scorecard. Finally, we created a “prototype” scorecard based on the 9 highest-ranked indicators. Results Key themes that emerged from coding the interview transcripts were: a scorecard could facilitate country comparisons; it could help to identify best practices, set priorities, and spur donor action; and it could help with goal-setting and accountability in diarrhea control. The nine highest ranking indicators, in descending order, were ORS coverage, rotavirus vaccine coverage, zinc coverage, diarrhea-specific mortality rate, diarrhea prevalence, proportion of population with access to improved sanitation, proportion with access to improved drinking water, exclusive breastfeeding coverage, and measles vaccine coverage. 
Conclusion A new scorecard of global diarrhea control could help track progress, focus prevention and treatment efforts on the most effective interventions, establish transparency and accountability, and alert donors and ministries of health to inadequacies in diarrhea control efforts. PMID:23874412
Automated coronary artery calcification detection on low-dose chest CT images
NASA Astrophysics Data System (ADS)
Xie, Yiting; Cham, Matthew D.; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.
2014-03-01
Coronary artery calcification (CAC) measurement from low-dose CT images can be used to assess the risk of coronary artery disease. A fully automatic algorithm to detect and measure CAC from low-dose non-contrast, non-ECG-gated chest CT scans is presented. Based on the automatically detected CAC, the Agatston score (AS), mass score and volume score were computed. These were compared with scores obtained manually from standard-dose ECG-gated scans and low-dose un-gated scans of the same patient. The automatic algorithm segments the heart region based on other pre-segmented organs to provide a coronary region mask. The mitral valve and aortic valve calcification is identified and excluded. All remaining voxels greater than 180 HU within the mask region are considered as CAC candidates. The heart segmentation algorithm was evaluated on 400 non-contrast cases with both low-dose and regular dose CT scans. By visual inspection, 371 (92.8%) of the segmentations were acceptable. The automated CAC detection algorithm was evaluated on 41 low-dose non-contrast CT scans. Manual markings were performed on both low-dose and standard-dose scans for these cases. Using linear regression, the correlation of the automatic AS with the standard-dose manual scores was 0.86; with the low-dose manual scores the correlation was 0.91. Standard risk categories were also computed. The automated method risk category agreed with manual markings of gated scans for 24 cases while 15 cases were 1 category off. For low-dose scans, the automatic method agreed with 33 cases while 7 cases were 1 category off.
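As context for the Agatston scores compared above, the standard scoring rule can be sketched as follows. Note that conventional Agatston scoring on gated scans uses a 130 HU floor; the 180 HU figure in the abstract is the paper's candidate-voxel threshold, not the weighting scheme, and the per-lesion representation below is an illustrative simplification.

```python
def agatston_lesion_score(peak_hu, area_mm2):
    """Agatston contribution of one calcified lesion on one slice: its area
    (mm^2) weighted by a factor set by the lesion's peak attenuation (HU)."""
    if peak_hu < 130:
        return 0.0  # below the conventional calcification floor
    weight = 1 if peak_hu < 200 else 2 if peak_hu < 300 else 3 if peak_hu < 400 else 4
    return weight * area_mm2

def agatston_score(lesions):
    """Total score: sum of per-lesion contributions across all slices."""
    return sum(agatston_lesion_score(hu, area) for hu, area in lesions)

# Two lesions: peak 250 HU over 10 mm^2, and peak 450 HU over 5 mm^2
print(agatston_score([(250, 10.0), (450, 5.0)]))  # 40.0
```

Risk categories (e.g. 0, 1-100, 101-400, >400) are then read off the total score, which is the level at which the paper compares automatic and manual results.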
NASA Astrophysics Data System (ADS)
Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.
2016-03-01
Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed to assess the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
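The first, univariate non-parametric detector described above relies on box-whisker fences. A minimal sketch of that rule, with the conventional 1.5x IQR fence multiplier assumed (the abstract does not state the value used):

```python
import statistics

def box_whisker_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR], the usual
    box-and-whisker fences (k = 1.5 by convention)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# A feature value far from the bulk of the data is flagged as a failure
print(box_whisker_outliers([1, 2, 3, 4, 5, 100]))  # [100]
```

Applied per feature, any scan whose feature value falls outside the fences would be flagged as a candidate segmentation failure for review.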
Chen, Ke-ping; Xu, Geng; Wu, Shulin; Tang, Baopeng; Wang, Li; Zhang, Shu
2013-03-01
The present study assessed the accuracy of automatic atrial and ventricular capture management (ACM and VCM) in determining pacing thresholds and the performance of a second-generation automatic atrioventricular (AV) interval extension algorithm for reducing unnecessary ventricular pacing. A total of 398 patients at 32 centres who received an EnPulse dual-chamber pacing/dual-chamber adaptive rate pacing pacemaker (Medtronic, Minneapolis, MN, USA) were enrolled. The last amplitude thresholds as measured by ACM and VCM prior to the 6-month follow-up were compared with manually measured thresholds. Device diagnostics were used to evaluate ACM and VCM and the percentage of ventricular pacing with and without the AV extension algorithm. Modelling was performed to assess longevity gains relating to the use of automaticity features. Atrial and ventricular capture management performed accurately and reliably, providing complete capture management in 97% of studied patients. The AV interval extension algorithm reduced the median per cent of right ventricular pacing in patients with sinus node dysfunction from 99.7 to 1.5% at 6-month follow-up and in patients with intermittent AV block (excluding persistent 3° AV block) from 99.9 to 50.2%. On the basis of validated modelling, estimated device longevity could potentially be extended by 1.9 years through the use of the capture management and AV interval extension features. Both ACM and VCM features reliably measured thresholds in nearly all patients; the AV extension algorithm significantly reduced ventricular pacing; and the use of pacemaker automaticity features potentially extends device longevity.
NASA Astrophysics Data System (ADS)
Koelmel, Jeremy P.; Kroeger, Nicholas M.; Gill, Emily L.; Ulmer, Candice Z.; Bowden, John A.; Patterson, Rainey E.; Yost, Richard A.; Garrett, Timothy J.
2017-05-01
Untargeted omics analyses aim to comprehensively characterize biomolecules within a biological system. Changes in the presence or quantity of these biomolecules can indicate important biological perturbations, such as those caused by disease. With current technological advancements, the entire genome can now be sequenced; however, in the burgeoning field of lipidomics, only a subset of lipids can be identified. The recent emergence of high resolution tandem mass spectrometry (HR-MS/MS), in combination with ultra-high performance liquid chromatography, has resulted in an increased coverage of the lipidome. Nevertheless, identifications from MS/MS are generally limited by the number of precursors that can be selected for fragmentation during chromatographic elution. Therefore, we developed the software IE-Omics to automate iterative exclusion (IE), in which precursors selected in data-dependent topN analyses are excluded in sequential injections. In each sequential injection, unique precursors are fragmented until HR-MS/MS spectra of all ions above a user-defined intensity threshold are acquired. IE-Omics was applied to lipidomic analyses in Red Cross plasma and substantia nigra tissue. Coverage of the lipidome was drastically improved using IE. When applying IE-Omics to Red Cross plasma and substantia nigra lipid extracts in positive ion mode, 69% and 40% more molecular identifications were obtained, respectively. In addition, applying IE-Omics to a lipidomics workflow increased the coverage of trace species, including odd-chained and short-chained diacylglycerides and oxidized lipid species. By increasing the coverage of the lipidome, applying IE to a lipidomics workflow increases the probability of finding biomarkers and provides additional information for determining etiology of disease.
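The iterative-exclusion idea (each injection fragments only precursors not yet fragmented in earlier injections) can be sketched as follows. The flat precursor list, the topN selection and the stopping rule are illustrative assumptions, not the IE-Omics implementation, which builds exclusion lists per retention-time window.

```python
def iterative_exclusion_runs(precursors, top_n, min_intensity, n_runs):
    """Simulate iterative-exclusion DDA: in each injection, pick the top-N
    most intense precursors above the intensity threshold that have not yet
    been fragmented, then add them to the exclusion list for later runs."""
    excluded, runs = set(), []
    for _ in range(n_runs):
        candidates = [(mz, i) for mz, i in precursors
                      if i >= min_intensity and mz not in excluded]
        selected = [mz for mz, _ in
                    sorted(candidates, key=lambda p: p[1], reverse=True)[:top_n]]
        if not selected:  # every ion above the threshold has been fragmented
            break
        excluded.update(selected)
        runs.append(selected)
    return runs

# Five precursors (m/z, intensity); top-2 selection, threshold 10
ions = [(100.1, 50), (200.2, 40), (300.3, 30), (400.4, 20), (500.5, 5)]
print(iterative_exclusion_runs(ions, top_n=2, min_intensity=10, n_runs=5))
# [[100.1, 200.2], [300.3, 400.4]]
```

Two injections suffice here: the third finds nothing left above the threshold, mirroring the paper's stopping condition of acquiring MS/MS spectra for all ions above the user-defined intensity.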
Recognizing Action Units for Facial Expression Analysis
Tian, Ying-li; Kanade, Takeo; Cohn, Jeffrey F.
2010-01-01
Most automatic expression analysis systems attempt to recognize a small set of prototypic expressions, such as happiness, anger, surprise, and fear. Such prototypic expressions, however, occur rather infrequently. Human emotions and intentions are more often communicated by changes in one or a few discrete facial features. In this paper, we develop an Automatic Face Analysis (AFA) system to analyze facial expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal-view face image sequence. The AFA system classifies fine-grained changes in facial expression into action units (AUs) of the Facial Action Coding System (FACS), instead of a few prototypic expressions. Multistate face and facial component models are proposed for tracking and modeling the various facial features, including lips, eyes, brows, cheeks, and furrows. During tracking, detailed parametric descriptions of the facial features are extracted. With these parameters as the inputs, a group of action units (neutral expression, six upper face AUs, and 10 lower face AUs) is recognized, whether the AUs occur alone or in combination. The system has achieved average recognition rates of 96.4 percent (95.4 percent if neutral expressions are excluded) for upper face AUs and 96.7 percent (95.6 percent with neutral expressions excluded) for lower face AUs. The generalizability of the system has been tested using independent image databases collected and FACS-coded for ground truth by different research teams. PMID:25210210
Cacari Stone, Lisa; Steimel, Leah; Vasquez-Guzman, Estela; Kaufman, Arthur
2014-04-01
Academic health centers (AHCs) are at the forefront of delivering care to the diverse medically underserved and uninsured populations in the United States, as well as training the majority of the health care workforce, who are professionally obligated to serve all patients regardless of race or immigration status. Despite AHCs' central leadership role in these endeavors, few consolidated efforts have emerged to resolve potential conflicts between national, state, and local policies that exclude certain classifications of immigrants from receiving federal public assistance and health professionals' social missions and ethical oath to serve humanity. For instance, whereas the 2010 Patient Protection and Affordable Care Act provides a pathway to insurance coverage for more than 30 million Americans, undocumented immigrants and legally documented immigrants residing in the United States for less than five years are ineligible for Medicaid and excluded from purchasing any type of coverage through state exchanges. To inform this debate, the authors describe their experience at the University of New Mexico Hospital (UNMH) and discuss how the UNMH has responded to this challenge and overcome barriers. They offer three recommendations for aligning AHCs' social missions and professional ethics with organizational policies: (1) that AHCs determine eligibility for financial assistance based on residency rather than citizenship, (2) that models of medical education and health professions training provide students with service-learning opportunities and applied community experience, and (3) that frontline staff and health care professionals receive standardized training on eligibility policies to minimize discrimination towards immigrant patients.
Mason Tenders agrees to pay $1 million to end ADA litigation.
1995-12-29
The [name removed] District Council Welfare Fund has agreed to pay $1 million to construction workers who have been denied medical coverage for AIDS-related care. The decision establishes self-insured health care benefits programs as covered entities under the Americans with Disabilities Act (ADA). The settlement ends a three-year battle which began in 1992 between [name removed] and fourteen HIV-positive construction workers who were refused medical coverage. The first suit was filed by [name removed]., a construction worker who lost coverage for his HIV-related care in July 1991. At that time, the union fund decided to exclude care for HIV on the grounds that it was too expensive. The Equal Employment Opportunity Commission (EEOC) filed an ADA lawsuit that challenged disability-based distinctions in health insurance. The U.S. Attorney's Office filed a complaint against the union under the Racketeer Influenced and Corrupt Organizations (RICO) statute to end organized crime associated with the union. In late 1994, the government announced a consent decree, settling its racketeering suit against the union. Under the terms of the settlement, [name removed] was awarded $16,000 in damages. In the EEOC case, damages for plan members ranged as high as $50,000.
ABI Base Recall: Automatic Correction and Ends Trimming of DNA Sequences.
Elyazghi, Zakaria; Yazouli, Loubna El; Sadki, Khalid; Radouani, Fouzia
2017-12-01
Automated DNA sequencers produce chromatogram files in ABI format. When viewing chromatograms, ambiguities appear at various sites along the DNA sequences, because the base-calling program implemented in the sequencing machine cannot always precisely determine the right nucleotide, especially when it is represented by either a broad peak or a set of overlapping peaks. In such cases, a letter other than A, C, G, or T is recorded, most commonly N. Thus, DNA sequencing chromatograms need manual examination: checking for mis-calls and truncating the sequence when errors become too frequent. The purpose of this paper is to develop a program allowing the automatic correction of these ambiguities. The application is a Web-based program powered by Shiny and runs on the R platform for easy use. As part of the interface, we added an automatic end-clipping option, alignment against reference sequences, and BLAST. To develop and test our tool, we collected several bacterial DNA sequences from different laboratories within Institut Pasteur du Maroc and performed both manual and automatic correction, then compared the two methods. We find that our program, ABI Base Recall, performs correction with high accuracy. Indeed, it increases the rate of identity and coverage and minimizes the number of mismatches and gaps, providing a solution to sequencing ambiguities and saving biologists' time and labor.
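The two core steps described above, re-calling ambiguous bases and trimming low-quality ends, can be sketched as follows. This is a hedged illustration under a simplified, hypothetical chromatogram model (per-position channel intensities for A, C, G, T), not the ABI Base Recall implementation.

```python
BASES = "ACGT"

def recall_ambiguous(calls, traces):
    """Replace each ambiguous call (e.g. 'N') with the base whose
    trace channel peaks highest at that position.

    calls:  string of base calls
    traces: per-position 4-tuples of (A, C, G, T) channel intensities."""
    out = []
    for call, peaks in zip(calls, traces):
        if call in BASES:
            out.append(call)
        else:
            out.append(BASES[max(range(4), key=lambda i: peaks[i])])
    return "".join(out)

def trim_ends(calls, window=4, max_ambiguous=1):
    """Clip both sequence ends while a leading/trailing window of calls
    contains too many ambiguous bases (errors 'too frequent')."""
    def clean(i):
        return sum(c not in BASES for c in calls[i:i + window]) <= max_ambiguous
    start = 0
    while start + window <= len(calls) and not clean(start):
        start += 1
    end = len(calls)
    while end - window >= start and not clean(end - window):
        end -= 1
    return calls[start:end]
```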
Location Distribution Optimization of Photographing Sites for Indoor Panorama Modeling
NASA Astrophysics Data System (ADS)
Zhang, S.; Wu, J.; Zhang, Y.; Zhang, X.; Xin, Z.; Liu, J.
2017-09-01
Generally, panoramic image modeling is costly and time-consuming because photos must be captured continuously along the routes, especially in complicated indoor environments. This difficulty hinders wider business application of panoramic image modeling. A feasible arrangement of panorama site locations is indispensable because, for a given device, the locations influence the clarity, coverage and number of panoramic images. This paper aims to propose a standard procedure for generating the specific locations and total number of panorama sites in indoor panorama modeling. First, we establish the functional relationship between one panorama site and its objectives, and then apply that relationship to the network of panorama sites. We propose the Distance Clarity functions (FC and Fe), which express the mathematical relationship between clarity and the panorama-objective distance or obstacle distance, and the Distance Buffer function (FB), modified from the traditional buffer method, to generate the coverage of a panorama site. Second, we traverse every point in the feasible area as a possible panorama site and calculate clarity and coverage jointly. Finally, we select as few points as possible, satisfying first the clarity requirement and then the coverage requirement. In the experiments, detailed parameters of the camera lens are given; still, further experiments are needed because the relationship between clarity and distance is device-dependent. In short, through the functions FC, Fe and FB, locations of panorama sites can be generated automatically and accurately.
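The final selection step, choosing as few sites as possible while meeting clarity and coverage requirements, is essentially a set-cover problem, which a greedy heuristic can sketch. This is an illustrative sketch, not the paper's procedure; the inputs (per-site clarity flag and covered-target set, assumed to be precomputed via FC/Fe and FB) are hypothetical.

```python
def select_sites(candidates, targets):
    """Greedy set cover over candidate panorama sites.

    candidates: dict site_id -> (meets_clarity, set_of_covered_targets)
    targets:    set of target points that must be covered.
    Returns (chosen_sites, uncoverable_targets)."""
    remaining = set(targets)
    chosen = []
    # Stable sort puts clarity-satisfying sites first, so they win ties.
    pool = sorted(candidates.items(), key=lambda kv: not kv[1][0])
    while remaining:
        site, (_, covered) = max(pool, key=lambda kv: len(kv[1][1] & remaining))
        gain = covered & remaining
        if not gain:
            break  # no candidate covers any remaining target
        chosen.append(site)
        remaining -= gain
    return chosen, remaining
```

Greedy set cover is not optimal in general, but it is a standard baseline when "as few points as possible" is the goal.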
GEM System: automatic prototyping of cell-wide metabolic pathway models from genomes.
Arakawa, Kazuharu; Yamada, Yohei; Shinoda, Kosaku; Nakayama, Yoichi; Tomita, Masaru
2006-03-23
Successful realization of a "systems biology" approach to analyzing cells is a grand challenge for our understanding of life. However, current modeling approaches to cell simulation are labor-intensive, manual affairs, and therefore constitute a major bottleneck in the evolution of computational cell biology. We developed the Genome-based Modeling (GEM) System for the purpose of automatically prototyping simulation models of cell-wide metabolic pathways from genome sequences and other public biological information. Models generated by the GEM System include an entire Escherichia coli metabolism model comprising 968 reactions and 1195 metabolites, achieving 100% coverage when compared with the KEGG database, 92.38% with the EcoCyc database, and 95.06% with the iJR904 genome-scale model. The GEM System prototypes qualitative models to reduce the labor-intensive tasks required for systems biology research. Models of over 90 bacterial genomes are available at our web site.
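The coverage percentages quoted above are, in essence, set comparisons between the generated model and a reference database. A minimal sketch of that metric, assuming reactions are identified by comparable IDs:

```python
def coverage(model_reactions, reference_reactions):
    """Fraction of reference-database reactions present in the
    generated model (e.g. GEM System model vs. KEGG/EcoCyc/iJR904)."""
    model, ref = set(model_reactions), set(reference_reactions)
    return len(model & ref) / len(ref)
```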
Hervás, Marcos; Alsina-Pagès, Rosa Ma; Alías, Francesc; Salvador, Martí
2017-06-08
Fast environmental variations due to climate change can cause mass decline or even extinction of species, having a dramatic impact on the future of biodiversity. During the last decade, different approaches have been proposed to track and monitor endangered species, generally based on costly semi-automatic systems that require human supervision, limiting coverage and operating time. However, the recent emergence of Wireless Acoustic Sensor Networks (WASN) has allowed non-intrusive remote monitoring of endangered species in real time through the automatic identification of the sounds they emit. In this work, an FPGA-based WASN centralized architecture is proposed and validated in a simulated operating environment. The feasibility of the architecture is evaluated in a case study designed to detect the threatened Botaurus stellaris among 19 other cohabiting bird species in the Parc Natural dels Aiguamolls de l'Empordà.
Enhancing acronym/abbreviation knowledge bases with semantic information.
Torii, Manabu; Liu, Hongfang
2007-10-11
In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. To construct such a knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF, LF) derived from text, we i) assess the coverage of LFs and (SF, LF) pairs in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface that integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
Névéol, Aurélie; Shooshan, Sonya E; Mork, James G; Aronson, Alan R
2007-10-11
This paper reports on the latest results of an Indexing Initiative effort addressing the automatic attachment of subheadings to MeSH main headings recommended by the NLM's Medical Text Indexer. Several linguistic and statistical approaches are used to retrieve and attach the subheadings. Continuing collaboration with NLM indexers also provided insight into how automatic methods can better enhance indexing practice. The methods were evaluated on a corpus of 50,000 MEDLINE citations. For main heading/subheading pair recommendations, the best precision is obtained with a post-processing rule method (58%) while the best recall is obtained by pooling all methods (64%). For stand-alone subheading recommendations, the best performance is obtained with the PubMed Related Citations algorithm. Significant progress has been made in terms of subheading coverage. After further evaluation, some of this work may be integrated in the MEDLINE indexing workflow.
NASA Astrophysics Data System (ADS)
Wang, Hongcui; Kawahara, Tatsuya
CALL (Computer Assisted Language Learning) systems using ASR (Automatic Speech Recognition) for second language learning have received increasing interest recently. However, it remains a challenge to achieve high speech recognition performance, including accurate detection of erroneous utterances by non-native speakers. Conventionally, possible error patterns, based on linguistic knowledge, are added to the lexicon and language model, or to the ASR grammar network. However, this approach quickly runs into a trade-off between error coverage and increased perplexity. To solve this problem, we propose a method based on a decision tree that learns to predict the errors made by non-native speakers. An experimental evaluation with a number of foreign students learning Japanese shows that the proposed method can effectively generate an ASR grammar network for a given target sentence, achieving both better error coverage and smaller perplexity, and resulting in a significant improvement in ASR accuracy.
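The grammar-expansion idea above can be sketched as follows. This is a hedged illustration of the coverage/perplexity trade-off, not the authors' decision-tree learner: the error model (per-phone substitution probabilities, here a hypothetical hand-written table rather than a learned tree) supplies only the *probable* non-native errors, so the network stays small.

```python
def build_grammar(target_phones, error_model, min_prob=0.2):
    """Build a per-position alternatives network for a target sentence.

    target_phones: list of phones in the target sentence
    error_model:   dict phone -> list of (substitution, probability),
                   e.g. as predicted by a learned error model
    min_prob:      pruning threshold; raising it lowers perplexity at
                   the cost of error coverage."""
    network = []
    for phone in target_phones:
        alts = [phone]                            # the correct phone
        for sub, prob in error_model.get(phone, []):
            if prob >= min_prob:                  # keep only probable errors
                alts.append(sub)
        network.append(alts)
    return network
```

Adding every linguistically possible substitution (min_prob=0) would maximize coverage but inflate perplexity; the threshold is the knob the learned model effectively tunes per context.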
Scharl, Arno; Hubmann-Haidvogel, Alexander; Jones, Alistair; Fischl, Daniel; Kamolov, Ruslan; Weichselbraun, Albert; Rafelsberger, Walter
2016-01-01
This paper presents a Web intelligence portal that captures and aggregates news and social media coverage about "Game of Thrones", an American drama television series created for the HBO television network based on George R.R. Martin's series of fantasy novels. The system collects content from the Web sites of Anglo-American news media as well as from four social media platforms: Twitter, Facebook, Google+ and YouTube. An interactive dashboard with trend charts and synchronized visual analytics components not only shows how often Game of Thrones events and characters are being mentioned by journalists and viewers, but also provides a real-time account of concepts that are being associated with the unfolding storyline and each new episode. Positive or negative sentiment is computed automatically, which sheds light on the perception of actors and new plot elements.
Ffuzz: Towards full system high coverage fuzz testing on binary executables.
Zhang, Bin; Ye, Jiaxi; Bi, Xing; Feng, Chao; Tang, Chaojing
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, such as fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug finding tool, Ffuzz, built on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and to avoid getting stuck in either fuzz testing or symbolic execution. We also propose two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and on 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently.
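The hybrid scheme above can be sketched as a coverage-guided fuzzing loop that hands stuck inputs to a symbolic-execution solver. This is a toy illustration of the combination, not Ffuzz itself; `run_target` and `solve_stuck` are hypothetical callbacks standing in for instrumented execution and selective symbolic execution.

```python
import random

def mutate(data):
    """Flip one random byte (a minimal mutation operator)."""
    i = random.randrange(len(data))
    return data[:i] + bytes([random.randrange(256)]) + data[i + 1:]

def fuzz(run_target, solve_stuck, seeds, rounds=100):
    """Coverage-guided loop with a symbolic-execution escape hatch.

    run_target(data) -> set of covered edge ids (instrumented run)
    solve_stuck(data) -> list of new inputs derived by the solver
                         when mutation stops finding new coverage."""
    random.seed(0)                      # deterministic for illustration
    corpus = list(seeds)
    covered = set()
    for _ in range(rounds):
        data = mutate(random.choice(corpus))
        new_cov = run_target(data) - covered
        if new_cov:
            covered |= new_cov          # keep inputs that reach new code
            corpus.append(data)
        else:
            corpus.extend(solve_stuck(data))  # let the solver break plateaus
    return covered
```

The key point the sketch shows is the hand-off: mutation is cheap while coverage grows, and the (expensive) solver is consulted only when fuzzing plateaus.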
Ships and Maritime Targets Observation Campaigns Using Available C- and X-Band SAR Satellite
NASA Astrophysics Data System (ADS)
Velotto, Domenico; Bentes, Carlos; Lehner, Susanne
2015-04-01
Radar resolution and swath width are two very important factors in synthetic aperture radar (SAR) maritime target detection. The dilemma of choosing between single-polarization SAR imagery, with its higher resolution and coverage, and quad- (or dual-) polarimetric imagery, with its richness of information, remains unsolved for this application. In the framework of the ESA project MARISS and the EU project DOLPHIN, in situ campaigns aimed at resolving this dilemma have been carried out. Single- and multi-polarimetric SAR data from TerraSAR-X, RADARSAT-2 and COSMO-SkyMed have been acquired with close time gaps and partial coverage overlap. In this way, several moving and non-moving maritime targets have been imaged with different polarizations, geometries and working frequencies. Available ground truth provided by Automatic Identification System (AIS) data, nautical charts and wind farm locations is used to validate the different types of maritime targets.
ASTER cloud coverage reassessment using MODIS cloud mask products
NASA Astrophysics Data System (ADS)
Tonooka, Hideyuki; Omagari, Kunjuro; Yamamoto, Hirokazu; Tachikawa, Tetsushi; Fujita, Masaru; Paitaer, Zaoreguli
2010-10-01
In the Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER) Project, two kinds of algorithms are used for cloud assessment in Level-1 processing. The first algorithm, based on the LANDSAT-5 TM Automatic Cloud Cover Assessment (ACCA) algorithm, is used for some daytime scenes observed with only VNIR bands and all nighttime scenes, and the second algorithm, based on the LANDSAT-7 ETM+ ACCA algorithm, is used for most daytime scenes observed with all spectral bands. However, the first algorithm does not work well because it lacks some spectral bands sensitive to cloud detection, and the two algorithms have been less accurate over snow/ice covered areas since April 2008, when the SWIR subsystem malfunctioned. In addition, they perform less well for some combinations of surface type and sun elevation angle. We have therefore developed the ASTER cloud coverage reassessment system using MODIS cloud mask (MOD35) products, and have reassessed cloud coverage for all ASTER archived scenes (>1.7 million scenes). All of the new cloud coverage data are included in the Image Management System (IMS) databases of the ASTER Ground Data System (GDS) and NASA's Land Process Data Active Archive Center (LP DAAC) and used for ASTER product search by users, and cloud mask images are distributed to users through the Internet. Daily upcoming scenes (about 400 scenes per day) are reassessed and inserted into the IMS databases within 5 to 7 days of each scene's observation date. Some validation studies for the new cloud coverage data and some mission-related analyses using those data are also demonstrated in the present paper.
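Conceptually, reassessing a scene's cloud coverage from an external mask reduces to a per-pixel tally over the scene footprint. A minimal sketch under a hypothetical simplified model (the MOD35 mask reduced to booleans for the pixels falling inside one ASTER scene):

```python
def scene_cloud_coverage(mask_pixels):
    """Percent cloud coverage for one scene.

    mask_pixels: iterable of booleans, True = flagged cloudy in the
    external cloud mask, restricted to the scene footprint."""
    pixels = list(mask_pixels)
    if not pixels:
        raise ValueError("empty footprint")
    return 100.0 * sum(pixels) / len(pixels)
```

The real system must also handle reprojection between the two instruments' grids and the MOD35 confidence levels; those steps are omitted here.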
The Development of the Spanish Fireball Network Using a New All-Sky CCD System
NASA Astrophysics Data System (ADS)
Trigo-Rodríguez, J. M.; Castro-Tirado, A. J.; Llorca, J.; Fabregat, J.; Martínez, V. J.; Reglero, V.; Jelínek, M.; Kubánek, P.; Mateo, T.; Postigo, A. De Ugarte
2004-12-01
We have developed an all-sky charge coupled device (CCD) automatic system for detecting meteors and fireballs that will be operative in four stations in Spain during 2005. The cameras were developed following the BOOTES-1 prototype installed at the El Arenosillo Observatory in 2002, which is based on a CCD detector of 4096 × 4096 pixels with a fish-eye lens that provides an all-sky image with enough resolution to make accurate astrometric measurements. Since late 2004, a couple of cameras at two of the four stations operate for 30 s in alternate exposures, allowing 100% time coverage. The stellar limiting magnitude of the images is +10 at the zenith, and +8 below ~65° zenithal angle. As a result, the images provide enough comparison stars to make astrometric measurements of faint meteors and fireballs with an accuracy of ~2 arcminutes. Using this prototype, four automatic all-sky CCD stations have been developed, two in Andalusia and two in the Valencian Community, to start full operation of the Spanish Fireball Network. In addition to all-sky coverage, we are developing a fireball spectroscopy program using medium field lenses with additional CCD cameras. Here we present the first images obtained from the El Arenosillo and La Mayora stations in Andalusia during their first months of activity. The detection of the Jan 27, 2003 superbolide of -17 ± 1 absolute magnitude that overflew Algeria and Morocco is an example of the detection capability of our prototype.
2014-01-01
Automatic reconstruction of metabolic pathways for an organism from genomics and transcriptomics data has been a challenging and important problem in bioinformatics. Traditionally, known reference pathways can be mapped onto organism-specific ones based on the organism's genome annotation and protein homology. However, this simple knowledge-based mapping method can produce incomplete pathways and generally cannot predict unknown new relations and reactions. In contrast, ab initio metabolic network construction methods can predict novel reactions and interactions, but their accuracy tends to be low, leading to many false positives. Here we combine existing pathway knowledge and a new ab initio Bayesian probabilistic graphical model in a novel fashion to improve automatic reconstruction of metabolic networks. Specifically, we built a knowledge database containing known, individual gene/protein interactions and metabolic reactions extracted from existing reference pathways. Known reactions and interactions were then used as constraints for Bayesian network learning methods to predict metabolic pathways. Using individual reactions and interactions extracted from different pathways of many organisms to guide pathway construction is new and improves both the coverage and accuracy of metabolic pathway construction. We applied this probabilistic knowledge-based approach to construct metabolic networks from yeast gene expression data and compared its results with 62 known metabolic networks in the KEGG database. The experiment showed that the method improved the coverage of metabolic network construction over the traditional reference pathway mapping method and was more accurate than pure ab initio methods. PMID:25374614
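The hybrid idea above, known reactions as hard constraints plus data-driven candidate edges, can be sketched as follows. This is a deliberately simplified illustration, not the paper's Bayesian learner: the ab initio step is reduced to thresholded expression correlation, and all names are hypothetical.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def build_network(known_edges, genes, expression, threshold=0.9):
    """Knowledge-constrained network construction.

    known_edges: set of (gene_a, gene_b) pairs from reference pathways,
                 kept unconditionally (the knowledge constraint)
    expression:  dict gene -> list of expression values
    threshold:   |correlation| needed to add an ab initio edge."""
    network = set(known_edges)
    for i, a in enumerate(genes):
        for b in genes[i + 1:]:
            if (a, b) in network:
                continue  # already fixed by prior knowledge
            if abs(pearson(expression[a], expression[b])) >= threshold:
                network.add((a, b))  # ab initio prediction
    return network
```

The constraint set plays the role the extracted reference reactions play in the paper: it anchors the structure so the data-driven step only has to fill the gaps, which is where the coverage gain over pure mapping comes from.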
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The CDV flow differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. These goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
Paolucci, Francesco; Prinsze, Femmeke; Stam, Pieter J A; van de Ven, Wynand P M M
2009-09-01
In this paper, we simulate several scenarios of the potential premium range for voluntary (supplementary) health insurance, covering benefits which might be excluded from mandatory health insurance (MI). Our findings show that, by adding risk-factors, the minimum premium decreases and the maximum increases. The magnitude of the premium range is especially substantial for benefits such as medical devices and drugs. When removing benefits from MI policymakers should be aware of the implications for the potential reduction of affordability of voluntary health insurance coverage in a competitive market.
Exclusion of patients from pay-for-performance targets by English physicians.
Doran, Tim; Fullwood, Catherine; Reeves, David; Gravelle, Hugh; Roland, Martin
2008-07-17
In the English pay-for-performance program, physicians use a range of criteria to exclude individual patients from the quality calculations that determine their pay. This process, which is called exception reporting, is intended to safeguard patients against inappropriate treatment by physicians seeking to maximize their income. However, exception reporting may allow physicians to inappropriately exclude patients for whom targets have been missed (a practice known as gaming). We analyzed data extracted automatically from clinical computing systems for 8105 family practices in England (96% of all practices), data from the U.K. Census, and data on practice characteristics from the U.K. Department of Health. We determined the rate of exception reporting for 65 clinical activities and the association between this rate and the characteristics of patients and medical practices. From April 2005 through March 2006, physicians excluded a median of 5.3% of patients (interquartile range, 4.0 to 6.9) from the quality calculations. Physicians were most likely to exclude patients from indicators that were related to providing treatments and achieving target levels of intermediate outcomes; they were least likely to exclude patients from indicators that were related to routine checks and measurements and to offers of treatment. The characteristics of patients and practices explained only 2.7% of the variance in exception reporting. We estimate that exception reporting accounted for approximately 1.5% of the cost of the pay-for-performance program. Exception reporting brings substantial benefits to pay-for-performance programs, providing that the process is used appropriately. In England, rates of exception reporting have generally been low, with little evidence of widespread gaming. 2008 Massachusetts Medical Society
Latagliata, Roberto; Carmosino, Ida; Vozella, Federico; Volpicelli, Paola; De Angelis, Federico; Loglisci, Maria Giovanna; Salaroli, Adriano; De Luca, Maria Lucia; Montagna, Chiara; Serrao, Alessandra; Molica, Matteo; Diverio, Daniela; Nanni, Mauro; Mancini, Marco; Breccia, Massimo; Alimena, Giuliana
2017-06-01
Both the Dasision and ENESTnd trials had many exclusion criteria, with a possible selection bias compared with real-life practice. To address the impact of this bias on first-line treatment in current clinical practice, we reviewed 207 unselected newly diagnosed chronic phase chronic myeloid leukaemia (CML) patients [M/F 108/99, median age 58.8 years, interquartile range 42.3-70.2] treated with front-line imatinib from June 2002 to June 2013 at our Institution, and evaluated how many of them would have been excluded from enrolment in the two trials. Twenty-eight patients (13.5%) would have been excluded by both trials because of polycomorbidities (12), severe cardiomyopathy (five), age > 80 with frailty (three), drug abuse (two) or other severe concomitant diseases (six). In addition, eight patients would have been excluded from Dasision due to isolated chronic obstructive bronchopulmonary disease, and 19 patients would have been excluded from ENESTnd due to isolated diabetes (10), arrhythmia (four), acute myocardial infarction > 6 months before CML diagnosis (two), chronic pancreatic disease (two) and peripheral arterial obstructive disease (one). On the whole, 36 patients (17.4%) would have been excluded by the Dasision trial and 47 (22.7%) by the ENESTnd trial. The patients potentially ineligible for both trials were significantly older and had a worse outcome with imatinib compared with potentially eligible patients. Our data highlight that an automatic transposition of results from controlled clinical trials to the front-line real-life management of CML patients should be regarded with caution. Copyright © 2015 John Wiley & Sons, Ltd.
MeSH indexing based on automatically generated summaries.
Jimeno-Yepes, Antonio J; Plaza, Laura; Mork, James G; Aronson, Alan R; Díaz, Alberto
2013-06-26
MEDLINE citations are manually indexed at the U.S. National Library of Medicine (NLM) using as reference the Medical Subject Headings (MeSH) controlled vocabulary. For this task, the human indexers read the full text of the article. Due to the growth of MEDLINE, the NLM Indexing Initiative explores indexing methodologies that can support the task of the indexers. Medical Text Indexer (MTI) is a tool developed by the NLM Indexing Initiative to provide MeSH indexing recommendations to indexers. Currently, the input to MTI is MEDLINE citations, title and abstract only. Previous work has shown that using full text as input to MTI increases recall, but decreases precision sharply. We propose using summaries generated automatically from the full text for the input to MTI to use in the task of suggesting MeSH headings to indexers. Summaries distill the most salient information from the full text, which might increase the coverage of automatic indexing approaches based on MEDLINE. We hypothesize that if the results were good enough, manual indexers could possibly use automatic summaries instead of the full texts, along with the recommendations of MTI, to speed up the process while maintaining high quality of indexing results. We have generated summaries of different lengths using two different summarizers, and evaluated the MTI indexing on the summaries using different algorithms: MTI, individual MTI components, and machine learning. The results are compared to those of full text articles and MEDLINE citations. Our results show that automatically generated summaries achieve similar recall but higher precision compared to full text articles. Compared to MEDLINE citations, summaries achieve higher recall but lower precision. Our results show that automatic summaries produce better indexing than full text articles. 
Summaries produce similar recall to full text but much better precision, which seems to indicate that automatic summaries can efficiently capture the most important contents within the original articles. The combination of MEDLINE citations and automatically generated summaries could improve the recommendations suggested by MTI. On the other hand, indexing performance might be dependent on the MeSH heading being indexed. Summarization techniques could thus be considered as a feature selection algorithm that might have to be tuned individually for each MeSH heading.
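The pipeline idea above — distill each full-text article into a short extractive summary before feeding it to the indexer — can be sketched with a minimal frequency-based extractive summarizer. This is an illustrative stand-in only, assuming nothing about MTI's internals: the paper evaluates two purpose-built summarizers, and the function name and scoring scheme here are hypothetical.

```python
from collections import Counter
import re

def extractive_summary(text, n_sentences=3):
    """Score sentences by average content-word frequency and keep the top n.

    A toy frequency-based extractive summarizer (hypothetical, not one of the
    two summarizers evaluated in the paper).
    """
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]{3,}', text.lower()))

    def score(sentence):
        tokens = re.findall(r'[a-z]{3,}', sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    ranked = sorted(range(len(sentences)), key=lambda i: score(sentences[i]),
                    reverse=True)
    keep = sorted(ranked[:n_sentences])  # restore original sentence order
    return ' '.join(sentences[i] for i in keep)
```

The summary string can then be submitted in place of the full text, exactly as the abstract proposes for MTI input.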
NASA Astrophysics Data System (ADS)
Rieder, Christian; Wirtz, Stefan; Strehlow, Jan; Zidowitz, Stephan; Bruners, Philipp; Isfort, Peter; Mahnken, Andreas H.; Peitgen, Heinz-Otto
2012-02-01
Image-guided radiofrequency ablation (RFA) is becoming a standard procedure for minimally invasive tumor treatment in clinical practice. To verify the treatment success of the therapy, reliable post-interventional assessment of the ablation zone (coagulation) is essential. Typically, pre- and post-interventional CT images have to be aligned to compare the shape, size, and position of tumor and coagulation zone. In this work, we present an automatic workflow for masking liver tissue, enabling a rigid registration algorithm to perform at least as accurately as experienced medical experts. To minimize the effect of global liver deformations, the registration is computed in a local region of interest around the pre-interventional lesion and post-interventional coagulation necrosis. A registration mask excluding lesions and neighboring organs is calculated to prevent the registration algorithm from matching both lesion shapes instead of the surrounding liver anatomy. As an initial registration step, the centers of gravity of both lesions are aligned automatically. The subsequent rigid registration method is based on the Local Cross Correlation (LCC) similarity measure and Newton-type optimization. To assess the accuracy of our method, 41 RFA cases are registered and compared with the manually aligned cases from four medical experts. Furthermore, the registration results are compared with ground-truth transformations based on averaged anatomical landmark pairs. In the evaluation, we show that our method automatically aligns the data sets with accuracy equal to that of medical experts, while requiring significantly less time and exhibiting less variability.
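The Local Cross Correlation similarity used above can be sketched as the average squared correlation coefficient computed in small windows around each voxel inside the registration mask. The following is a simplified 2-D, brute-force illustration of the idea under assumed conventions; the paper's optimized 3-D implementation and its Newton-type optimizer over rigid parameters are not reproduced here.

```python
import numpy as np

def local_cross_correlation(fixed, moving, mask, window=9):
    """Mean squared local correlation coefficient over masked pixels.

    Toy 2-D illustration of the LCC similarity: for each masked pixel, take a
    window, center both patches, and accumulate the squared correlation.
    """
    half = window // 2
    scores = []
    for i in range(half, fixed.shape[0] - half):
        for j in range(half, fixed.shape[1] - half):
            if not mask[i, j]:
                continue
            f = fixed[i - half:i + half + 1, j - half:j + half + 1].ravel().astype(float)
            m = moving[i - half:i + half + 1, j - half:j + half + 1].ravel().astype(float)
            f -= f.mean()
            m -= m.mean()
            denom = np.sqrt((f @ f) * (m @ m)) + 1e-12  # guard flat windows
            scores.append(((f @ m) / denom) ** 2)
    return float(np.mean(scores))
```

A registration loop would maximize this score over the rigid transform parameters; identical images score near 1, uncorrelated images near 0.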
Immunization rates and timely administration in pre-school and school-aged children.
Heininger, Ulrich; Zuberbühler, Mirjam
2006-02-01
Whereas immunization coverage has been repeatedly assessed in the Swiss population, little is known about the timely administration of universally recommended immunizations in Switzerland and elsewhere. The goal of this study was to determine compliance with official standard immunization recommendations in pre-school and school-aged children in Basel, Switzerland, focusing on coverage rates and timely administration. Of a cohort of children entering kindergarten and third-grade primary school in Basel in 2001, 310 and 310, respectively, were identified in proportion to the overall age-appropriate populations in the four city districts. Foreign-born children were excluded. The data were extracted from immunization records provided voluntarily by parents. Coverage for three doses of diphtheria, tetanus, and poliomyelitis vaccines was >95%, and <90% for pertussis and Hib. The rates of age-appropriate booster doses were significantly lower, especially for pertussis and Hib (<60%). Cumulative coverage for measles, mumps, and rubella (MMR) was <90% for the first dose and 33% for the second dose by 10 years of age. All immunizations were administered with significant delays. Coverage for the first three doses of DTP combination vaccines did not reach 90% before 1 year of age and, for the first dose of MMR, a plateau just below 80% was not reached before 3 years of age. This study revealed delayed administration of childhood immunizations, as well as a complete lack of booster doses in a significant fraction of children, with important implications for public health. These shortfalls may lead to fatal disease in individuals and to epidemics in the community, and threaten national and international disease-elimination targets, such as those for measles and congenital rubella syndrome.
NASA Technical Reports Server (NTRS)
Loh, Yin C.; Boster, John; Hwu, Shian; Watson, John C.; deSilva, Kanishka; Piatek, Irene (Technical Monitor)
1999-01-01
The Wireless Video System (WVS) provides real-time video coverage of astronaut extra vehicular activities during International Space Station (ISS) assembly. The ISS wireless environment is unique due to the nature of the ISS structure and multiple RF interference sources. This paper describes how the system was developed to combat multipath, blockage, and interference using an automatic antenna switching system. Critical to system performance is the selection of receiver antenna installation locations determined using Uniform Geometrical Theory of Diffraction (GTD) techniques.
Rajasingh, Charlotte Mary; Weiser, Thomas G; Knowlton, Lisa M; Tennakoon, Lakshika; Spain, David A; Staudenmayer, Kristan L
2018-06-01
Traumatic injuries result in a significant disruption to patients' lives, including their ability to work, which may place patients at risk of losing insurance coverage. Our objective was to evaluate the impact of injury on insurance status. We hypothesized that trauma patients with ongoing health needs experience changes in coverage. We used the Nationwide Readmission Database (2013-2014), a nationally representative sample of readmissions in the United States. We included patients aged 27 years to 64 years admitted with any diagnosis of trauma with at least one readmission within 6 months. Patients on Medicare and with missing payer information were excluded. The primary outcome was payer status. A total of 57,281 patients met the inclusion criteria; 11,006 (19%) changed insurance payer at readmission. Of these, 21% (n = 2,288) became uninsured, 25% (n = 2,773) gained coverage, and 54% (n = 5,945) switched insurance. Medicaid and Medicare gained the largest fraction of patients (from 16% to 30% and 0% to 18%, respectively), with a decrease in private payer coverage (37% to 17%). In multivariate analysis, patients who were younger (27-35 years vs. 56-64 years; odds ratio [OR], 1.30; p < 0.001); lived in a zip code with average income in the lowest quartile (vs. the highest quartile; OR, 1.37; p < 0.001); and had three or more comorbidities (vs. none; OR, 1.61; p < 0.001) were more likely to experience a change in insurance. Approximately one fifth of trauma patients who are readmitted within 6 months of their injury experience a change in insurance coverage. Most switch between insurers, but nearly a quarter lose their insurance. The government adopts a large fraction of these patients, indicating a growing reliance on government programs like Medicaid. Trauma patients face challenges after injury, and a change in insurance may add to this burden. Future policy and quality improvement initiatives should consider addressing this challenge. Epidemiologic, level III.
The quest for universal health coverage: achieving social protection for all in Mexico.
Knaul, Felicia Marie; González-Pier, Eduardo; Gómez-Dantés, Octavio; García-Junco, David; Arreola-Ornelas, Héctor; Barraza-Lloréns, Mariana; Sandoval, Rosa; Caballero, Francisco; Hernández-Avila, Mauricio; Juan, Mercedes; Kershenobich, David; Nigenda, Gustavo; Ruelas, Enrique; Sepúlveda, Jaime; Tapia, Roberto; Soberón, Guillermo; Chertorivski, Salomón; Frenk, Julio
2012-10-06
Mexico is reaching universal health coverage in 2012. A national health insurance programme called Seguro Popular, introduced in 2003, is providing access to a package of comprehensive health services with financial protection for more than 50 million Mexicans previously excluded from insurance. Universal coverage in Mexico is synonymous with social protection of health. This report analyses the road to universal coverage along three dimensions of protection: against health risks, for patients through quality assurance of health care, and against the financial consequences of disease and injury. We present a conceptual discussion of the transition from labour-based social security to social protection of health, which implies access to effective health care as a universal right based on citizenship, the ethical basis of the Mexican reform. We discuss the conditions that prompted the reform, as well as its design and inception, and we describe the 9-year, evidence-driven implementation process, including updates and improvements to the original programme. The core of the report concentrates on the effects and impacts of the reform, based on analysis of all published and publicly available scientific literature and new data. Evidence indicates that Seguro Popular is improving access to health services and reducing the prevalence of catastrophic and impoverishing health expenditures, especially for the poor. Recent studies also show improvement in effective coverage. This research then addresses persistent challenges, including the need to translate financial resources into more effective, equitable and responsive health services. A next generation of reforms will be required and these include systemic measures to complete the reorganisation of the health system by functions. The paper concludes with a discussion of the implications of the Mexican quest to achieve universal health coverage and its relevance for other low-income and middle-income countries.
Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Aebi, Christine; Gröbner, Julian; Kämpfer, Niklaus; Vuilleumier, Laurent
2017-04-01
Our study analyses climatologies of cloud fraction, cloud type and cloud radiative effect depending on different parameters at two stations in Switzerland. The calculations have been performed for shortwave (0.3 - 3 μm) and longwave (3 - 100 μm) radiation separately. Information about fractional cloud coverage and cloud type is automatically retrieved from images taken by visible all-sky cameras at the two stations Payerne (490 m asl) and Davos (1594 m asl) using a cloud detection algorithm developed by PMOD/WRC (Wacker et al., 2015). Radiation data are retrieved from pyranometers and pyrgeometers, the cloud base height from a ceilometer and IWV data from GPS measurements. Interestingly, Davos and Payerne show different seasonal trends in cloud coverage and cloud fraction. The absolute longwave cloud radiative effect (LCE) for low-level clouds and a cloud coverage of 8 octas has a median value between 61 and 72 Wm-2. It is shown, with key examples, that the fractional cloud coverage, the cloud base height (CBH) and integrated water vapour (IWV) all influence the magnitude of the LCE. The relative values of the shortwave cloud radiative effect (SCE) for low-level clouds and a cloud coverage of 8 octas are between -88% and -62%. The SCE is influenced by the same parameters, and also by whether or not the sun is obscured by clouds. At both stations, situations of shortwave radiation cloud enhancement have been observed and will be discussed. Wacker S., J. Gröbner, C. Zysset, L. Diener, P. Tzoumanikas, A. Kazantzidis, L. Vuilleumier, R. Stöckli, S. Nyeki, and N. Kämpfer (2015) Cloud observations in Switzerland using hemispherical sky cameras, J. Geophys. Res. Atmos, 120, 695-707.
An Image Segmentation Based on a Genetic Algorithm for Determining Soil Coverage by Crop Residues
Ribeiro, Angela; Ranz, Juan; Burgos-Artizzu, Xavier P.; Pajares, Gonzalo; Sanchez del Arco, Maria J.; Navarrete, Luis
2011-01-01
Determination of the soil coverage by crop residues after ploughing is a fundamental element of Conservation Agriculture. This paper presents the application of genetic algorithms employed during the fine tuning of the segmentation process of a digital image with the aim of automatically quantifying the residue coverage. In other words, the objective is to achieve a segmentation that would permit the discrimination of the texture of the residue so that the output of the segmentation process is a binary image in which residue zones are isolated from the rest. The RGB images used come from a sample of images in which sections of terrain were photographed with a conventional camera positioned in zenith orientation atop a tripod. The images were taken outdoors under uncontrolled lighting conditions. Up to 92% similarity was achieved between the images obtained by the segmentation process proposed in this paper and the templates made by an elaborate manual tracing process. In addition to the proposed segmentation procedure and the fine tuning procedure that was developed, a global quantification of the soil coverage by residues for the sampled area was achieved that differed by only 0.85% from the quantification obtained using template images. Moreover, the proposed method does not depend on the type of residue present in the image. The study was conducted at the experimental farm “El Encín” in Alcalá de Henares (Madrid, Spain). PMID:22163966
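The fine-tuning idea above — evolving segmentation parameters until the binary output matches a manually traced template — can be sketched with a toy genetic algorithm that tunes a single grayscale threshold. This is a deliberately simplified stand-in, assuming a one-parameter search: the paper's GA tunes the full parameter set of the colour segmentation, and the function name and GA settings here are hypothetical.

```python
import random

def evolve_threshold(pixels, template, generations=40, pop_size=20):
    """Tune a grayscale segmentation threshold with a toy genetic algorithm.

    pixels: iterable of grayscale values; template: matching 0/1 labels from a
    manually traced template. Fitness is pixel-wise agreement with the template.
    """
    def fitness(th):
        return sum((p > th) == bool(t) for p, t in zip(pixels, template)) / len(pixels)

    population = [random.uniform(0, 255) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2 + random.gauss(0, 5)     # crossover + mutation
            children.append(min(255.0, max(0.0, child)))
        population = parents + children
    best = max(population, key=fitness)
    return best, fitness(best)
```

The same loop generalizes to vector-valued chromosomes (e.g. per-channel weights plus a threshold) by replacing the scalar crossover and mutation with element-wise operations.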
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Andrew, E-mail: aojones@geisinger.edu; Treas, Jared; Yavoich, Brian
2014-01-01
The aim of the study was to investigate the differences between intraoperative and postoperative dosimetry for transrectal ultrasound-guided transperineal prostate implants using cesium-131 ({sup 131}Cs). Between 2006 and 2010, 166 patients implanted with {sup 131}Cs had both intraoperative and postoperative dosimetry studies. All cases were monotherapy, and a dose of 115 Gy was prescribed to the prostate. The dosimetric properties (D{sub 90}, V{sub 150}, and V{sub 100} for the prostate) of the studies were compared. Two conformity indices were also calculated and compared. Finally, the prostate was automatically sectioned into 6 sectors (anterior and posterior sectors at the base, midgland, and apex) and the intraoperative and postoperative dosimetry was compared in each individual sector. Postoperative dosimetry showed statistically significant changes (p < 0.01) in every dosimetric value except V{sub 150}. In each significant case, the postoperative plans showed lower dose coverage. The conformity indices also showed a bimodal frequency distribution, with the index indicating poorer dose conformity in the postoperative plans. Sector analysis revealed less dose coverage postoperatively in the base and apex sectors, with an increase in dose to the posterior midgland sector. Postoperative dosimetry overall and in specific sectors of the prostate differs significantly from intraoperative planning. Care must be taken during the intraoperative planning stage to ensure complete dose coverage of the prostate, with the understanding that the final postoperative dosimetry will show less dose coverage.
NASA Astrophysics Data System (ADS)
Lu, Hong; Gargesha, Madhusudhana; Wang, Zhao; Chamie, Daniel; Attizani, Guilherme F.; Kanaya, Tomoaki; Ray, Soumya; Costa, Marco A.; Rollins, Andrew M.; Bezerra, Hiram G.; Wilson, David L.
2013-02-01
Intravascular OCT (iOCT) is an imaging modality with ideal resolution and contrast to provide accurate in vivo assessments of tissue healing following stent implantation. Our Cardiovascular Imaging Core Laboratory has served >20 international stent clinical trials with >2000 stents analyzed. Each stent requires 6-16 hrs of manual analysis time, and we are developing highly automated software to reduce this extreme effort. Using a classification technique, physically meaningful image features, forward feature selection to limit overtraining, and leave-one-stent-out cross validation, we detected stent struts. To determine tissue coverage areas, we estimated stent "contours" by fitting detected struts and interpolation points from linearly interpolated tissue depths to a periodic cubic spline. Tissue coverage area was obtained by subtracting lumen area from stent area. Detection was compared against manual analysis of 40 pullbacks. We obtained recall = 90+/-3% and precision = 89+/-6%. When taking struts deemed not bright enough for manual analysis into consideration, precision improved to 94+/-6%. This approached inter-observer variability (recall = 93%, precision = 96%). Differences in stent and tissue coverage areas are 0.12 +/- 0.41 mm2 and 0.09 +/- 0.42 mm2, respectively. We are developing software which will enable visualization, review, and editing of automated results, so as to provide a comprehensive stent analysis package. This should enable better and cheaper stent clinical trials, so that manufacturers can optimize the myriad of parameters (drug, coverage, bioresorbable versus metal, etc.) for stent design.
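The forward feature selection named above can be sketched as a greedy loop: repeatedly add the feature that most improves a cross-validated score, and stop when no candidate helps, which is what limits overtraining. The scoring callback and stopping rule below are generic placeholders, not the paper's exact setup.

```python
def forward_feature_selection(features, evaluate, max_features=None):
    """Greedy forward selection over a feature pool.

    `evaluate(subset)` returns a score for a candidate feature subset
    (e.g. F1 from leave-one-stent-out cross-validation). Selection stops
    when adding any remaining feature fails to improve the score.
    """
    selected, best_score = [], float('-inf')
    remaining = list(features)
    while remaining and (max_features is None or len(selected) < max_features):
        scored = [(evaluate(selected + [f]), f) for f in remaining]
        score, candidate = max(scored)
        if score <= best_score:   # no improvement: stop to limit overtraining
            break
        selected.append(candidate)
        remaining.remove(candidate)
        best_score = score
    return selected, best_score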
Tangcharoensathien, Viroj; Patcharanarumol, Walaiporn; Panichkriangkrai, Warisa; Sommanustweechai, Angkana
2016-07-31
In response to Norheim's editorial, this commentary offers reflections from Thailand on how the five unacceptable trade-offs were applied to the universal health coverage (UHC) reforms between 1975 and 2002, when the entire population of 64 million people became covered by one of the three public health insurance systems. This commentary aims to generate global discussion on how best UHC can be gradually achieved. Beyond the proposed five discrete trade-offs within each dimension, there are also trade-offs between the three dimensions of UHC: population coverage, service coverage and cost coverage. Findings from Thai UHC show that equity was applied to the extension of population coverage, with low-income households and the informal sector as the priority population groups for coverage extension by different prepayment schemes in 1975 and 1984, respectively. Public sector employees were an exception: historically covered as part of fringe benefits, they were covered well before the poor. Private sector employees were covered last, in 1990. Historically, Thailand applied a comprehensive benefit package in which a few items were excluded using a negative list, until improved capacity for health technology assessment allowed cost-effectiveness to be used for the inclusion of new interventions in the benefit package. Not only cost-effectiveness, but also long-term budget impact, equity and ethical considerations are taken into account. Cost coverage is mostly determined by fiscal capacity. A close-ended budget with a mix of provider payment methods is used as a tool to trade off service coverage against financial risk protection. Introducing copayment in the context of fee-for-service can be harmful to beneficiaries due to supplier-induced demand, inefficiency and unpredictable out-of-pocket payments by households.
UHC achieved favorable outcomes because it was implemented when there was full geographical coverage of primary healthcare in all districts and sub-districts, after three decades of health infrastructure investment and health workforce development beginning in the 1980s. The legacy of targeting population groups through different prepayment mechanisms, which leads to fragmentation, discrepancies and inequity across schemes, can be rectified by harmonization at the early phase, when such schemes are introduced. Robust public accountability and participation mechanisms are recommended when deciding the UHC strategy. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Editing ERTS-1 data to exclude land aids cluster analysis of water targets
NASA Technical Reports Server (NTRS)
Erb, R. B. (Principal Investigator)
1973-01-01
The author has identified the following significant results. It has been determined that an increase in the number of spectrally distinct coastal water types is achieved when data values over the adjacent land areas are excluded from the processing routine. This finding resulted from an automatic clustering analysis of ERTS-1 system corrected MSS scene 1002-18134 of 25 July 1972 over Monterey Bay, California. When the entire study area data set was submitted to the clustering, only two distinct water classes were extracted. However, when the land area data points were removed from the data set and resubmitted to the clustering routine, four distinct groupings of water features were identified. Additionally, unlike the previous separation, the four types could be correlated to features observable in the associated ERTS-1 imagery. This exercise demonstrates that, by proper selection of data submitted to the processing routine based upon the specific application under study, additional information may be extracted from the ERTS-1 MSS data.
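The central point of this result — clustering only the water pixels lets the algorithm resolve subtler water classes — can be illustrated with a toy k-means run on a masked image. Plain 1-D k-means is a modern stand-in for the ERTS-era clustering routine; all names below are illustrative.

```python
import numpy as np

def cluster_water_pixels(image, land_mask, k=4, iters=20, seed=0):
    """1-D k-means over pixel values, run only on non-land pixels.

    Excluding land values (which would otherwise dominate the cluster
    centers) mirrors the editing step described in the abstract.
    """
    rng = np.random.default_rng(seed)
    pixels = image[~land_mask].reshape(-1, 1).astype(float)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels - centers.T), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean()
    return centers.ravel(), labels
```

Because bright land pixels never enter `pixels`, all k centers are spent on the water distribution, which is exactly why more water classes become separable.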
A robust automatic phase correction method for signal dense spectra
NASA Astrophysics Data System (ADS)
Bao, Qingjia; Feng, Jiwen; Chen, Li; Chen, Fang; Liu, Zao; Jiang, Bin; Liu, Chaoyang
2013-09-01
A robust automatic phase correction method for Nuclear Magnetic Resonance (NMR) spectra is presented. In this work, a new strategy combining 'coarse tuning' with 'fine tuning' is introduced to correct various spectra accurately. In the 'coarse tuning' procedure, a new robust baseline recognition method is proposed for determining the positions of the tail ends of the peaks, and the preliminary phased spectra are then obtained by minimizing an objective function based on the height differences of these tail ends. After the 'coarse tuning', the peaks in the preliminarily corrected spectra can be categorized into three classes: positive, negative, and distorted. Based on this classification result, a custom negative penalty function used in the 'fine tuning' step is constructed, from which the points belonging to genuine negative peaks and distorted peaks are excluded. Finally, the finely phased spectra are obtained by minimizing this custom negative penalty function. The method proves very robust: it is tolerant of low signal-to-noise ratios and large baseline distortion, and is independent of the starting search points of the phasing parameters. Experimental results on both 1D metabonomics spectra with over-crowded peaks and 2D spectra demonstrate the high efficiency of this automatic method.
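The basic machinery — applying zero- and first-order phase and penalizing negative real points — can be sketched as follows. This is a generic sketch of NMR phase correction, assuming a simple penalty on all negative points; the paper's custom penalty additionally excludes points belonging to genuine negative or distorted peaks, which this sketch does not model.

```python
import numpy as np

def apply_phase(spectrum, ph0, ph1):
    """Apply zero-order (ph0) and first-order (ph1) phase, in radians,
    to a complex NMR spectrum (ph1 varies linearly across the points)."""
    x = np.arange(len(spectrum)) / len(spectrum)
    return spectrum * np.exp(1j * (ph0 + ph1 * x))

def negative_penalty(spectrum, ph0, ph1):
    """Sum of squared negative real points after phasing: a generic
    stand-in for the paper's custom negative penalty function."""
    real = apply_phase(spectrum, ph0, ph1).real
    return float((real[real < 0] ** 2).sum())
```

Minimizing `negative_penalty` over (ph0, ph1), by grid search or a proper optimizer, drives the spectrum toward pure absorption mode, since a correctly phased absorption spectrum has essentially no negative points.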
Pokupec, Rajko; Mrazovac, Danijela; Popović-Suić, Smiljka; Mrazovac, Visnja; Kordić, Rajko; Petricek, Igor
2013-04-01
Early detection of a refractive error and its correction are extremely important for the prevention of amblyopia (poor vision). The gold standard in the detection of refractive errors is retinoscopy, a method in which the pupils are dilated in order to exclude accommodation, resulting in a more accurate measurement of the refractive error. Automatic computer refractometry is also in use. The study included 30 patients, 15 boys and 15 girls, aged 4-16 years. The first examination was conducted with a refractometer on narrow pupils. Retinoscopy, followed by another examination with the refractometer, was performed on pupils dilated with mydriatic drops administered three times. The results obtained with the three methods were compared. They indicate that on narrow pupils the autorefractometer yielded an increased dioptre value in nearsightedness (myopia), i.e. minus overcorrection, whereas findings obtained with retinoscopy and with the autorefractometer in mydriasis/cycloplegia were much more accurate. The results were statistically analysed, which confirmed the differences between the obtained measurements. These findings are consistent with the results of studies conducted by other authors. Automatic refractometry on narrow pupils has proven to be a method for the detection of refractive errors in children. However, the exact value of the refractive error is obtained only in mydriasis, with retinoscopy or an automatic refractometer on dilated pupils.
International river basins of the world
Wolf, Aaron T.; Natharius, Jeffrey A.; Danielson, Jeffrey J.; Ward, Brian S.; Pender, Jan K.
1999-01-01
It is becoming acknowledged that water is likely to be the most pressing environmental concern of the next century. Difficulties in river basin management are only exacerbated when the resource crosses international boundaries. One critical aid in the assessment of international waters has been the Register of International Rivers, a compendium which listed 214 international waterways that cover 47% of the earth's continental land surface. The Register, though, was last updated in 1978 by the now-defunct United Nations Department of Economic and Social Affairs. The purpose of this paper is to update the Register in order to reflect the quantum changes that have taken place over the last 22 years, both in global geopolitics and in map coverage and technology. By accessing digital elevation models at spatial resolutions of 30 arc seconds, corroborating with unified global map coverage of at least 1:1 000 000, and superimposing the results over complete coverage of current political boundaries, we are able to provide a new register which lists 261 international rivers, covering 45.3% of the land surface of the earth (excluding Antarctica). This paper lists all international rivers with their watershed areas, the nations which share each watershed, their respective territorial percentages, and notes on changes in or disputes over international boundaries since 1978.
[Measles vaccination campaign for vulnerable populations: lessons learned].
Laurence, Sophie; Chappuis, Marielle; Lucas, Dorinela; Duteurtre, Martin; Corty, Jean-François
2013-01-01
Between 2008 and 2011, a measles epidemic raged in France. Immunization coverage in France, already insufficient in the general population, is even more worrying for deprived populations, in whom exposure to the disease and the risk of complications are much higher. In this context, Médecins du Monde (MdM), the General Council of Seine-Saint-Denis (CG93) and the Territorial Directorate of the Regional Health Agency (DTARS) implemented a measles vaccination campaign among the Roma population of the department. The objective was to improve coverage of this population by providing ambulatory services in collaboration between various field partners in a single public health project. Twenty-two of the known Roma settlements were selected to receive vaccination. MdM was in charge of logistics, mediation and vaccinations at 13 sites, and the DTARS and CG93 were in charge of vaccination at another 9 sites with support from MdM for mediation and logistics. Between January and June 2012, 250 persons were vaccinated, 34.7% of the target population. Coverage of the population after the vaccination campaign was still very low. The partnership between MdM, DTARS and CG93 helped to create a positive mobile action experience and extended prevention actions towards the most vulnerable populations excluded from conventional health care structures.
The democratization of health in Mexico: financial innovations for universal coverage
Frenk, Julio; Knaul, Felicia Marie
2009-01-01
In 2003, the Mexican Congress approved a reform establishing the Sistema de Protección Social en Salud [System of Social Protection in Health], whereby public funding for health is being increased by one percent of the 2003 gross domestic product over seven years to guarantee universal health insurance. Poor families that had been excluded from traditional social security can now enrol in a new public insurance scheme known as Seguro Popular [People's Insurance], which assures legislated access to a comprehensive set of health-care entitlements. This paper describes the financial innovations behind the expansion of health-care coverage in Mexico to everyone and their effects. Evidence shows improvements in mobilization of additional public resources; availability of health infrastructure and drugs; service utilization; effective coverage; and financial protection. Future challenges are discussed, among them the need for additional public funding to extend access to costly interventions for non-communicable diseases not yet covered by the new insurance scheme, and to improve the technical quality of care and the responsiveness of the health system. Eventually, the progress achieved so far will have to be reflected in health outcomes, which will continue to be evaluated so that Mexico can meet the ultimate criterion of reform success: better health through equity, quality and fair financing. PMID:19649369
Medicare coverage for patients with diabetes. A national plan with individual consequences.
Ashkenazy, R; Abrahamson, M J
2006-04-01
The prevalence of diabetes in the U.S. Medicare population is growing at an alarming rate. From 1980 to 2004, the number of people aged 65 or older with diagnosed diabetes increased from 2.3 million to 5.8 million. According to the Centers for Medicare and Medicaid Services (CMS), 32% of Medicare spending is attributed to the diabetes population. Since its inception, Medicare has expanded medical coverage of monitoring devices, screening tests and visits, educational efforts, and preventive medical services for its diabetic enrollees. However, oral antidiabetic agents and insulin were excluded from reimbursement. In 2003, Congress passed the Medicare Modernization Act, which includes a drug benefit to be administered either through Medicare Advantage drug plans or privately sponsored prescription drug plans, for implementation in January 2006. In this article we highlight key patient and drug plan characteristics and resources that providers may focus upon to assist their patients in choosing a coverage plan. Using a case example, we illustrate the variable financial impact the adoption of Medicare Part D may have on beneficiaries with diabetes depending on their economic status. We further discuss the potential consequences the legislation will have on diabetic patients enrolled in Medicare, their providers, prescribing strategies, and the diabetes market.
The democratization of health in Mexico: financial innovations for universal coverage.
Frenk, Julio; Gómez-Dantés, Octavio; Knaul, Felicia Marie
2009-07-01
In 2003, the Mexican Congress approved a reform establishing the Sistema de Protección Social en Salud [System of Social Protection in Health], whereby public funding for health is being increased by one percent of the 2003 gross domestic product over seven years to guarantee universal health insurance. Poor families that had been excluded from traditional social security can now enrol in a new public insurance scheme known as Seguro Popular [People's Insurance], which assures legislated access to a comprehensive set of health-care entitlements. This paper describes the financial innovations behind the expansion of health-care coverage in Mexico to everyone and their effects. Evidence shows improvements in mobilization of additional public resources; availability of health infrastructure and drugs; service utilization; effective coverage; and financial protection. Future challenges are discussed, among them the need for additional public funding to extend access to costly interventions for non-communicable diseases not yet covered by the new insurance scheme, and to improve the technical quality of care and the responsiveness of the health system. Eventually, the progress achieved so far will have to be reflected in health outcomes, which will continue to be evaluated so that Mexico can meet the ultimate criterion of reform success: better health through equity, quality and fair financing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkel, D; Bol, GH; Asselen, B van
Purpose: To develop an automated radiotherapy treatment planning and optimization workflow for prostate cancer in order to generate clinical treatment plans. Methods: A fully automated radiotherapy treatment planning and optimization workflow was developed based on the treatment planning system Monaco (Elekta AB, Stockholm, Sweden). To evaluate our method, a retrospective planning study (n=100) was performed on patients treated for prostate cancer with 5 field intensity modulated radiotherapy, receiving a dose of 35×2Gy to the prostate and vesicles and a simultaneous integrated boost of 35×0.2Gy to the prostate only. A comparison was made between the dosimetric values of the automatically andmore » manually generated plans. Operator time to generate a plan and plan efficiency was measured. Results: A comparison of the dosimetric values show that automatically generated plans yield more beneficial dosimetric values. In automatic plans reductions of 43% in the V72Gy of the rectum and 13% in the V72Gy of the bladder are observed when compared to the manually generated plans. Smaller variance in dosimetric values is seen, i.e. the intra- and interplanner variability is decreased. For 97% of the automatically generated plans and 86% of the clinical plans all criteria for target coverage and organs at risk constraints are met. The amount of plan segments and monitor units is reduced by 13% and 9% respectively. Automated planning requires less than one minute of operator time compared to over an hour for manual planning. Conclusion: The automatically generated plans are highly suitable for clinical use. The plans have less variance and a large gain in time efficiency has been achieved. Currently, a pilot study is performed, comparing the preference of the clinician and clinical physicist for the automatic versus manual plan. 
Future work will include expanding our automated treatment planning method to other tumor sites and developing other automated radiotherapy workflows.
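The V72Gy figures reported above are dose-volume metrics. A minimal sketch of how such a metric is computed from per-voxel doses (the function name and all numbers are illustrative, not the Monaco implementation):

```python
def v_gy(dose_voxels, threshold_gy):
    """Percentage of a structure's volume receiving at least `threshold_gy`.

    `dose_voxels` is a sequence of per-voxel doses (Gy) sampled inside one
    organ contour; voxels are assumed to have equal volume.
    """
    return 100.0 * sum(d >= threshold_gy for d in dose_voxels) / len(dose_voxels)

# Toy rectum doses for a manual vs. an automatic plan
manual_rectum = [70.0, 73.5, 74.0, 68.0, 75.2, 60.0, 72.1, 65.0]
auto_rectum   = [70.0, 71.5, 72.5, 68.0, 71.0, 60.0, 70.9, 65.0]
print(v_gy(manual_rectum, 72.0))  # 50.0
print(v_gy(auto_rectum, 72.0))    # 12.5
```

A lower V72Gy for the automatic plan corresponds to the kind of rectal sparing the study reports.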
Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms
NASA Astrophysics Data System (ADS)
Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.
2016-06-01
Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
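The Benson group-additivity step mentioned above can be pictured as a sum of tabulated group contributions. A toy sketch with two illustrative group values (RMG's actual thermochemistry database is far larger and organized as trees):

```python
# Hypothetical group values (kcal/mol) for illustration only.
GROUP_DHF = {
    "C-(C)(H)3": -10.2,   # primary carbon group
    "C-(C)2(H)2": -4.9,   # secondary carbon group
}

def benson_enthalpy(groups):
    """Estimate standard enthalpy of formation by summing group contributions."""
    return sum(GROUP_DHF[g] for g in groups)

# n-butane = 2 primary + 2 secondary carbon groups
dhf = benson_enthalpy(["C-(C)(H)3", "C-(C)(H)3", "C-(C)2(H)2", "C-(C)2(H)2"])
print(round(dhf, 1))  # -30.2
```

The same additivity idea, applied over RMG's tree of group definitions, yields the species thermochemistry estimates the abstract describes.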
[Factors associated with the use of dental health services].
Dho, María Silvina
2018-02-01
This paper seeks to analyze the factors associated with the use of dental health services (UDHS) by adults in the city of Corrientes, Argentina. A cross-sectional study was conducted. Information concerning the study variables was collected via a home survey. The sample size was established with a 95% confidence level (381 individuals). A simple random sampling design was used, complemented with non-probability quota sampling. The data were analyzed using SPSS version 21.0 and Epidat version 3.1 software. Socio-economic level, dental health coverage, perception of oral health care, perception of oral health, knowledge about oral health, and oral hygiene habits were significantly associated with the UDHS over the last twelve months. These same factors, excluding dental health coverage and knowledge about oral health, were associated with the UDHS for routine dental check-ups. Measures should be implemented to increase the UDHS for prevention purposes in men and women of all socio-economic levels, particularly in less-privileged individuals.
Access to health care and social protection.
Martin, Philippe
2012-06-01
In France, access to healthcare has been conceived as a social right and is mainly managed through the coverage of the population by the National Health Insurance, which is part of the whole French social security scheme. This system was based on the so-called Bismarckian model, which implies that it requires full employment and solid family links, as the insured persons are the workers and their dependents. This paper examines the typical problems that this system has to face as far as the right to healthcare is concerned. First, it addresses the need to introduce some universal coverage programs in order to integrate the excluded population. Then, it addresses the issue of financial sustainability, as the structural weakness of the French system, in which healthcare is still mainly provided by private-practice physicians and governed by the principle of freedom, leads to the conception and implementation of complex forms of regulation between the State, the social security institutions and the healthcare providers.
On the coverage of the pMSSM by simplified model results
NASA Astrophysics Data System (ADS)
Ambrogi, Federico; Kraml, Sabine; Kulkarni, Suchita; Laa, Ursula; Lessa, Andre; Waltenberger, Wolfgang
2018-03-01
We investigate to which extent the SUSY search results published by ATLAS and CMS in the context of simplified models actually cover the more realistic scenarios of a full model. Concretely, we work within the phenomenological MSSM (pMSSM) with 19 free parameters and compare the constraints obtained from SModelS v1.1.1 with those from the ATLAS pMSSM study in arXiv:1508.06608. We find that about 40-45% of the points excluded by ATLAS escape the currently available simplified model constraints. For these points we identify the most relevant topologies which are not tested by the current simplified model results. In particular, we find that topologies with asymmetric branches, including 3-jet signatures from gluino-squark associated production, could be important for improving the current constraining power of simplified model results. Furthermore, for a better coverage of light stops and sbottoms, constraints for decays via heavier neutralinos and charginos, which subsequently decay visibly to the lightest neutralino, are also needed.
SU-F-J-194: Development of Dose-Based Image Guided Proton Therapy Workflow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, R; Sun, B; Zhao, T
Purpose: To implement image-guided proton therapy (IGPT) based on daily proton dose distribution. Methods: Unlike x-ray therapy, simple alignment based on anatomy cannot ensure proper dose coverage in proton therapy. Anatomy changes along the beam path may lead to underdosing the target, or overdosing the organ-at-risk (OAR). With an in-room mobile computed tomography (CT) system, we are developing a dose-based IGPT software tool that allows patient positioning and treatment adaption based on daily dose distributions. During an IGPT treatment, daily CT images are acquired in treatment position. After initial positioning based on rigid image registration, proton dose distribution is calculated on daily CT images. The target and OARs are automatically delineated via deformable image registration. Dose distributions are evaluated to decide if repositioning or plan adaptation is necessary in order to achieve proper coverage of the target and sparing of OARs. Besides online dose-based image guidance, the software tool can also map daily treatment doses to the treatment planning CT images for offline adaptive treatment. Results: An in-room helical CT system is commissioned for IGPT purposes. It produces accurate CT numbers that allow proton dose calculation. GPU-based deformable image registration algorithms are developed and evaluated for automatic ROI-delineation and dose mapping. The online and offline IGPT functionalities are evaluated with daily CT images of the proton patients. Conclusion: The online and offline IGPT software tool may improve the safety and quality of proton treatment by allowing dose-based IGPT and adaptive proton treatments. Research is partially supported by Mevion Medical Systems.
Who will be denied Medicare prescription drug subsidies because of the asset test?
Rice, Thomas; Desmond, Katherine
2006-01-01
To determine the number and characteristics of Medicare beneficiaries who will be excluded from low-income prescription drug subsidies because they do not qualify under an asset test. Cross-sectional, using the US Census Bureau's Survey of Income and Program Participation (SIPP); results were based on interviews occurring between October 2002 and January 2003. The sample included 9278 Medicare beneficiaries, 2929 with incomes below 150% of the federal poverty level (FPL). Using SIPP, each sample member's income was compared to the FPL. Assets were adjusted to include only liquid assets and primary residences. The number of individuals excluded by the asset test, their characteristics, and the types of assets responsible were calculated. Of 13.97 million noninstitutionalized Medicare beneficiaries, 2.37 million (17%) with low incomes would be excluded from subsidized drug coverage due to the asset test. Compared to higher-income beneficiaries, the excluded individuals tended to be older, female, widowed, and living alone. Almost half of their assets were checking and savings accounts. Half of the individuals failing the test had assets less than 35,000 dollars above the allowed thresholds. Widows are disproportionately affected by the asset test. When a husband dies, income plummets but accumulated assets often exceed those allowed under Medicare legislation. During their working years Americans are encouraged to save for retirement, but by accumulating modest amounts of assets, these same people often will then not qualify for low-income drug subsidies. Modifying or eliminating the asset test would help protect individuals disadvantaged by low incomes who have modest amounts of asset holdings.
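The two-step screen described above (an income test plus an asset test) can be sketched as follows; the function, dollar figures, and threshold logic are illustrative only, not the statutory rules:

```python
def subsidy_eligible(income, assets, fpl, asset_limit, income_cutoff=1.50):
    """Two-step low-income subsidy screen (logic and numbers illustrative):
    income must fall below a multiple of the federal poverty level AND
    countable assets must not exceed the asset limit."""
    return income < income_cutoff * fpl and assets <= asset_limit

# Illustrative dollar figures: a widow with very low income but modest
# savings fails on assets alone, the pattern the study highlights.
fpl, asset_limit = 9_570, 11_500
print(subsidy_eligible(income=8_000, assets=40_000, fpl=fpl, asset_limit=asset_limit))  # False
print(subsidy_eligible(income=8_000, assets=5_000, fpl=fpl, asset_limit=asset_limit))   # True
```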
Template-based automatic breast segmentation on MRI by excluding the chest region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Muqing; Chen, Jeon-Hor; Wang, Xiaoyong
2013-12-15
Purpose: Methods for quantification of breast density on MRI using semiautomatic approaches are commonly used. In this study, the authors report on a fully automatic chest template-based method. Methods: Nonfat-suppressed breast MR images from 31 healthy women were analyzed. Among them, one case was randomly selected and used as the template, and the remaining 30 cases were used for testing. Unlike most model-based breast segmentation methods that use the breast region as the template, the chest body region on a middle slice was used as the template. Within the chest template, three body landmarks (thoracic spine and bilateral boundary of the pectoral muscle) were identified for performing the initial V-shape cut to determine the posterior lateral boundary of the breast. The chest template was mapped to each subject's image space to obtain a subject-specific chest model for exclusion. On the remaining image, the chest wall muscle was identified and excluded to obtain clean breast segmentation. The chest and muscle boundaries determined on the middle slice were used as the reference for the segmentation of adjacent slices, and the process continued superiorly and inferiorly until all 3D slices were segmented. The segmentation results were evaluated by an experienced radiologist to mark voxels that were wrongly included or excluded for error analysis. Results: The breast volumes measured by the proposed algorithm were very close to the radiologist's corrected volumes, showing a % difference ranging from 0.01% to 3.04% in 30 tested subjects with a mean of 0.86% ± 0.72%. The total error was calculated by adding the inclusion and the exclusion errors (so they did not cancel each other out), which ranged from 0.05% to 6.75% with a mean of 3.05% ± 1.93%. The fibroglandular tissue segmented within the breast region determined by the algorithm and the radiologist were also very close, showing a % difference ranging from 0.02% to 2.52% with a mean of 1.03% ± 1.03%.
The total error by adding the inclusion and exclusion errors ranged from 0.16% to 11.8%, with a mean of 2.89% ± 2.55%. Conclusions: The automatic chest template-based breast MRI segmentation method worked well for cases with different body and breast shapes and different density patterns. Compared to the radiologist-established truth, the mean difference in segmented breast volume was approximately 1%, and the total error by considering the additive inclusion and exclusion errors was approximately 3%. This method may provide a reliable tool for MRI-based segmentation of breast density.
A methodology for automatic intensity-modulated radiation treatment planning for lung cancer
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng
2011-07-01
In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of the radiotherapy (RT). Furthermore, the planning process is complicated and requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process in lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better, or no worse, than the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.
NASA Technical Reports Server (NTRS)
Walsh, T. M.; Morello, S. A.; Reeder, J. P.
1976-01-01
An exercise to support the Federal Aviation Administration in demonstrating the U.S. candidate for an international microwave landing system (MLS) was conducted by NASA. During this demonstration the MLS was utilized to provide the TCV Boeing 737 research airplane with guidance for automatic control during transition from conventional RNAV to MLS RNAV in curved, descending flight; flare; touchdown; and roll-out. Flight profiles, system configuration, displays, and operating procedures used in the demonstration are described, and preliminary results of flight data analysis are discussed. Recent experiences with manually controlled flight in the NAFEC MLS environment are also discussed. The demonstration shows that in automatic three-dimensional flight, the volumetric signal coverage of the MLS can be exploited to enable a commercial carrier class airplane to perform complex curved, descending paths with precision turns into short final approaches terminating in landing and roll-out, even when subjected to strong and gusty tail and cross wind components and severe wind shear.
Image acquisition device of inspection robot based on adaptive rotation regulation of polarizer
NASA Astrophysics Data System (ADS)
Dong, Maoqi; Wang, Xingguang; Liang, Tao; Yang, Guoqing; Zhang, Chuangyou; Gao, Faqin
2017-12-01
An image acquisition device for an inspection robot with adaptive polarization adjustment is proposed. The device comprises the inspection robot body, an image acquisition mechanism, a polarizer, and an automatic polarizer actuating device. The image acquisition mechanism is mounted at the front of the inspection robot body to collect image data of equipment in the substation. The polarizer is fixed on the automatic actuating device and installed in front of the image acquisition mechanism, so that the optical axis of the camera passes perpendicularly through the polarizer and the polarizer rotates about the optical axis of the visible-light camera. The simulation results show that the system solves the image blur caused by glare, light reflection and shadow, and that the robot can observe details of the running status of electrical equipment. Full coverage of the substation inspection robot's observation targets is achieved, which ensures the safe operation of the substation equipment.
Ffuzz: Towards full system high coverage fuzz testing on binary executables
2018-01-01
Bugs and vulnerabilities in binary executables threaten cyber security. Current discovery methods, such as fuzz testing, symbolic execution and manual analysis, each have advantages and disadvantages when exercising the deeper code areas of binary executables to find more bugs. In this paper, we designed and implemented a hybrid automatic bug-finding tool, Ffuzz, on top of fuzz testing and selective symbolic execution. It targets full system software stack testing, including both user space and kernel space. Combining these two mainstream techniques enables us to achieve higher coverage and to avoid getting stuck in either fuzz testing or symbolic execution. We also proposed two key optimizations to improve the efficiency of full system testing. We evaluated the efficiency and effectiveness of our method on real-world binary software and on 844 memory corruption vulnerable programs in the Juliet test suite. The results show that Ffuzz can discover software bugs in the full system software stack effectively and efficiently. PMID:29791469
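The fuzz-testing half of such a hybrid tool can be sketched as a coverage-guided mutation loop. This toy omits Ffuzz's selective symbolic execution (which would solve the hard branch conditions the mutator cannot hit), and all names are illustrative:

```python
import random

def fuzz(target, seed, rounds=2000, seed_rng=0):
    """Minimal coverage-guided mutation loop. `target(data)` returns the
    set of code blocks an input covered; inputs that reach new coverage
    are kept in the corpus and mutated further."""
    rng = random.Random(seed_rng)
    corpus, seen = [seed], set(target(seed))
    for _ in range(rounds):
        data = bytearray(rng.choice(corpus))
        if data:
            data[rng.randrange(len(data))] = rng.randrange(256)
        cov = target(bytes(data))
        if not cov <= seen:          # new coverage: keep this input
            seen |= cov
            corpus.append(bytes(data))
    return corpus, seen

# Toy target: deeper branches guard on a magic prefix; a symbolic-execution
# phase would reach these directly instead of by random mutation.
def toy(data):
    cov = {"entry"}
    if data[:1] == b"F":
        cov.add("branch_F")
        if data[1:2] == b"u":
            cov.add("branch_Fu")
    return cov

corpus, seen = fuzz(toy, b"aa")
```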
Design and Implementation of a Novel Portable 360° Stereo Camera System with Low-Cost Action Cameras
NASA Astrophysics Data System (ADS)
Holdener, D.; Nebiker, S.; Blaser, S.
2017-11-01
The demand for capturing indoor spaces is rising with the digitalization trend in the construction industry. An efficient solution for measuring challenging indoor environments is mobile mapping. Image-based systems with 360° panoramic coverage allow rapid data acquisition, and the imagery can be processed into georeferenced 3D images hosted in cloud-based 3D geoinformation services. For the multiview stereo camera system presented in this paper, 360° coverage is achieved with a layout consisting of five horizontal stereo image pairs in a circular arrangement. The design is implemented as a low-cost solution based on a 3D-printed camera rig and action cameras with fisheye lenses. The fisheye stereo system is successfully calibrated with accuracies sufficient for the applied measurement task. A comparison of 3D distances with reference data yields maximum deviations of 3 cm over typical indoor distances of 2-8 m. The automatic computation of coloured point clouds from the stereo pairs is also demonstrated.
Perceiving pain in others: validation of a dual processing model.
McCrystal, Kalie N; Craig, Kenneth D; Versloot, Judith; Fashler, Samantha R; Jones, Daniel N
2011-05-01
Accurate perception of another person's painful distress would appear to be accomplished through sensitivity to both automatic (unintentional, reflexive) and controlled (intentional, purposive) behavioural expression. We examined whether observers would construe diverse behavioural cues as falling within these domains, consistent with cognitive neuroscience findings describing activation of both automatic and controlled neuroregulatory processes. Using online survey methodology, 308 research participants rated behavioural cues as "goal directed vs. non-goal directed," "conscious vs. unconscious," "uncontrolled vs. controlled," "fast vs. slow," "intentional (deliberate) vs. unintentional," "stimulus driven (obligatory) vs. self driven," and "requiring contemplation vs. not requiring contemplation." The behavioural cues were the 39 items provided by the PROMIS pain behaviour bank, constructed to be representative of the diverse possibilities for pain expression. Inter-item correlations among rating scales provided evidence of sufficient internal consistency justifying a single score on an automatic/controlled dimension (excluding the inconsistent fast vs. slow scale). An initial exploratory factor analysis on 151 participant data sets yielded factors consistent with "controlled" and "automatic" actions, as well as behaviours characterized as "ambiguous." A confirmatory factor analysis using the remaining 151 data sets replicated EFA findings, supporting theoretical predictions that observers would distinguish immediate, reflexive, and spontaneous reactions (primarily facial expression and paralinguistic features of speech) from purposeful and controlled expression (verbal behaviour, instrumental behaviour requiring ongoing, integrated responses). There are implicit dispositions to organize cues signaling pain in others into the well-defined categories predicted by dual process theory. Copyright © 2011 International Association for the Study of Pain. 
Junger, Axel; Brenck, Florian; Hartmann, Bernd; Klasen, Joachim; Quinzio, Lorenzo; Benson, Matthias; Michel, Achim; Röhrig, Rainer; Hempelmann, Gunter
2004-07-01
The most recent approach to estimating nursing resource consumption has led to the generation of the Nine Equivalents of Nursing Manpower use Score (NEMS). The objective of this prospective study was to establish a completely automatically generated calculation of the NEMS using a patient data management system (PDMS) database and to validate this approach by comparing the results with those of the conventional manual method. Prospective study. Operative intensive care unit of a university hospital. Patients admitted to the ICU between 24 July 2002 and 22 August 2002. Patients under the age of 16 years, and patients undergoing cardiovascular surgery or with burn injuries, were excluded. None. The NEMS of all patients was calculated automatically with a PDMS and manually by a physician in parallel. The results of the two methods were compared using the Bland-Altman approach, the intraclass correlation coefficient (ICC), and the kappa statistic. On 20 consecutive working days, the NEMS was calculated in 204 cases. The Bland-Altman analysis did not show significant differences in NEMS scoring between the two methods. The ICC (95% confidence interval) of 0.87 (0.84-0.90) revealed high inter-rater agreement between the PDMS and the physician. The kappa statistic showed good results (kappa>0.55) for all NEMS items apart from the item "supplementary ventilatory care". This study demonstrates that automatic calculation of the NEMS is possible with high accuracy by means of a PDMS. This may lead to a decrease in the consumption of nursing resources.
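The Bland-Altman comparison used above can be sketched as follows, with toy paired scores standing in for the PDMS and physician NEMS ratings:

```python
import statistics

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired scores.
    Returns (bias, lower_loa, upper_loa); agreement is good when the bias
    is near zero and the limits are clinically acceptable."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy paired NEMS scores (illustrative numbers, not study data)
pdms      = [27, 32, 18, 27, 41, 32, 27, 18]
physician = [27, 32, 23, 27, 41, 27, 27, 18]
bias, lower, upper = bland_altman(pdms, physician)
```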
Automatically inserted technical details improve radiology report accuracy.
Abujudeh, Hani H; Govindan, Siddharth; Narin, Ozden; Johnson, Jamlik Omari; Thrall, James H; Rosenthal, Daniel I
2011-09-01
To assess the effect of automatically inserted technical details on the concordance of a radiology report header with the actual procedure performed. The study was IRB approved and informed consent was waived. We obtained radiology report audit data from the hospital's compliance office from the period of January 2005 through December 2009, spanning a total of 20 financial quarters. A "discordance percentage" was defined as the percentage of total studies in which a procedure code change was made during auditing. Using chi-square analysis, we compared discordance percentages between reports with manually inserted technical details (MITD) and automatically inserted technical details (AITD). The second-quarter data of 2007 were not included in the analysis, as the switch from MITD to AITD occurred during this quarter. The hospital's compliance office audited 9,110 studies from 2005-2009. Excluding the 564 studies in the second quarter of 2007, we analyzed a total of 8,546 studies, 3,948 with MITD and 4,598 with AITD. The discordance percentage in the MITD group was 3.95% (156/3,948; range per quarter, 1.5-6.1%). The AITD discordance percentage was 1.37% (63/4,598; range per quarter, 0.0-2.6%). A chi-square analysis determined a statistically significant difference between the 2 groups (P < 0.001). There was a statistically significant improvement in the concordance of a radiology report header with the performed procedure using automatically inserted technical details compared to manually inserted details. Copyright © 2011 American College of Radiology. Published by Elsevier Inc. All rights reserved.
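The chi-square comparison of the two discordance percentages can be reproduced from the reported counts. A minimal sketch using the standard 2x2 Pearson statistic (no continuity correction, which the paper may or may not have applied):

```python
def chi_square_2x2(changed_a, total_a, changed_b, total_b):
    """Pearson chi-square statistic for comparing two proportions,
    e.g. MITD vs. AITD procedure-code change rates."""
    a, b = changed_a, total_a - changed_a
    c, d = changed_b, total_b - changed_b
    n = total_a + total_b
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Counts from the audit above: 156/3,948 MITD vs. 63/4,598 AITD changes
stat = chi_square_2x2(156, 3948, 63, 4598)  # roughly 56.7 (df = 1, P < 0.001)
```

A statistic this far above the df = 1 critical value of 10.83 is consistent with the reported P < 0.001.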
Sharfo, Abdul Wahab M; Breedveld, Sebastiaan; Voet, Peter W J; Heijkoop, Sabrina T; Mens, Jan-Willem M; Hoogeman, Mischa S; Heijmen, Ben J M
2016-01-01
To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-beam IMRT plans (CLINICAL), and to dual-arc VMAT plans generated manually by an expert planner (manVMAT). Furthermore, all plans were benchmarked against 20-beam equi-angular IMRT plans (autoIMRT). For all plans, a PTV coverage of 99.5% by at least 95% of the prescribed dose (46 Gy) had the highest planning priority, followed by minimization of V45Gy for small bowel (SB). Other OARs considered were bladder, rectum, and sigmoid. All plans had a highly similar PTV coverage, within the clinical constraints (above). After plan normalizations for exactly equal median PTV doses in corresponding plans, all evaluated OAR parameters in autoVMAT plans were on average lower than in the CLINICAL plans with an average reduction in SB V45Gy of 34.6% (p<0.001). For 41/44 autoVMAT plans, SB V45Gy was lower than for manVMAT (p<0.001, average reduction 30.3%), while SB V15Gy increased by 2.3% (p = 0.011). AutoIMRT reduced SB V45Gy by another 2.7% compared to autoVMAT, while also resulting in a 9.0% reduction in SB V15Gy (p<0.001), but with a prolonged delivery time. Differences between manVMAT and autoVMAT in bladder, rectal and sigmoid doses were ≤ 1%. Improvements in SB dose delivery with autoVMAT instead of manVMAT were higher for empty bladder PTVs compared to full bladder PTVs, due to differences in concavity of the PTVs. Quality of automatically generated VMAT plans was superior to manually generated plans. Automatic VMAT plan generation for cervical cancer has been implemented in our clinical routine. 
Due to the achieved workload reduction, extension of plan libraries has become feasible.
MeSH indexing based on automatically generated summaries
2013-01-01
Background MEDLINE citations are manually indexed at the U.S. National Library of Medicine (NLM) using as reference the Medical Subject Headings (MeSH) controlled vocabulary. For this task, the human indexers read the full text of the article. Due to the growth of MEDLINE, the NLM Indexing Initiative explores indexing methodologies that can support the task of the indexers. Medical Text Indexer (MTI) is a tool developed by the NLM Indexing Initiative to provide MeSH indexing recommendations to indexers. Currently, the input to MTI is MEDLINE citations, title and abstract only. Previous work has shown that using full text as input to MTI increases recall, but decreases precision sharply. We propose using summaries generated automatically from the full text for the input to MTI to use in the task of suggesting MeSH headings to indexers. Summaries distill the most salient information from the full text, which might increase the coverage of automatic indexing approaches based on MEDLINE. We hypothesize that if the results were good enough, manual indexers could possibly use automatic summaries instead of the full texts, along with the recommendations of MTI, to speed up the process while maintaining high quality of indexing results. Results We have generated summaries of different lengths using two different summarizers, and evaluated the MTI indexing on the summaries using different algorithms: MTI, individual MTI components, and machine learning. The results are compared to those of full text articles and MEDLINE citations. Our results show that automatically generated summaries achieve similar recall but higher precision compared to full text articles. Compared to MEDLINE citations, summaries achieve higher recall but lower precision. Conclusions Our results show that automatic summaries produce better indexing than full text articles. 
Summaries produce similar recall to full text but much better precision, which seems to indicate that automatic summaries can efficiently capture the most important contents within the original articles. The combination of MEDLINE citations and automatically generated summaries could improve the recommendations suggested by MTI. On the other hand, indexing performance might be dependent on the MeSH heading being indexed. Summarization techniques could thus be considered as a feature selection algorithm that might have to be tuned individually for each MeSH heading. PMID:23802936
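The recall/precision trade-offs discussed above reduce to set comparisons between the suggested headings and the indexer's gold standard. A minimal sketch with hypothetical MeSH terms:

```python
def precision_recall(suggested, gold):
    """Precision and recall of suggested MeSH headings against the
    indexer's gold-standard set (heading names are illustrative)."""
    suggested, gold = set(suggested), set(gold)
    tp = len(suggested & gold)
    precision = tp / len(suggested) if suggested else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

p, r = precision_recall({"Humans", "Neoplasms", "Aged"},
                        {"Humans", "Neoplasms", "Adult", "Risk Factors"})
# p ≈ 0.667 (2 of 3 suggestions correct), r = 0.5 (2 of 4 gold headings found)
```

Feeding MTI a longer input (full text) tends to enlarge the suggested set, which raises recall but dilutes precision, exactly the trade-off the study measures.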
Automated Testcase Generation for Numerical Support Functions in Embedded Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Schnieder, Stefan-Alexander
2014-01-01
We present a tool for the automatic generation of test stimuli for small numerical support functions, e.g., code for trigonometric functions, quaternions, filters, or table lookup. Our tool is based on KLEE to produce a set of test stimuli for full path coverage. We use a method of iterative deepening over abstractions to deal with floating-point values. During actual testing the stimuli exercise the code against a reference implementation. We illustrate our approach with results of experiments with low-level trigonometric functions, interpolation routines, and mathematical support functions from an open source UAS autopilot.
KML Super Overlay to WMS Translator
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This translator is a server-based application that automatically generates KML super overlay configuration files required by Google Earth for map data access via the Open Geospatial Consortium WMS (Web Map Service) standard. The translator uses a set of URL parameters that mirror the WMS parameters as much as possible, and it also can generate a super overlay subdivision of any given area that is only loaded when needed, enabling very large areas of coverage at very high resolutions. It can make almost any dataset available as a WMS service visible and usable in any KML application, without the need to reformat the data.
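The super-overlay pattern the translator emits can be sketched as KML NetworkLink elements whose Region and Lod defer loading until a tile covers enough screen pixels. The endpoint, layer name, and exact markup below are illustrative, not the translator's actual output:

```python
def tile_network_link(wms_base, layer, north, south, east, west, min_lod=128):
    """Build one KML NetworkLink whose Region defers loading until the tile
    occupies at least `min_lod` pixels on screen (super-overlay pattern).
    The WMS endpoint and layer name are placeholders."""
    bbox = f"{west},{south},{east},{north}"
    href = (f"{wms_base}?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
            f"&LAYERS={layer}&SRS=EPSG:4326&BBOX={bbox}"
            f"&WIDTH=256&HEIGHT=256&FORMAT=image/png")
    return f"""<NetworkLink>
  <Region>
    <LatLonAltBox><north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west></LatLonAltBox>
    <Lod><minLodPixels>{min_lod}</minLodPixels></Lod>
  </Region>
  <Link><href>{href}</href></Link>
</NetworkLink>"""

k = tile_network_link("https://example.org/wms", "elevation", 45, 44, -109, -110)
```

Recursively subdividing each tile into four children, each with its own Region, gives the on-demand loading that makes very large, high-resolution coverages practical.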
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; Amerio, S.; Amidei, D.
2014-07-23
We perform a search for new physics using final states consisting of three leptons and a large imbalance in transverse momentum resulting from proton-antiproton collisions at 1.96 TeV center-of-mass energy. We use data corresponding to 5.8 fb⁻¹ of integrated luminosity recorded by the CDF II detector at the Tevatron collider. Our main objective is to investigate possible new low-momentum (down to 5 GeV/c) multi-leptonic final states not investigated by LHC experiments. Relative to previous CDF analyses, we expand the geometric and kinematic coverage of electrons and muons and utilize tau leptons that decay hadronically. Inclusion of tau leptons is particularly important for supersymmetry (SUSY) searches. The results are consistent with standard-model predictions. By optimizing our event selection to increase sensitivity to the minimal supergravity (mSUGRA) SUSY model, we set limits on the associated production of chargino and neutralino, the SUSY partners of the electroweak gauge bosons. We exclude cross sections up to 0.1 pb and chargino masses up to 168 GeV/c² at 95% CL, for a suitable set of mSUGRA parameters. We also exclude a region of the two-dimensional space of the masses of the neutralino and the supersymmetric partner of the tau lepton, not previously excluded at the Tevatron.
Gallucci, Marcello; Ricciardelli, Paola
2018-01-01
Social exclusion is a painful experience that is felt as a threat to the human need to belong and can lead to increased aggressive and anti-social behaviours, and results in emotional and cognitive numbness. Excluded individuals also seem to show an automatic tuning to positivity: they tend to increase their selective attention towards social acceptance signals. Despite these effects known in the literature, the consequences of social exclusion on social information processing still need to be explored in depth. The aim of this study was to investigate the effects of social exclusion on processing two features that are strictly bound in the appraisal of the meaning of facial expressions: gaze direction and emotional expression. In two experiments (N = 60, N = 45), participants were asked to identify gaze direction or emotional expressions from facial stimuli, in which both these features were manipulated. They performed these tasks in a four-block crossed design after being socially included or excluded using the Cyberball game. Participants’ empathy and self-reported emotions were recorded using the Empathy Quotient (EQ) and PANAS questionnaires. The Need Threat Scale and three additional questions were also used as manipulation checks in the second experiment. In both experiments, excluded participants were less accurate than included participants in gaze direction discrimination. Modulatory effects of direct gaze (Experiment 1) and sad expression (Experiment 2) on the effects of social exclusion were found on response times (RTs) in the emotion recognition task. Specific differences in the reaction to social exclusion between males and females were also found in Experiment 2: excluded male participants tended to be less accurate and faster than included male participants, while excluded females showed a more accurate and slower performance than included female participants. No influence of social exclusion on PANAS or EQ scores was found.
Results are discussed in the context of the importance of identifying gaze direction in appraisal theories. PMID:29617410
Mars global digital dune database: MC-30
Hayward, R.K.; Fenton, L.K.; Titus, T.N.; Colaprete, A.; Christensen, P.R.
2012-01-01
The Mars Global Digital Dune Database (MGD3) provides data and describes the methodology used in creating the global database of moderate- to large-size dune fields on Mars. The database is being released in a series of U.S. Geological Survey Open-File Reports. The first report (Hayward and others, 2007) included dune fields from lat 65° N. to 65° S. (http://pubs.usgs.gov/of/2007/1158/). The second report (Hayward and others, 2010) included dune fields from lat 60° N. to 90° N. (http://pubs.usgs.gov/of/2010/1170/). This report encompasses ~75,000 km2 of mapped dune fields from lat 60° to 90° S. The dune fields included in this global database were initially located using Mars Odyssey Thermal Emission Imaging System (THEMIS) Infrared (IR) images. In the previous two reports, some dune fields may have been unintentionally excluded for two reasons: (1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields or (2) resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. In this report, mapping is more complete. The Arizona State University THEMIS daytime IR mosaic provided complete IR coverage, and it is unlikely that we missed any large dune fields in the South Pole (SP) region. In addition, the increased availability of higher resolution images resulted in the inclusion of more small (~1 km2) sand dune fields and sand patches. To maintain consistency with the previous releases, we have identified the sand features that would not have been included in earlier releases. While the moderate to large dune fields in MGD3 are likely to constitute the largest compilation of sediment on the planet, we acknowledge that our database excludes numerous small dune fields and some moderate to large dune fields as well. Please note that the absence of mapped dune fields does not mean that dune fields do not exist and is not intended to imply a lack of saltating sand in other areas. 
Where availability and quality of THEMIS visible (VIS), Mars Orbiter Camera (MOC) narrow angle, Mars Express High Resolution Stereo Camera, or Mars Reconnaissance Orbiter Context Camera and High Resolution Imaging Science Experiment images allowed, we classified dunes and included some dune slipface measurements, which were derived from gross dune morphology and represent the approximate prevailing wind direction at the last time of significant dune modification. It was beyond the scope of this report to look at the detail needed to discern subtle dune modification. It was also beyond the scope of this report to measure all slipfaces. We attempted to include enough slipface measurements to represent the general circulation (as implied by gross dune morphology) and to give a sense of the complex nature of aeolian activity on Mars. The absence of slipface measurements in a given direction should not be taken as evidence that winds in that direction did not occur. When a dune field was located within a crater, the azimuth from crater centroid to dune field centroid was calculated, as another possible indicator of wind direction. Output from a general circulation model is also included. In addition to polygons locating dune fields, the database includes ~700 of the THEMIS VIS and MOC images that were used to build the database.
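The azimuth from a crater centroid to a dune-field centroid, used above as a possible wind-direction indicator, can be sketched as a spherical bearing calculation. This is a minimal illustration of the geometry, not MGD3's actual processing code, and the function name and argument order are assumptions:

```python
import math

def azimuth_deg(crater_lon, crater_lat, dune_lon, dune_lat):
    """Bearing (degrees clockwise from north) from the crater centroid
    to the dune-field centroid, treating Mars as a sphere."""
    lon1, lat1, lon2, lat2 = map(
        math.radians, (crater_lon, crater_lat, dune_lon, dune_lat))
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```

A dune field due north of the crater centroid yields an azimuth of 0 degrees, due east 90 degrees, and so on.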
NASA Astrophysics Data System (ADS)
Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.
2017-08-01
In the article the verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the labour intensity of this process by developing a hardware and software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the controller under verification and the reference channels of the calibrator according to the specified algorithm. The software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors and compiles protocols. The system can be used for checking the controllers of the secondary equipment of oil metering units in automatic verification mode (when an open communication protocol is available) or in semi-automatic mode (when it is not). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering-unit secondary equipment. Automatic verification with this hardware and software system shortens verification time by a factor of 5 to 10 and increases measurement reliability by excluding the influence of the human factor.
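The error-calculation and protocol step described above can be sketched as follows. This is a minimal illustration under assumed conventions (reduced error expressed as a percentage of channel span, checked against a permitted limit); the function and field names are hypothetical, not the system's actual API:

```python
def verify_channel(setpoints, readings, span, limit_pct):
    """Compare controller readings against calibrator setpoints.

    Returns a per-point protocol of reduced (percent-of-span) errors
    and a pass/fail verdict against the permitted error limit.
    """
    protocol = []
    for ref, meas in zip(setpoints, readings):
        err_pct = 100.0 * (meas - ref) / span   # reduced error, % of span
        protocol.append({"ref": ref, "meas": meas, "error_pct": err_pct,
                         "ok": abs(err_pct) <= limit_pct})
    verdict = all(p["ok"] for p in protocol)
    return protocol, verdict
```

In an automated run, the setpoints would be written to the calibrator and the readings polled from the controller over its communication protocol; here both are passed in directly.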
Kammermeier, Jochen; Drury, Suzanne; James, Chela T; Dziubak, Robert; Ocaka, Louise; Elawad, Mamoun; Beales, Philip; Lench, Nicholas; Uhlig, Holm H; Bacchelli, Chiara; Shah, Neil
2014-11-01
Multiple monogenetic conditions with partially overlapping phenotypes can present with inflammatory bowel disease (IBD)-like intestinal inflammation. With novel genotype-specific therapies emerging, establishing a molecular diagnosis is becoming increasingly important. We have introduced targeted next-generation sequencing (NGS) technology as a prospective screening tool in children with very early onset IBD (VEOIBD). We evaluated the coverage of 40 VEOIBD genes in two separate cohorts undergoing targeted gene panel sequencing (TGPS) (n=25) and whole exome sequencing (WES) (n=20). TGPS revealed causative mutations in four genes (IL10RA, EPCAM, TTC37 and SKIV2L), uncovered unexpected phenotypes, and directly influenced clinical decision making by both supporting and avoiding haematopoietic stem cell transplantation. TGPS resulted in significantly higher median coverage than WES, fewer coverage deficiencies, and improved variant detection across established VEOIBD genes. Excluding or confirming known VEOIBD genotypes should be considered early in the disease course in all cases of therapy-refractory VEOIBD, as it can have a direct impact on patient management. Combining the two NGS technologies would compensate for the limitations of WES in disease-specific applications while preserving the opportunity for novel gene discovery in the research setting. Published by the BMJ Publishing Group Limited.
Out-of-pocket costs and insurance coverage for abortion in the United States.
Roberts, Sarah C M; Gould, Heather; Kimport, Katrina; Weitz, Tracy A; Foster, Diana Greene
2014-01-01
Since 1976, federal Medicaid has excluded abortion care except in a small number of circumstances; 17 states provide this coverage using state Medicaid dollars. Since 2010, federal and state restrictions on insurance coverage for abortion have increased. This paper describes payment for abortion care before new restrictions among a sample of women receiving first and second trimester abortions. Data are from the Turnaway Study, a study of women seeking abortion care at 30 facilities across the United States. Two thirds received financial assistance, with those with pregnancies at later gestations more likely to receive assistance. Seven percent received funding from private insurance, 34% state Medicaid, and 29% other organizations. Median out-of-pocket costs when private insurance or Medicaid paid were $18 and $0. Median out-of-pocket cost for women for whom insurance or Medicaid did not pay was $575. For more than half, out-of-pocket costs were equivalent to more than one-third of monthly personal income; this was closer to two thirds among those receiving later abortions. One quarter who had private insurance had their abortion covered through insurance. Among women possibly eligible for Medicaid based on income and residence, more than one third received Medicaid coverage for the abortion. More than half reported cost as a reason for delay in obtaining an abortion. In a multivariate analysis, living in a state where Medicaid for abortion was available, having Medicaid or private insurance, being at a lower gestational age, and higher income were associated with lower odds of reporting cost as a reason for delay. Out-of-pocket costs for abortion care are substantial for many women, especially at later gestations. There are significant gaps in public and private insurance coverage for abortion. Copyright © 2014 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
Structural Validation of Nursing Terminologies
Hardiker, Nicholas R.; Rector, Alan L.
2001-01-01
Objective: The purpose of the study is twofold: 1) to explore the applicability of combinatorial terminologies as the basis for building enumerated classifications, and 2) to investigate the usefulness of formal terminological systems for performing such classification and for assisting in the refinement of both combinatorial terminologies and enumerated classifications. Design: A formal model of the beta version of the International Classification for Nursing Practice (ICNP) was constructed in the compositional terminological language GRAIL (GALEN Representation and Integration Language). Terms drawn from the North American Nursing Diagnosis Association Taxonomy I (NANDA taxonomy) were mapped into the model and classified automatically using GALEN technology. Measurements: The resulting generated hierarchy was compared with the NANDA taxonomy to assess coverage and accuracy of classification. Results: In terms of coverage, in this study ICNP was able to capture 77 percent of NANDA terms using concepts drawn from five of its eight axes. Three axes—Body Site, Topology, and Frequency—were not needed. In terms of accuracy, where hierarchic relationships existed in the generated hierarchy or the NANDA taxonomy, or both, 6 were identical, 19 existed in the generated hierarchy alone (2 of these were considered suitable for incorporation into the NANDA taxonomy and 17 were considered inaccurate), and 23 appeared in the NANDA taxonomy alone (8 of these were considered suitable for incorporation into ICNP, 9 were considered inaccurate, and 6 reflected different, equally valid perspectives). Sixty terms appeared at the top level, with no indenting, in both the generated hierarchy and the NANDA taxonomy. Conclusions: With appropriate refinement, combinatorial terminologies such as ICNP have the potential to provide a useful foundation for representing enumerated classifications such as NANDA. 
Technologies such as GALEN make possible the process of building automatically enumerated classifications while providing a useful means of validating and refining both combinatorial terminologies and enumerated classifications. PMID:11320066
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistics are subsequently recorded as important metadata in the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, re-examination of non-cloudy pixels, and a cross-band filter method are applied in sequence to determine the cloud statistics. For post-processing analysis, the box-counting fractal method is applied. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conduct a series of experiments on the clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both outperform the others on Formosat-2 imagery. With Otsu's method selected as the thresholding method, our proposed ACCA method successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
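Otsu's method, which the experiments above select as the thresholding step, chooses the gray level that maximizes the between-class variance of the image histogram. A minimal NumPy sketch (illustrative only, not the paper's implementation):

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Otsu's method: return the threshold that maximizes the
    between-class variance of the gray-level histogram."""
    hist, edges = np.histogram(np.asarray(image).ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()          # normalized histogram
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                            # class-0 probability
    w1 = 1.0 - w0
    mu = np.cumsum(p * centers)                  # cumulative mean
    mu_t = mu[-1]                                # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    sigma_b = np.nan_to_num(sigma_b)             # empty classes score 0
    return centers[np.argmax(sigma_b)]
```

Pixels above the returned threshold would be labeled cloudy in the pre-processing stage; on a strongly bimodal histogram the threshold falls between the two modes.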
Rethinking the western construction of the welfare state.
Walker, A; Wong, C K
1996-01-01
This article employs case studies of China and Hong Kong to question the western ethnocentric construction of the welfare state that predominates in comparative social policy research. The authors argue that welfare regimes, and particularly the "welfare state," have been constructed as capitalist-democratic projects and that this has the damaging effect of excluding from analyses not only several advanced capitalist societies in the Asian-Pacific area but also the world's most populous country. If welfare state regimes can only coexist with western political democracies, then China and Hong Kong are excluded automatically. A similar result occurs if the traditional social administration approach is adopted whereby a "welfare state" is defined in terms only of direct state provision. The authors argue that such assumptions are untenable if state welfare is to be analyzed as a universal phenomenon. Instead of being trapped within an ethnocentric welfare statism, what social policy requires is a global political economy perspective that facilitates comparisons of the meaning of welfare and the state's role in producing it north, south, east and west.
The ethics of insurance limiting institutional medical care: It's all about the money.
Jones, James W; McCullough, Laurence B
2016-04-01
Dr F. Inest practices surgery at a renowned medical center but is concerned because increasing numbers of medical insurers are excluding his institution from coverage. Many of his former referring physicians are beginning to send their patients elsewhere for this reason. The marketing people have been busy increasing their advertising buys and exploring new business models. There is even talk about reducing expensive clinical trials. However, regardless of his affiliation, he has little control over these and other organizational decisions that directly impact his practice clinically and fiscally. What should he do? Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms
Gao, Connie W.; Allen, Joshua W.; Green, William H.; ...
2016-02-24
Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
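The rate-based species selection mentioned above can be sketched as follows: a candidate ("edge") species is promoted into the core model when its net formation flux exceeds a tolerance fraction of the characteristic flux. This is a schematic of the idea, not RMG's actual code, and the function name, data shapes, and default tolerance are assumptions:

```python
def enlarge_core(core, edge_fluxes, epsilon=0.01):
    """Rate-based model enlargement (sketch): an edge species is
    promoted into the core model when the magnitude of its net
    formation flux exceeds a fraction `epsilon` of the
    characteristic (maximum) edge flux."""
    if not edge_fluxes:
        return set(core)
    r_char = max(abs(f) for f in edge_fluxes.values())
    promoted = {s for s, f in edge_fluxes.items()
                if abs(f) > epsilon * r_char}
    return set(core) | promoted
```

In a full generator this step would run inside a loop: simulate the current core model, recompute edge fluxes, enlarge, and repeat until no edge species exceeds the tolerance.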
A methodology for post-mainshock probabilistic assessment of building collapse risk
Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.
2011-01-01
This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
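The risk integral described above couples a hazard curve (mean annual rate of exceeding a ground motion intensity) with a fragility curve (probability of collapse given that intensity). A minimal numerical sketch of that coupling, with hypothetical function names and a simple binned discretization:

```python
import math

def collapse_rate(im_grid, hazard_rate, fragility):
    """Numerically evaluate the risk integral

        lambda_collapse = integral of P(C | IM = x) * |d lambda(x)|

    where `hazard_rate(x)` is the mean annual rate of exceeding
    intensity measure x and `fragility(x)` is the probability of
    collapse given IM = x."""
    total = 0.0
    for lo, hi in zip(im_grid[:-1], im_grid[1:]):
        d_lambda = hazard_rate(lo) - hazard_rate(hi)   # rate falling in bin
        x_mid = 0.5 * (lo + hi)
        total += fragility(x_mid) * d_lambda
    return total
```

In the post-mainshock setting, `hazard_rate` would be replaced by a time-dependent aftershock hazard curve and `fragility` by the fragility of the (possibly damaged) structure; the integral itself is unchanged.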
Automatic characterization of neointimal tissue by intravascular optical coherence tomography.
Ughi, Giovanni J; Steigerwald, Kristin; Adriaenssens, Tom; Desmet, Walter; Guagliumi, Giulio; Joner, Michael; D'hooge, Jan
2014-02-01
Intravascular optical coherence tomography (IVOCT) is rapidly becoming the method of choice for assessing vessel healing after stent implantation due to its unique axial resolution <20 μm. The amount of neointimal coverage is an important parameter. In addition, the characterization of neointimal tissue maturity is also of importance for an accurate analysis, especially in the case of drug-eluting and bioresorbable stent devices. Previous studies indicated that well-organized mature neointimal tissue appears as a high-intensity, smooth, and homogeneous region in IVOCT images, while lower-intensity signal areas might correspond to immature tissue mainly composed of acellular material. A new method for automatic neointimal tissue characterization, based on statistical texture analysis and a supervised classification technique, is presented. Algorithm training and validation were obtained through the use of 53 IVOCT images supported by histology data from atherosclerotic New Zealand White rabbits. A pixel-wise classification accuracy of 87% and a two-dimensional region-based analysis accuracy of 92% (with sensitivity and specificity of 91% and 93%, respectively) were found, suggesting that a reliable automatic characterization of neointimal tissue was achieved. This may potentially expand the clinical value of IVOCT in assessing the completeness of stent healing and speed up the current analysis methodologies (which are, due to their time- and energy-consuming character, not suitable for application in large clinical trials and clinical practice), potentially allowing for a wider use of IVOCT technology.
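The combination of statistical texture features and supervised classification described above can be illustrated schematically. The sketch below uses simple first-order features (mean intensity, standard deviation, histogram entropy) and a nearest-centroid classifier; the paper's actual feature set and classifier are not specified here, so all names and choices are illustrative assumptions:

```python
import numpy as np

def texture_features(patch, nbins=32):
    """First-order statistical texture features of an image patch
    (intensities assumed normalized to [0, 1]): mean, standard
    deviation, and histogram entropy."""
    p, _ = np.histogram(patch, bins=nbins, range=(0.0, 1.0))
    p = p / p.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return np.array([patch.mean(), patch.std(), entropy])

class NearestCentroid:
    """Minimal supervised classifier over feature vectors."""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, lab in zip(X, y) if lab == c], axis=0)
            for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]
```

Under the observation quoted above, mature neointima (bright, homogeneous) yields high mean and low variance/entropy, while immature tissue yields the opposite, so even these simple features separate the two classes.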
Xiong, Juyang; Hipgrave, David; Myklebust, Karoline; Guo, Sufang; Scherpbier, Robert W; Tong, Xuetao; Yao, Lan; Moran, Andrew E
2013-11-01
China embarked on an ambitious health system reform in 2009, and pledged to achieve universal health insurance coverage by 2020. However, there are gaps in access to healthcare for some children in China. We assessed health insurance status and associated variables among children under five in twelve communities in 2010: two urban community health centers and two rural township health centers in each of three municipalities located in China's distinctly different East, Central and Western regions. Information on demographic and socio-economic variables and children's insurance status was gathered from parents or caregivers of all children enrolled in local health programs, and others recruited from the local communities. Only 62% of 1131 children assessed were insured. This figure did not vary across geographic regions, but urban children were less likely to be insured than rural children. In multivariate analysis, infants were 2.44 times more likely to be uninsured than older children and children having at least one migrant parent were 1.90 times more likely to be uninsured than those living with non-migrant parents. Low maternal education was also associated with being uninsured. Gaps in China's child health insurance coverage might be bridged if newborns are automatically covered from birth, and if insurance is extended to all urban migrant children, regardless of the family's residential registration status and size. Copyright © 2013 Elsevier Ltd. All rights reserved.
Helicopter In-Flight Tracking System (HITS) for the Gulf of Mexico
NASA Technical Reports Server (NTRS)
Martone, Patrick; Tucker, George; Aiken, Edwin W. (Technical Monitor)
2001-01-01
The National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) is sponsoring deployment and testing of the Helicopter In-flight Tracking System (HITS) in a portion of the Gulf of Mexico offshore area. Using multilateration principles, HITS determines the location and altitude of all transponder-equipped aircraft without requiring changes to current Mode A, C or S avionics. HITS tracks both rotary and fixed-wing aircraft operating in the 8,500 sq. mi. coverage region. The minimum coverage altitude of 100 ft. is beneficial for the petroleum industry, allowing helicopters to be tracked onto the pad of most derricks. In addition to multilateration, HITS provides surveillance reports for aircraft equipped for Automatic Dependent Surveillance - Broadcast (ADS-B), a new surveillance system under development by the Federal Aviation Administration (FAA). The U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe Center) is supporting NASA in managing HITS installation and operation, and in evaluating the system's effectiveness. Senses Corporation is supplying, installing and maintaining the HITS ground system. Project activities are being coordinated with the FAA and local helicopter operators. Flight-testing in the Gulf will begin in early 2002. This paper describes the HITS project - specifically, the system equipment (architecture, remote sensors, central processing system at Intracoastal City, LA, and communications) and its performance (accuracy, coverage, and reliability). The paper also presents preliminary results of flight tests.
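Multilateration locates an emitter from time-difference-of-arrival (TDOA) measurements at several ground receivers. The sketch below solves the 2D problem with Gauss-Newton iteration; it illustrates the principle only and is not HITS's implementation (function names, units, and the choice of a reference station are assumptions):

```python
import numpy as np

def multilaterate(stations, tdoa, x0, iters=20):
    """Gauss-Newton solution for an emitter position from TDOA.

    `stations` is an (n, 2) array-like of receiver coordinates;
    `tdoa[i-1]` is the range difference d(p, s_i) - d(p, s_0)
    (measured TDOA times propagation speed) for i = 1..n-1;
    `x0` is an initial position guess."""
    p = np.asarray(x0, dtype=float)
    S = np.asarray(stations, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(S - p, axis=1)          # ranges to all stations
        r = (d[1:] - d[0]) - np.asarray(tdoa)      # residuals
        # Jacobian of (d_i - d_0) with respect to p
        J = (p - S[1:]) / d[1:, None] - (p - S[0]) / d[0]
        p = p - np.linalg.lstsq(J, r, rcond=None)[0]
    return p
```

With four receivers in good geometry, three independent range differences over-determine the two position unknowns, so the least-squares update converges rapidly from a rough initial guess.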
[Quality of the Early Cervical Cancer Detection Program in the State of Nuevo León].
Salinas-Martínez, A M; Villarreal-Ríos, E; Garza-Elizondo, M E; Fraire-Gloria, J M; López-Franco, J J; Barboza-Quintana, O
1997-01-01
To determine the quality of the Early Cervical Cancer Detection Program in the state of Nuevo León, a random sample of 4791 cytologic reports issued by the early cervical cancer detection modules of the State Ministry of Health, the University Hospital, and the Mexican Institute for Social Security was analyzed. Pap tests of women with hysterectomy, current pregnancy, menopause, or a positive result were excluded. Quality was measured against previously defined standards. Besides univariate statistics, the analysis included tests of significance for proportions and means. The quality of the program was fairly satisfactory at the state level. The quality of the sampling procedure was low: only 39.9% of the tests contained endocervical cells. Quality of coverage was low: 15.6% were women aged 25 years or older with a first-time Pap test. Quality of timeliness was high: 8.5 +/- 7 weekdays elapsed between the date of the Pap smear and the interpretation date. Strategies are needed to increase the impact of the state program, such as improving the sampling procedure and the coverage.
Russell, Margaret L; Thurston, Wilfreda E; Henderson, Elizabeth A
2003-10-01
Low rates of staff influenza vaccine coverage occur in many health care facilities. Many programs do not offer vaccination to physicians or to volunteers, and some programs do not measure coverage or do so only for a subset of staff. The use of theory in planning and evaluation may prevent these problems and lead to more effective programs. We discuss the use of theory in the planning and evaluation of health programs and demonstrate how it can be used for the evaluation and planning of a hospital or nursing home influenza control program. The application of theory required explicit statement of the goals of the program and examination of the assumptions underlying potential program activities. This indicated that staff should probably be considered as employees, volunteers, physicians, and contractors of the facility. It also directed attention to evidence-based strategies for increasing vaccination rates. The application of a program planning model to a problem of institutional influenza prevention may prevent planners from excluding important target populations and failing to monitor the important indicators of program success.
NASA Astrophysics Data System (ADS)
Custodio, S.; Matos, C.; Grigoli, F.; Cesca, S.; Heimann, S.; Rio, I.
2015-12-01
Seismic data processing is currently undergoing a step change, benefitting from high-volume datasets and advanced computer power. In the last decade, a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, covered mainland Portugal. This outstanding regional coverage currently enables the computation of a high-resolution image of the seismicity of Portugal, which contributes to fitting together the pieces of the regional seismo-tectonic puzzle. Although traditional manual inspections are valuable for refining automatic results, they are impracticable with the large data volumes now available; conducted alone, they are also less objective, since the criteria are defined by the analyst. In this work we present CatchPy, a scanning algorithm to detect earthquakes in continuous datasets. Our main goal is to implement an automatic earthquake detection and location routine that can quickly process large data sets while also detecting low-magnitude earthquakes (i.e., lowering the detection threshold). CatchPy is designed to produce an event database that can easily be located using existing location codes (e.g., Grigoli et al. 2013, 2014). We use CatchPy to perform automatic detection and location of earthquakes that occurred in the Alentejo region (southern Portugal), taking advantage of a dense seismic network deployed in the region for two years during the DOCTAR experiment. Results show that our automatic procedure is particularly suitable for small-aperture networks. Event detection is performed by continuously computing the short-term-average/long-term-average of two different characteristic functions (CFs): for P phases we use a CF based on the energy of the vertical trace, while for S phases we use a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991).
Seismic event location is performed by waveform coherence analysis, scanning different hypocentral coordinates (Grigoli et al. 2013, 2014). The reliability of the automatic detections, phase pickings and locations is tested through quantitative comparison with manual results. This work is supported by project QuakeLoc, reference PTDC/GEO-FIQ/3522/2012.
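The short-term-average/long-term-average (STA/LTA) trigger on a characteristic function, as used for detection above, can be sketched in a few lines. This is the generic textbook scheme, not CatchPy's code, and the window lengths and threshold below are illustrative:

```python
import numpy as np

def sta_lta(cf, nsta, nlta):
    """STA/LTA ratio of a characteristic function `cf`
    (e.g. the energy of the vertical-component trace)."""
    csum = np.cumsum(np.concatenate(([0.0], cf)))
    sta = np.zeros_like(cf, dtype=float)
    lta = np.zeros_like(cf, dtype=float)
    for i in range(nlta, len(cf)):
        sta[i] = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
        lta[i] = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(lta > 0, sta / lta, 0.0)
    return ratio

def detect(cf, nsta=5, nlta=50, threshold=4.0):
    """Return sample indices where the STA/LTA ratio first crosses
    the trigger threshold (rising edges only)."""
    r = sta_lta(cf, nsta, nlta)
    above = r >= threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets.tolist()
```

A short energy burst riding on stationary noise raises the STA much faster than the LTA, so the ratio spikes at the onset and the rising-edge test reports the first sample of the trigger.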
NASA Astrophysics Data System (ADS)
Chiu, L.; Vongsaard, J.; El-Ghazawi, T.; Weinman, J.; Yang, R.; Kafatos, M.
Due to the poor temporal sampling by satellites, data gaps exist in satellite-derived time series of precipitation. This poses a challenge for assimilating rainfall data into forecast models. To yield a continuous time series, the classic image processing technique of digital image morphing has been used. However, digital morphing has traditionally been applied manually, which is time consuming. In order to avoid human intervention in the process, an automatic procedure for image morphing is needed for real-time operations. For this purpose, the Genetic Algorithm Based Image Registration Automatic Morphing (GRAM) model was developed and tested in this paper. Specifically, the automatic morphing technique was integrated with a genetic algorithm and the Feature Based Image Metamorphosis technique to fill in data gaps between satellite coverage. The technique was tested using NOWRAD data, which are generated from the network of NEXRAD radars. Time series of NOWRAD data from storm Floyd, which occurred over the US eastern region on September 16, 1999, at 00:00, 01:00, 02:00, 03:00, and 04:00 am were used. The GRAM technique was applied to data collected at 00:00 and 04:00 am; these images were also manually morphed. Images at 01:00, 02:00 and 03:00 am were interpolated using both GRAM and manual morphing and compared with the original NOWRAD rain rates. The results show that the GRAM technique outperforms manual morphing: the correlation coefficients between the manually morphed images and the originals are 0.905, 0.900, and 0.905 at 01:00, 02:00, and 03:00 am, while the corresponding correlation coefficients for the GRAM technique are 0.946, 0.911, and 0.913, respectively. Index terms: Remote Sensing, Image Registration, Hydrology, Genetic Algorithm, Morphing, NEXRAD
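The evaluation above scores interpolated frames against the reference rain-rate images with a correlation coefficient. A minimal sketch of that scoring, together with the trivial cross-dissolve interpolation that morphing improves upon (both functions are illustrative; GRAM's actual feature-based warping is not reproduced here):

```python
import numpy as np

def cross_dissolve(img_a, img_b, t):
    """Linear interpolation between two aligned frames, the naive
    baseline that morphing improves on: t = 0 gives img_a, t = 1 img_b."""
    return (1.0 - t) * img_a + t * img_b

def pearson(a, b):
    """Correlation coefficient between two images, used to score an
    interpolated frame against the reference rain-rate image."""
    return float(np.corrcoef(np.ravel(a), np.ravel(b))[0, 1])
```

With this metric, an intermediate frame produced by GRAM at 01:00 am would be scored by `pearson(frame, reference)`, mirroring the 0.9-plus coefficients reported above.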
Hull, Brynley P; Dey, Aditi; Menzies, Rob I; Brotherton, Julia M; McIntyre, Peter B
2014-09-30
This, the 6th annual immunisation coverage report, documents trends during 2012 for a range of standard measures derived from Australian Childhood Immunisation Register (ACIR) data, and National Human Papillomavirus (HPV) Vaccination Program Register data. These include coverage at standard age milestones and for individual vaccines included on the National Immunisation Program (NIP) and coverage in adolescents and adults. The proportion of Australian children 'fully vaccinated' at 12, 24 and 60 months of age was 91.7%, 92.5% and 91.2%, respectively. For vaccines available on the NIP but not assessed during 2012 for 'fully vaccinated' status or for eligibility for incentive payments (rotavirus and pneumococcal at 12 months and meningococcal C and varicella at 24 months) coverage varied. Although pneumococcal vaccine had similar coverage at 12 months to other vaccines, coverage was lower for rotavirus at 12 months (83.6%) and varicella at 24 months (84.4%). Although 'fully vaccinated' coverage at 12 months of age was lower among Indigenous children than non-Indigenous children in all jurisdictions, the extent of the difference varied, reaching a 15 percentage point differential in South Australia but only a 0.4 percentage point differential in the Northern Territory. Overall, Indigenous coverage at 24 months of age exceeded that at 12 months of age nationally and for all jurisdictions, but as receipt of varicella vaccine at 18 months is excluded from calculations, this represents delayed immunisation, with some contribution from immunisation incentives. The 'fully vaccinated' coverage estimates for vaccinations due by 60 months of age for Indigenous children exceeded 90% at 91% in 2012. Unlike in 2011, at 60 months of age, there was no dramatic variation in coverage between Indigenous and non-Indigenous children for individual jurisdictions. 
As previously documented, the vaccines recommended for Indigenous children only, hepatitis A and pneumococcal vaccine, had suboptimal coverage at 60.1% and 73.1%, respectively, although this was a considerable improvement on the 2011 figures of 57.7% and 68.2%, respectively. On-time receipt (before 49 months of age) of vaccines by Indigenous children at the 60-month milestone age improved substantially between 2011 (19%) and 2012 (38%), but the disparity in on-time vaccination between Indigenous and non-Indigenous children worsened at the 60-month age milestone from 2011 (from 1.8 to 5.4 percentage points) and remained the same for the 12 and 24-month age milestones. By late 2012, the percentage of children who received the 1st dose of DTPa vaccine at less than 8 weeks of age was greater than 50% in all but 1 jurisdiction, and greater than 70% in New South Wales, the Australian Capital Territory and Tasmania. Further, by late 2012, the percentage of children who received the 4th dose of DTPa vaccine at less than 4 years of age was greater than 30% in 3 jurisdictions. The percentage of children whose parents officially objected to vaccination in Australia was 1.7%, and this figure varied by jurisdiction. However, a further 2.1% of children have parents who do not officially object but have no vaccines recorded on the ACIR. Coverage for the 3rd dose of HPV vaccine from the national HPV register in the school catch-up program was similar to 2011 at 71%, but was substantially lower for the catch-up program for females outside school (44%-69%), although this was an improvement from 2011.
An Efficient Functional Test Generation Method For Processors Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Hudec, Ján; Gramatová, Elena
2015-07-01
The paper presents a new functional test generation method for processor testing based on genetic algorithms and evolutionary strategies. The tests are generated over an instruction set architecture and a processor description; such functional tests belong to software-oriented testing. The quality of the tests is evaluated by the code coverage of the processor description obtained through simulation. The presented test generation method uses VHDL models of processors and the professional simulator ModelSim. Rules, parameters and fitness functions were defined for the various genetic algorithms used in automatic test generation. Functionality and effectiveness were evaluated using the RISC-type processor DP32.
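The evolutionary loop described above can be sketched as a hedged toy (Python). Here the "processor" is a two-branch stand-in and the fitness simply counts distinct branch outcomes a test exercises, whereas the actual method scores code coverage of a VHDL model via ModelSim:

```python
import random

def coverage_fitness(test_vector):
    """Toy fitness: number of distinct branch outcomes a test exercises.

    Stands in for simulating a VHDL processor model and reading back
    code-coverage figures from the simulator.
    """
    outcomes = set()
    for a, b in test_vector:
        outcomes.add(("gt", a > b))   # 'greater-than' branch of the toy unit
        outcomes.add(("eq", a == b))  # 'equal' branch of the toy unit
    return len(outcomes)

def evolve(pop_size=20, generations=30, seed=0):
    """Minimal (mu + mu) evolutionary loop with point mutation."""
    rng = random.Random(seed)
    rand_pair = lambda: (rng.randint(0, 3), rng.randint(0, 3))
    pop = [[rand_pair() for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=coverage_fitness, reverse=True)
        parents = pop[: pop_size // 2]       # truncation selection
        children = []
        for p in parents:
            child = list(p)
            child[rng.randrange(len(child))] = rand_pair()  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=coverage_fitness)

best = evolve()
```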
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Jiangye
Up-to-date maps of installed solar photovoltaic panels are a critical input for policy and financial assessment of solar distributed generation. However, such maps for large areas are not available. With high coverage and low cost, aerial images enable large-scale mapping, but it is highly difficult to automatically identify solar panels in images, as they are small objects with varying appearances dispersed in complex scenes. We introduce a new approach based on deep convolutional networks, which effectively learns to delineate solar panels in aerial scenes. The approach has successfully mapped solar panels in imagery covering 200 square kilometers in two cities, using only 12 square kilometers of manually labeled training data.
Automatic Temporal Tracking of Supra-Glacial Lakes
NASA Astrophysics Data System (ADS)
Liang, Y.; Lv, Q.; Gallaher, D. W.; Fanning, D.
2010-12-01
In recent years, supra-glacial lakes in Greenland have attracted extensive global attention as they potentially play an important role in glacier movement, sea level rise, and climate change. Previous works focused on classification methods and individual cloud-free satellite images, which have limited capabilities in terms of tracking changes of lakes over time. The challenges of tracking supra-glacial lakes automatically include (1) the massive amount of satellite images with diverse qualities and frequent cloud coverage, and (2) the diversity and dynamics of the large number of supra-glacial lakes on the Greenland ice sheet. In this study, we develop an innovative method to automatically track supra-glacial lakes temporally using Moderate Resolution Imaging Spectroradiometer (MODIS) time-series data. The method works for both cloudy and cloud-free data and is unsupervised, i.e., no manual identification is required. After selecting the highest-quality image within each time interval, our method automatically detects supra-glacial lakes in individual images, using adaptive thresholding to handle diverse image qualities. We then track lakes across the time series of images as lakes appear, change in size, and disappear. Using multi-year MODIS data during the melting season, we demonstrate that this new method can detect and track supra-glacial lakes in both space and time with 95% accuracy. The attached figure shows an example of the current results: (a) one of our experimental datasets, a region centered on the Jakobshavn Isbrae glacier in west Greenland; (b) an enlarged view of part of the ice sheet, partially cloudy, with supra-glacial lakes visible as dark spots; and (c) the current result, with detected lakes shown as red spots. Detailed analysis of the temporal variation of detected lakes will be presented.
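The adaptive-thresholding idea can be illustrated with a minimal global variant (Python; the threshold rule mean - k*std and the synthetic scene are our assumptions, not the paper's exact formulation):

```python
import numpy as np

def detect_lakes(image, k=1.5):
    """Flag pixels darker than (mean - k*std) as candidate lake pixels.

    A global simplification of per-image adaptive thresholding: the cutoff
    adapts to each scene's own brightness statistics.
    """
    threshold = image.mean() - k * image.std()
    return image < threshold

# Bright synthetic ice sheet with two dark "lakes"
scene = np.full((50, 50), 0.9)
scene[10:14, 10:14] = 0.20
scene[30:33, 40:43] = 0.25
mask = detect_lakes(scene)
```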
Synonym set extraction from the biomedical literature by lexical pattern discovery.
McCrae, John; Collier, Nigel
2008-03-24
Although there are a large number of thesauri for the biomedical domain, many of them lack coverage of terms and their variant forms. Automatic thesaurus construction based on patterns was first suggested by Hearst [1], but it is still not clear how to automatically construct such patterns for different semantic relations and domains. In particular, it is not certain which patterns are useful for capturing synonymy. The assumption that resources such as parsers are available is also a limiting factor for many languages, so it is desirable to find patterns that do not require syntactic analysis. Finally, to give a more consistent and applicable result, it is desirable to use these patterns to form synonym sets in a sound way. We present a method that automatically generates regular expression patterns by expanding seed patterns in a heuristic search and then develops a feature vector based on the occurrence of term pairs in each developed pattern. This allows a binary classification of term pairs as synonymous or non-synonymous. We then model this result as a probability graph to find synonym sets, which is equivalent to the well-studied problem of finding an optimal set cover. We achieved 73.2% precision and 29.7% recall with our method, outperforming hand-made resources such as MeSH and Wikipedia. We conclude that automatic methods can play a practical role in developing new thesauri or expanding on existing ones, and that this can be done with only a small amount of training data and no need for resources such as parsers. We also conclude that accuracy can be improved by grouping into synonym sets.
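A minimal sketch of the pattern-based feature vectors and synonym-set grouping might look as follows (Python; the two seed patterns and the corpus are purely illustrative, and the union-find grouping is a simplification of the probability-graph/set-cover step):

```python
import re
from itertools import combinations

# Illustrative seed patterns with {A}/{B} term slots (not the learned ones)
PATTERNS = [r"{A}, also known as {B}", r"{A} \(synonym: {B}\)"]

def pair_features(a, b, corpus):
    """Feature vector: per-pattern counts of the pair occurring (either order)."""
    feats = []
    for pat in PATTERNS:
        n = 0
        for x, y in ((a, b), (b, a)):
            regex = pat.format(A=re.escape(x), B=re.escape(y))
            n += sum(len(re.findall(regex, doc)) for doc in corpus)
        feats.append(n)
    return feats

def synonym_sets(terms, corpus):
    """Union-find grouping of term pairs whose features fire at least once."""
    parent = {t: t for t in terms}
    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]  # path compression
            t = parent[t]
        return t
    for a, b in combinations(terms, 2):
        if sum(pair_features(a, b, corpus)) > 0:
            parent[find(a)] = find(b)
    groups = {}
    for t in terms:
        groups.setdefault(find(t), set()).add(t)
    return list(groups.values())

corpus = ["A heart attack, also known as myocardial infarction, occurs when ..."]
groups = synonym_sets(["heart attack", "myocardial infarction", "aspirin"], corpus)
```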
NASA Astrophysics Data System (ADS)
Zhou, Kaixing; Sun, Xiucong; Huang, Hai; Wang, Xinsheng; Ren, Guangwei
2017-10-01
The space-based Automatic Dependent Surveillance - Broadcast (ADS-B) is a new technology for air traffic management. A satellite equipped with a spaceborne ADS-B system receives broadcast signals from aircraft and transfers the messages to ground stations, so as to extend the coverage area of terrestrial ADS-B. In this work, a novel satellite single-axis attitude determination solution based on the ADS-B receiving system is proposed. This solution utilizes signal-to-noise ratio (SNR) measurements of the broadcast signals from aircraft to determine the boresight orientation of the ADS-B receiving antenna fixed on the satellite. The basic principle of this solution is described. A feasibility study of this new attitude determination solution is carried out, including the link budget and the access analysis. On this basis, nonlinear least squares estimation based on the Levenberg-Marquardt method is applied to estimate the single-axis orientation. A full digital simulation has been carried out to verify the effectiveness and performance of this solution. Finally, the corresponding results are processed and presented in detail.
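A sketch of the SNR-based single-axis fit, assuming a simple cosine antenna-gain model and a hand-rolled Levenberg-Marquardt step (both our assumptions; the paper's link budget and measurement model are more detailed):

```python
import numpy as np

def snr_model(x, directions, g0=10.0):
    """Assumed antenna model: SNR falls with the cosine of the off-boresight angle."""
    b = x / np.linalg.norm(x)
    return g0 * directions @ b

def fit_boresight(directions, snr, x0, lam=1e-3, iters=50):
    """Tiny Levenberg-Marquardt fit of the boresight 3-vector."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = snr_model(x, directions) - snr
        J = np.empty((len(r), 3))  # numerical Jacobian of the residuals
        for j in range(3):
            dx = np.zeros(3)
            dx[j] = 1e-6
            J[:, j] = (snr_model(x + dx, directions) - snr_model(x, directions)) / 1e-6
        # Damped normal equations: (J^T J + lam*I) step = -J^T r
        x = x + np.linalg.solve(J.T @ J + lam * np.eye(3), -J.T @ r)
    return x / np.linalg.norm(x)

rng = np.random.default_rng(1)
true_boresight = np.array([0.0, 0.0, 1.0])
aircraft_dirs = rng.normal(size=(40, 3))
aircraft_dirs /= np.linalg.norm(aircraft_dirs, axis=1, keepdims=True)
measured_snr = snr_model(true_boresight, aircraft_dirs)  # noise-free for brevity
estimate = fit_boresight(aircraft_dirs, measured_snr, x0=[0.2, 0.1, 0.9])
```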
Methods for Processing and Interpretation of AIS Signals Corrupted by Noise and Packet Collisions
NASA Astrophysics Data System (ADS)
Poļevskis, J.; Krastiņš, M.; Korāts, G.; Skorodumovs, A.; Trokšs, J.
2012-01-01
The authors deal with the operation of Automatic Identification System (AIS) used in the marine traffic monitoring to broadcast messages containing information about the vessel: id, payload, size, speed, destination etc., meant primarily for avoidance of ship collisions. To extend the radius of AIS operation, it is envisaged to dispose its receivers on satellites. However, in space, due to a large coverage area, interfering factors are especially pronounced - such as packet collision, Doppler's shift and noise impact on AIS message receiving, pre-processing and decoding. To assess the quality of an AIS receiver's operation, a test was carried out in which, varying automatically frequency, amplitude, noise, and other parameters, the data on the ability of the receiver's ability to decode AIS signals are collected. In the work, both hardware- and software-based AIS decoders were tested. As a result, quite satisfactory statistics has been gathered - both on the common and the differing features of such decoders when operating in space. To obtain reliable data on the software-defined radio AIS receivers, further research is envisaged.
New automatic mode of visualizing the colon via Cine CT
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Odhner, Dewey; Eisenberg, Harvey C.
2001-05-01
Methods of visualizing the inner colonic wall by using CT images has actively been pursued in recent years in an attempt to eventually replace conventional colonoscopic examination. In spite of impressive progress in this direction, there are still several problems, which need satisfactory solutions. Among these, we address three problems in this paper: segmentation, coverage, and speed of rendering. Instead of thresholding, we utilize the fuzzy connectedness framework to segment the colonic wall. Instead of the endoscopic viewing mode and various mapping techniques, we utilize the central line through the colon to generate automatically viewing directions that are enface with respect to the colon wall, thereby avoiding blind spots in viewing. We utilize some modifications of the ultra fast shell rendering framework to ensure fast rendering speed. The combined effect of these developments is that a colon study requires an initial 5 minutes of operator time plus an additional 5 minutes of computational time and subsequently enface renditions are created in real time (15 frames/sec) on a 1 GHz Pentium PC under the Linux operating system.
NASA Astrophysics Data System (ADS)
Kesseli, Aurora Y.; West, Andrew A.; Veyette, Mark; Harrison, Brandon; Feldman, Dan; Bochanski, John J.
2017-06-01
We present a library of empirical stellar spectra created using spectra from the Sloan Digital Sky Survey’s Baryon Oscillation Spectroscopic Survey. The templates cover spectral types O5 through L3, are binned by metallicity from -2.0 dex through +1.0 dex, and are separated into main-sequence (dwarf) stars and giant stars. With recently developed M dwarf metallicity indicators, we are able to extend the metallicity bins down through the spectral subtype M8, making this the first empirical library with this degree of temperature and metallicity coverage. The wavelength coverage for the templates is from 3650 to 10200 Å at a resolution of better than R ˜ 2000. Using the templates, we identify trends in color space with metallicity and surface gravity, which will be useful for analyzing large data sets from upcoming missions like the Large Synoptic Survey Telescope. Along with the templates, we are releasing a code for automatically (and/or visually) identifying the spectral type and metallicity of a star.
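Template matching of the kind the released code performs can be sketched as a chi-square fit over the library (Python; the Gaussian "templates" and the single free flux scaling are illustrative simplifications, not the actual empirical spectra or fitting code):

```python
import numpy as np

def best_template(flux, templates):
    """Pick the template minimizing chi-square after an optimal flux scaling."""
    best_name, best_chi2 = None, np.inf
    for name, tflux in templates.items():
        scale = np.dot(flux, tflux) / np.dot(tflux, tflux)  # least-squares scale
        chi2 = np.sum((flux - scale * tflux) ** 2)
        if chi2 < best_chi2:
            best_name, best_chi2 = name, chi2
    return best_name

wave = np.linspace(3650, 10200, 500)  # template wavelength coverage in Angstroms
templates = {  # crude Gaussian stand-ins for real empirical templates
    "M0": np.exp(-(((wave - 8000) / 2000.0) ** 2)),
    "G2": np.exp(-(((wave - 5500) / 2000.0) ** 2)),
}
observed = 3.0 * templates["G2"]  # a scaled copy of the G2 template
```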
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesseli, Aurora Y.; West, Andrew A.; Veyette, Mark
We present a library of empirical stellar spectra created using spectra from the Sloan Digital Sky Survey’s Baryon Oscillation Spectroscopic Survey. The templates cover spectral types O5 through L3, are binned by metallicity from −2.0 dex through +1.0 dex, and are separated into main-sequence (dwarf) stars and giant stars. With recently developed M dwarf metallicity indicators, we are able to extend the metallicity bins down through the spectral subtype M8, making this the first empirical library with this degree of temperature and metallicity coverage. The wavelength coverage for the templates is from 3650 to 10200 Å at a resolution of better than R ∼ 2000. Using the templates, we identify trends in color space with metallicity and surface gravity, which will be useful for analyzing large data sets from upcoming missions like the Large Synoptic Survey Telescope. Along with the templates, we are releasing a code for automatically (and/or visually) identifying the spectral type and metallicity of a star.
VizieR Online Data Catalog: 05 through L3 empirical stellar spectra from SDSS (Kesseli+, 2017)
NASA Astrophysics Data System (ADS)
Kesseli, A. Y.; West, A. A.; Veyette, M.; Harrison, B.; Feldman, D.; Bochanski, J. J.
2017-08-01
We present a library of empirical stellar spectra created using spectra from the Sloan Digital Sky Survey's Baryon Oscillation Spectroscopic Survey. The templates cover spectral types O5 through L3, are binned by metallicity from -2.0dex through +1.0dex, and are separated into main-sequence (dwarf) stars and giant stars. With recently developed M dwarf metallicity indicators, we are able to extend the metallicity bins down through the spectral subtype M8, making this the first empirical library with this degree of temperature and metallicity coverage. The wavelength coverage for the templates is from 3650 to 10200Å at a resolution of better than R~2000. Using the templates, we identify trends in color space with metallicity and surface gravity, which will be useful for analyzing large data sets from upcoming missions like the Large Synoptic Survey Telescope. Along with the templates, we are releasing a code for automatically (and/or visually) identifying the spectral type and metallicity of a star. (3 data files).
Cognition and take-up of subsidized drug benefits by Medicare beneficiaries.
Kuye, Ifedayo O; Frank, Richard G; McWilliams, J Michael
2013-06-24
Take-up of the Medicare Part D low-income subsidy (LIS) by eligible beneficiaries has been low despite the attractive drug coverage it offers at no cost to beneficiaries and outreach efforts by the Social Security Administration. To examine the role of beneficiaries' cognitive abilities in explaining this puzzle. Analysis of survey data from the nationally representative Health and Retirement Study. Elderly Medicare beneficiaries who were likely eligible for the LIS, excluding Medicaid and Supplemental Security Income recipients who automatically receive the subsidy without applying. Using survey assessments of overall cognition and numeracy from 2006 to 2010, we examined how cognitive abilities were associated with self-reported Part D enrollment, awareness of the LIS, and application for the LIS. We also compared out-of-pocket drug spending and premium costs between LIS-eligible beneficiaries who did and did not report receipt of the LIS. Analyses were adjusted for sociodemographic characteristics, household income and assets, health status, and presence of chronic conditions. Compared with LIS-eligible beneficiaries in the top quartile of overall cognition, those in the bottom quartile were significantly less likely to report Part D enrollment (adjusted rate, 63.5% vs 52.0%; P = .002), LIS awareness (58.3% vs 33.3%; P = .001), and LIS application (25.5% vs 12.7%; P < .001). Lower numeracy was also associated with lower rates of Part D enrollment (P = .03) and LIS application (P = .002). Reported receipt of the LIS was associated with significantly lower annual out-of-pocket drug spending (adjusted mean difference, -$256; P = .02) and premium costs (-$273; P = .02). Among Medicare beneficiaries likely eligible for the Part D LIS, poorer cognition and numeracy were associated with lower reported take-up. 
Current educational and outreach efforts encouraging LIS applications may not be sufficient for beneficiaries with limited abilities to process and respond to information. Additional policies may be needed to extend the financial protection conferred by the LIS to all eligible seniors.
Bimodal albedo distributions in the ablation zone of the southwestern Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
Moustafa, S. E.; Rennermalm, A. K.; Smith, L. C.; Miller, M. A.; Mioduszewski, J. R.
2014-09-01
Surface albedo is a key variable controlling solar radiation absorbed at the Greenland Ice Sheet (GrIS) surface, and thus, meltwater production. Recent decline in surface albedo over the GrIS has been linked to enhanced snow grain metamorphic rates and amplified ice-albedo feedback from atmospheric warming. However, the importance of distinct surface types on ablation zone albedo and meltwater production is still relatively unknown, and excluded in surface mass balance models. In this study, we analyze albedo and ablation rates using in situ and remotely-sensed data. Observations include: (1) a new high-quality in situ spectral albedo dataset collected with an Analytical Spectral Devices (ASD) spectroradiometer measuring at 325-1075 nm, along a 1.25 km transect during three days in June 2013; (2) broadband albedo at two automatic weather stations; and (3) daily MODerate Resolution Imaging Spectroradiometer (MODIS) albedo (MOD10A1) between 31 May and 30 August. We find that seasonal ablation zone albedos have a bimodal distribution, with two alternate states. This suggests that an abrupt switch from high to low albedo can be triggered by a modest melt event, resulting in amplified surface ablation rates. Our results show that such a shift corresponds to an observed melt rate percent difference increase of 51.6% during peak melt season (between 10-14 and 20-24 July 2013). Furthermore, our findings demonstrate that seasonal changes in GrIS ablation zone albedo are not exclusively a function of a darkening surface from ice crystal growth, but rather are controlled by changes in the fractional coverage of snow, bare ice, and impurity-rich surface types. As the climate continues to warm, regional climate models should consider the seasonal evolution of ice surface types in Greenland's ablation zone to improve projections of mass loss contributions to sea level rise.
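The two alternate albedo states can be exposed with a simple one-dimensional two-means split (Python; the synthetic bare-ice and snow albedo values are illustrative, not the MODIS or ASD data):

```python
import numpy as np

def two_state_split(albedo, iters=20):
    """1-D two-means split of albedo samples into low and high states."""
    lo, hi = albedo.min(), albedo.max()
    for _ in range(iters):
        high = np.abs(albedo - hi) < np.abs(albedo - lo)  # assign to nearer mean
        lo, hi = albedo[~high].mean(), albedo[high].mean()
    high = np.abs(albedo - hi) < np.abs(albedo - lo)
    return lo, hi, high

# Synthetic bimodal sample: bare-ice (~0.35) and snow (~0.75) states
rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(0.35, 0.03, 200), rng.normal(0.75, 0.03, 200)])
low_mean, high_mean, high_state = two_state_split(sample)
```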
Bimodal Albedo Distributions in the Ablation Zone of the Southwestern Greenland Ice Sheet
NASA Astrophysics Data System (ADS)
Moustafa, S.; Rennermalm, A. K.; Smith, L. C.; Miller, M. A.; Mioduszewski, J.; Koenig, L.
2014-12-01
Surface albedo is a key variable controlling solar radiation absorbed at the Greenland Ice Sheet (GrIS) surface, and thus meltwater production. Recent decline in surface albedo over the GrIS has been linked to enhanced snow grain metamorphic rates and amplified ice-albedo feedback from atmospheric warming. However, the importance of distinct surface types on ablation zone albedo and meltwater production is still relatively unknown, and excluded in surface mass balance models. In this study, we analyze albedo and ablation rates (m d-1) using in situ and remotely-sensed data. Observations include: 1) a new high-quality in situ spectral albedo dataset collected with an Analytical Spectral Devices (ASD) spectroradiometer measuring at 325-1075 nm, along a 1.25 km transect during three days in June 2013; 2) broadband albedo at two automatic weather stations; and 3) daily MODerate Resolution Imaging Spectroradiometer (MODIS) albedo (MOD10A1) between 31 May and 30 August. We find that seasonal ablation zone albedos have a bimodal distribution, with two alternate states. This suggests that an abrupt switch from high to low albedo can be triggered by a modest melt event, resulting in amplified ablation rates. Our results show that such a shift corresponds to an observed melt rate percent difference increase of 51.6% during peak melt season (between 10-14 July and 20-24 July, 2013). Furthermore, our findings demonstrate that seasonal changes in GrIS ablation zone albedo are not exclusively a function of a darkening surface from ice crystal growth, but rather are controlled by changes in the fractional coverage of snow, bare ice, and impurity-rich surface types. As the climate continues to warm, regional climate models should consider the seasonal evolution of ice surface types in Greenland's ablation zone to improve projections of mass loss contributions to sea level rise.
NASA Astrophysics Data System (ADS)
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework built on a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined by several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
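The Fourier-descriptor shape features mentioned above can be sketched as follows (Python; computing translation- and scale-normalized descriptors of a closed contour, here a unit circle for illustration rather than a parenchyma outline):

```python
import numpy as np

def fourier_descriptors(contour, n=8):
    """First n Fourier descriptors of a closed 2-D contour.

    Points become complex numbers; dropping the DC term removes translation
    and dividing by |c1| removes scale - a common normalization when the
    descriptors feed a shape classifier such as an SVM.
    """
    z = contour[:, 0] + 1j * contour[:, 1]
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1 : n + 1])
    return mags / mags[0]

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
fd = fourier_descriptors(circle)
```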
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-21
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework built on a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined by several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
Changes in newspaper coverage of mental illness from 2008 to 2014 in England.
Rhydderch, D; Krooupa, A-M; Shefer, G; Goulden, R; Williams, P; Thornicroft, A; Rose, D; Thornicroft, G; Henderson, C
2016-08-01
This study evaluates English newspaper coverage of mental health topics between 2008 and 2014 to provide context for the concomitant improvement in public attitudes and seek evidence for changes in coverage. Articles in 27 newspapers were retrieved using keyword searches on two randomly chosen days each month in 2008-2014, excluding 2012 due to restricted resources. Content analysis used a structured coding framework. Univariate logistic regression models were used to estimate the odds of each hypothesised element occurring each year compared to 2008. There was a substantial increase in the number of articles covering mental health between 2008 and 2014. We found an increase in the proportion of antistigmatising articles which approached significance at P < 0.05 (OR = 1.21, P = 0.056). The decrease in stigmatising articles was not statistically significant (OR = 0.90, P = 0.312). There was a significant decrease in the proportion of articles featuring the stigmatising elements 'danger to others' and 'personal responsibility', and an increase in 'hopeless victim'. There was a significant proportionate increase in articles featuring the antistigmatising elements 'injustice' and 'stigma', but a decrease in 'sympathetic portrayal of people with mental illness'. We found a decrease in articles promoting ideas about dangerousness or mental illness being self-inflicted, but an increase in articles portraying people as incapable. Yet, these findings were not consistent over time. © 2016 The Authors. Acta Psychiatrica Scandinavica Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Matos, Catarina; Grigoli, Francesco; Cesca, Simone; Custódio, Susana
2015-04-01
In the last decade, a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, has covered Portugal. This extraordinary network coverage now enables the computation of a high-resolution image of the seismicity of Portugal, which in turn will shed light on its seismotectonics. The large data volumes available cannot be analyzed by traditional time-consuming manual location procedures. In this presentation we show first results on the automatic detection and location of earthquakes that occurred in a selected region in the south of Portugal. Our main goal is to implement an automatic earthquake detection and location routine in order to have a tool to quickly process large data sets, while at the same time detecting low-magnitude earthquakes (i.e., lowering the detection threshold). We present a modified version of the automatic seismic event location by waveform coherency analysis developed by Grigoli et al. (2013, 2014), designed to perform earthquake detections and locations on continuous data. The event detection is performed by continuously computing the short-term-average/long-term-average of two different characteristic functions (CFs). For the P phases we used a CF based on the vertical energy trace, while for S phases we used a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991). Seismic event detection and location is obtained by performing waveform coherence analysis scanning different hypocentral coordinates. We apply this technique to earthquakes in the Alentejo region (South Portugal), taking advantage of a small-aperture seismic network installed in the south of Portugal for two years (2010-2011) during the DOCTAR experiment. In addition to the good network coverage, the Alentejo region was chosen for its simple tectonic setting and also because the relationship between seismicity, tectonics and local lithospheric structure is intriguing and still poorly understood.
Inside the target area the seismicity clusters mainly within two clouds, oriented SE-NW and SW-NE. Should these clusters be seen as the expression of local active faults? Are they associated to lithological transitions? Or do the locations obtained from the previously sparse permanent network have large errors and generate fake clusters? We present preliminary results from this study, and compare them with manual locations. This work is supported by project QuakeLoc, reference: PTDC/GEO-FIQ/3522/2012
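The STA/LTA trigger on a characteristic function can be sketched in a few lines (Python; the synthetic energy CF and window lengths are illustrative, not the DOCTAR processing parameters):

```python
import numpy as np

def sta_lta(cf, nsta, nlta):
    """Windowed STA/LTA ratio of a characteristic function (CF).

    ratio[m] compares short- and long-term averages of the CF in trailing
    windows that both end at sample m + nlta - 1.
    """
    csum = np.cumsum(np.concatenate([[0.0], cf]))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta  # trailing short-window mean
    lta = (csum[nlta:] - csum[:-nlta]) / nlta  # trailing long-window mean
    n = len(lta)
    return sta[-n:] / np.maximum(lta, 1e-12)

# Synthetic energy CF: a noise floor with one transient burst
rng = np.random.default_rng(42)
cf = np.abs(rng.normal(0.0, 1.0, 2000))
cf[1200:1260] += 10.0
ratio = sta_lta(cf, nsta=20, nlta=200)
peak_sample = int(np.argmax(ratio)) + 200 - 1  # map ratio index back to sample
```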
Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.
2016-01-01
Purpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-beam IMRT plans (CLINICAL), and to dual-arc VMAT plans generated manually by an expert planner (manVMAT). Furthermore, all plans were benchmarked against 20-beam equi-angular IMRT plans (autoIMRT). For all plans, a PTV coverage of 99.5% by at least 95% of the prescribed dose (46 Gy) had the highest planning priority, followed by minimization of V45Gy for small bowel (SB). Other OARs considered were bladder, rectum, and sigmoid. Results: All plans had highly similar PTV coverage, within the clinical constraints (above). After plan normalizations for exactly equal median PTV doses in corresponding plans, all evaluated OAR parameters in autoVMAT plans were on average lower than in the CLINICAL plans, with an average reduction in SB V45Gy of 34.6% (p<0.001). For 41/44 autoVMAT plans, SB V45Gy was lower than for manVMAT (p<0.001, average reduction 30.3%), while SB V15Gy increased by 2.3% (p = 0.011). AutoIMRT reduced SB V45Gy by another 2.7% compared to autoVMAT, while also resulting in a 9.0% reduction in SB V15Gy (p<0.001), but with a prolonged delivery time. Differences between manVMAT and autoVMAT in bladder, rectal and sigmoid doses were ≤ 1%. Improvements in SB dose delivery with autoVMAT instead of manVMAT were higher for empty-bladder PTVs compared to full-bladder PTVs, due to differences in the concavity of the PTVs. Conclusions: The quality of automatically generated VMAT plans was superior to that of manually generated plans.
Automatic VMAT plan generation for cervical cancer has been implemented in our clinical routine. Due to the achieved workload reduction, extension of plan libraries has become feasible. PMID:28033342
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheng, Y; Li, T; Yoo, S
2016-06-15
Purpose: To enable near-real-time (<20 sec) and interactive planning, without compromising quality, for whole breast RT treatment planning using tangential fields. Methods: Whole breast RT plans from 20 patients treated with single energy (SE, 6MV, 10 patients) or mixed energy (ME, 6/15MV, 10 patients) were randomly selected for model training. An additional 20 cases were used as the validation cohort. The planning process for a new case consists of three fully automated steps: 1. Energy Selection. A classification model automatically selects the energy level. To build the energy selection model, principal component analysis (PCA) was applied to the digitally reconstructed radiographs (DRRs) of the training cases to extract the anatomy-energy relationship. 2. Fluence Estimation. Once the energy is selected, a random forest (RF) model generates the initial fluence. This model summarizes the relationship between shape-based features of the patient anatomy and the output fluence. 3. Fluence Fine-tuning. This step balances the overall dose contribution throughout the whole breast tissue by automatically selecting reference points and applying centrality correction. Fine-tuning works at the beamlet level until the dose distribution meets clinical objectives. Prior to finalization, physicians can also make patient-specific trade-offs between target coverage and high-dose volumes. The proposed method was validated by comparing auto-plans with manually generated clinical plans using the Wilcoxon signed-rank test. Results: In 19/20 cases the model suggested the same energy combination as the clinical plans. The target volume coverage V100% was 78.1±4.7% for auto-plans and 79.3±4.8% for clinical plans (p=0.12). Volumes receiving 105% of the prescription dose were 69.2±78.0cc for auto-plans compared to 83.9±87.2cc for clinical plans (p=0.13). The mean V10Gy and V20Gy of the ipsilateral lung were 24.4±6.7% and 18.6±6.0% for auto-plans and 24.6±6.7% and 18.9±6.1% for clinical plans (p=0.04, <0.001). Total computational time for auto-plans was <20 s.
Conclusion: We developed an automated method that generates breast radiotherapy plans with accurate energy selection, similar target volume coverage, reduced hotspot volumes, and a significant reduction in planning time, allowing for near-real-time planning.
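The energy-selection step can be sketched as a simple anatomy-feature classifier. The sketch below uses a nearest-centroid rule over two hand-picked features as an illustrative stand-in for the PCA-on-DRR model described above; the feature names and training values are hypothetical.

```python
# Minimal sketch of the energy-selection step: classify a new case as
# single-energy (SE) or mixed-energy (ME) from anatomy-derived features.
# Feature values and the nearest-centroid rule are illustrative stand-ins
# for the PCA-on-DRR classification model described in the abstract.

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_predict(train, new_case):
    # train: {label: [feature vectors]}; pick the label whose centroid
    # lies closest (squared Euclidean distance) to the new case
    best, best_d = None, float("inf")
    for label, rows in train.items():
        c = centroid(rows)
        d = sum((a - b) ** 2 for a, b in zip(new_case, c))
        if d < best_d:
            best, best_d = label, d
    return best

# Hypothetical features, e.g. (breast separation in cm, chest-wall curvature)
train = {
    "SE": [(18.0, 0.30), (19.5, 0.28), (17.2, 0.33)],
    "ME": [(24.1, 0.21), (25.6, 0.19), (23.3, 0.22)],
}
print(nearest_centroid_predict(train, (24.0, 0.20)))  # → ME
```

A larger separation (thicker breast) favoring the mixed-energy beam is the intuition such a model would capture.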
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-30
...This final rule contains regulations implementing amendments to the Longshore and Harbor Workers' Compensation Act (LHWCA) by the American Recovery and Reinvestment Act of 2009 (ARRA), relating to the exclusion of certain recreational-vessel workers from the LHWCA's definition of ``employee.'' These regulations clarify both the definition of ``recreational vessel'' and those circumstances under which workers are excluded from LHWCA coverage when working on those vessels. The final rule also withdraws a proposed rule that would have codified current case law and the Department's longstanding view that employees are covered under the LHWCA so long as some of their work constitutes ``maritime employment'' within the meaning of the statute.
Sma3s: a three-step modular annotator for large sequence datasets.
Muñoz-Mérida, Antonio; Viguera, Enrique; Claros, M Gonzalo; Trelles, Oswaldo; Pérez-Pulido, Antonio J
2014-08-01
Automatic sequence annotation is an essential component of modern 'omics' studies, which aim to extract information from large collections of sequence data. Most existing tools use sequence homology to establish evolutionary relationships and assign putative functions to sequences. However, it can be difficult to define a similarity threshold that achieves sufficient coverage without sacrificing annotation quality. Defining the correct configuration is critical and can be challenging for non-specialist users. Thus, the development of robust automatic annotation techniques that generate high-quality annotations without needing expert knowledge would be very valuable for the research community. We present Sma3s, a tool for automatically annotating very large collections of biological sequences from any kind of gene library or genome. Sma3s is composed of three modules that progressively annotate query sequences using either: (i) very similar homologues, (ii) orthologous sequences or (iii) terms enriched in groups of homologous sequences. We trained the system using several random sets of known sequences, demonstrating average sensitivity and specificity values of ~85%. In conclusion, Sma3s is a versatile tool for high-throughput annotation of a wide variety of sequence datasets that outperforms the accuracy of other well-established annotation algorithms, and it can enrich existing database annotations and uncover previously hidden features. Importantly, Sma3s has already been used in the functional annotation of two published transcriptomes. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
Distributed pheromone-based swarming control of unmanned air and ground vehicles for RSTA
NASA Astrophysics Data System (ADS)
Sauter, John A.; Mathews, Robert S.; Yinger, Andrew; Robinson, Joshua S.; Moody, John; Riddle, Stephanie
2008-04-01
The use of unmanned vehicles in Reconnaissance, Surveillance, and Target Acquisition (RSTA) applications has received considerable attention recently. Cooperating land and air vehicles can support multiple sensor modalities providing pervasive and ubiquitous broad area sensor coverage. However, coordination of multiple air and land vehicles serving different mission objectives in a dynamic and complex environment is a challenging problem. Swarm intelligence algorithms, inspired by the mechanisms used in natural systems to coordinate the activities of many entities, provide a promising alternative to traditional command and control approaches. This paper describes recent advances in a fully distributed digital pheromone algorithm that has demonstrated its effectiveness in managing the complexity of swarming unmanned systems. The results of a recent demonstration at NASA's Wallops Island of multiple Aerosonde Unmanned Air Vehicles (UAVs) and Pioneer Unmanned Ground Vehicles (UGVs) cooperating in a coordinated RSTA application are discussed. The vehicles were autonomously controlled by the onboard digital pheromone responding to the needs of the automatic target recognition algorithms. UAVs and UGVs controlled by the same pheromone algorithm self-organized to perform total area surveillance, automatic target detection, sensor cueing, and automatic target recognition with no central processing or control and minimal operator input. Complete autonomy adds several safety and fault tolerance requirements, which were integrated into the basic pheromone framework. The adaptive algorithms demonstrated the ability to handle some unplanned hardware failures during the demonstration without any human intervention. The paper describes lessons learned and the next steps for this promising technology.
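A digital pheromone map of the kind described can be sketched as a grid with deposit, evaporation, and propagation rules; vehicles then climb (or descend) the local pheromone gradient. This is a generic textbook-style sketch, not the authors' algorithm, and the decay/spread constants are illustrative.

```python
# Toy digital-pheromone map: each grid cell holds a scalar pheromone value.
# One time step applies evaporation (decay), propagation (a fraction of the
# decayed value spreads to neighbors), and deposits from vehicles/targets.

def step(grid, decay=0.8, spread=0.2, deposits=()):
    rows, cols = len(grid), len(grid[0])
    new = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            v = grid[r][c] * decay                # evaporation
            nbrs = [(r + dr, c + dc)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            new[r][c] += v * (1 - spread)         # retained locally
            for nr, nc in nbrs:                   # propagated share
                new[nr][nc] += v * spread / len(nbrs)
    for (r, c), amount in deposits:
        new[r][c] += amount                       # deposit events
    return new

grid = [[0.0] * 3 for _ in range(3)]
grid = step(grid, deposits=(((1, 1), 1.0),))      # target deposits at center
grid = step(grid)                                 # pheromone diffuses outward
```

After one diffusion step the total pheromone mass equals the deposited amount times the decay factor, which is what keeps stale information from dominating the map.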
Tasking and sharing sensing assets using controlled natural language
NASA Astrophysics Data System (ADS)
Preece, Alun; Pizzocaro, Diego; Braines, David; Mott, David
2012-06-01
We introduce an approach to representing intelligence, surveillance, and reconnaissance (ISR) tasks at a relatively high level in controlled natural language. We demonstrate that this facilitates both human interpretation and machine processing of tasks. More specifically, it allows the automatic assignment of sensing assets to tasks, and the informed sharing of tasks between collaborating users in a coalition environment. To enable automatic matching of sensor types to tasks, we created a machine-processable knowledge representation based on the Military Missions and Means Framework (MMF), and implemented a semantic reasoner to match task types to sensor types. We combined this mechanism with a sensor-task assignment procedure based on a well-known distributed protocol for resource allocation. In this paper, we re-formulate the MMF ontology in Controlled English (CE), a type of controlled natural language designed to be readable by a native English speaker whilst representing information in a structured, unambiguous form to facilitate machine processing. We show how CE can be used to describe both ISR tasks (for example, detection, localization, or identification of particular kinds of object) and sensing assets (for example, acoustic, visual, or seismic sensors, mounted on motes or unmanned vehicles). We show how these representations enable an automatic sensor-task assignment process. Where a group of users are cooperating in a coalition, we show how CE task summaries give users in the field a high-level picture of ISR coverage of an area of interest. This allows them to make efficient use of sensing resources by sharing tasks.
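At its core, the sensor-task matching described above reduces to checking whether a sensor type supplies a modality the task type can use. The vocabulary below is invented for illustration and is far simpler than the MMF/CE ontology the paper actually reasons over.

```python
# Toy capability matcher: a sensor is a candidate for a task if it supplies
# at least one of the modalities the task can use. The task and sensor
# names here are hypothetical, not drawn from the MMF ontology.

TASKS = {
    "detect_vehicle": {"acoustic", "visual"},
    "localize_gunfire": {"acoustic", "seismic"},
    "identify_person": {"visual"},
}
SENSORS = {
    "mote_microphone": {"acoustic"},
    "uav_camera": {"visual"},
    "ground_seismometer": {"seismic"},
}

def capable_sensors(task):
    """Return the sensor types whose modalities intersect the task's needs."""
    needed = TASKS[task]
    return sorted(name for name, caps in SENSORS.items() if caps & needed)

print(capable_sensors("detect_vehicle"))  # → ['mote_microphone', 'uav_camera']
```

A distributed assignment protocol would then allocate one of the capable sensors to each task; the matching step shown here is only the type-level filter.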
ECHO: A reference-free short-read error correction algorithm
Kao, Wei-Chun; Chan, Andrew H.; Song, Yun S.
2011-01-01
Developing accurate, scalable algorithms to improve data quality is an important computational challenge associated with recent advances in high-throughput sequencing technology. In this study, a novel error-correction algorithm, called ECHO, is introduced for correcting base-call errors in short reads without the need for a reference genome. Unlike most previous methods, ECHO does not require the user to specify parameters whose optimal values are typically unknown a priori. ECHO automatically sets the parameters in the assumed model and estimates error characteristics specific to each sequencing run, while maintaining a running time that is within the range of practical use. ECHO is based on a probabilistic model and is able to assign a quality score to each corrected base. Furthermore, it explicitly models heterozygosity in diploid genomes and provides a reference-free method for detecting bases that originated from heterozygous sites. On both real and simulated data, ECHO is able to improve the accuracy of previous error-correction methods by severalfold to an order of magnitude, depending on the sequence coverage depth and the position in the read. The improvement is most pronounced toward the end of the read, where previous methods become noticeably less effective. Using a whole-genome yeast data set, it is demonstrated here that ECHO is capable of coping with nonuniform coverage. Also, it is shown that using ECHO to perform error correction as a preprocessing step considerably facilitates de novo assembly, particularly in the case of low-to-moderate sequence coverage depth. PMID:21482625
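ECHO's overlap-based correction can be illustrated, in much simplified form, by piling up overlapping reads and taking a per-column consensus. The real algorithm infers the overlaps itself and applies a probabilistic model with quality scores; in this sketch the offsets are given and a majority vote stands in for the model.

```python
# Much simplified, reference-free consensus correction: overlapping reads
# are stacked at known offsets and each base is replaced by the column
# majority. ECHO itself infers overlaps and uses a probabilistic model;
# this only demonstrates the pile-up idea.
from collections import Counter

def consensus_correct(reads_with_offsets):
    """reads_with_offsets: list of (offset, read); offsets assumed known."""
    columns = {}
    for off, read in reads_with_offsets:
        for i, base in enumerate(read):
            columns.setdefault(off + i, Counter())[base] += 1
    return ["".join(columns[off + i].most_common(1)[0][0]
                    for i in range(len(read)))
            for off, read in reads_with_offsets]

# true sequence ACGTACGTTA; the second read carries one error (A→T)
reads = [(0, "ACGTAC"), (2, "GTTCGT"), (4, "ACGTTA")]
print(consensus_correct(reads)[1])  # → GTACGT (error repaired by the pile-up)
```

Higher coverage depth gives deeper columns and therefore more reliable votes, which mirrors the abstract's observation that accuracy depends on coverage depth.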
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sirunyan, Albert M; et al.
This letter presents the results of a search for pair-produced particles of masses above 100 GeV that each decay into at least four quarks. Using data collected by the CMS experiment at the LHC in 2015-2016, corresponding to an integrated luminosity of 38.2 fb$^{-1}$, reconstructed particles are clustered into two large jets of similar mass, each consistent with four-parton substructure. No statistically significant excess of data over the background prediction is observed in the distribution of average jet mass. Pair-produced squarks with dominant hadronic $R$-parity-violating decays into four quarks and with masses between 0.10 and 0.72 TeV are excluded at 95% confidence level. Similarly, pair-produced gluinos that decay into five quarks are also excluded with masses between 0.10 and 1.41 TeV at 95% confidence level. These are the first constraints that have been placed on pair-produced particles with masses below 400 GeV that decay into four or five quarks, bridging a significant gap in the coverage of $R$-parity-violating supersymmetry parameter space.
A World-Wide Net of Solar Radio Spectrometers: e-CALLISTO
NASA Astrophysics Data System (ADS)
Benz, A. O.; Monstein, C.; Meyer, H.; Manoharan, P. K.; Ramesh, R.; Altyntsev, A.; Lara, A.; Paez, J.; Cho, K.-S.
2009-04-01
Radio spectrometers of the CALLISTO type to observe solar flares have been distributed to nine locations around the globe. The instruments observe automatically; their data are collected every day via the internet and stored in a central database. A public web interface exists through which data can be browsed and retrieved. The nine instruments form a network called e-CALLISTO. It is still growing in the number of stations, as redundancy is desirable for full 24 h coverage of the solar radio emission in the meter and low decimeter band. The e-CALLISTO system has already proven to be a valuable new tool for monitoring solar activity and for space weather research.
Leveraging Paraphrase Labels to Extract Synonyms from Twitter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antoniak, Maria A.; Bell, Eric B.; Xia, Fei
2015-05-18
We present an approach for automatically learning synonyms from a paraphrase corpus of tweets. This work shows improvement on the task of paraphrase detection when we substitute our extracted synonyms into the training set. The synonyms are learned by using chunks from a shallow parse to create candidate synonyms and their context windows, and the synonyms are incorporated into a paraphrase detection system that uses machine translation metrics as features for a classifier. We demonstrate a 2.29% improvement in F1 when we train and test on the paraphrase training set, providing better coverage than previous systems, which shows the potential power of synonyms that are representative of a specific topic.
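The context-window idea can be sketched distributionally: words (or chunks) that occur in highly overlapping context windows become synonym candidates. This Jaccard-overlap sketch is a stand-in for the paper's shallow-parse chunking and machine-translation-metric features.

```python
# Sketch of context-window synonym candidates: collect the words seen
# within a fixed window of each word, then propose pairs whose context
# sets have high Jaccard overlap. A simplified stand-in for the paper's
# chunk-based method.

def context_sets(sentences, window=2):
    ctx = {}
    for words in sentences:
        for i, w in enumerate(words):
            ctx.setdefault(w, set()).update(
                words[max(0, i - window):i] + words[i + 1:i + 1 + window])
    return ctx

def synonym_candidates(sentences, threshold=0.5):
    ctx = context_sets(sentences)
    words = sorted(ctx)
    pairs = []
    for i, a in enumerate(words):
        for b in words[i + 1:]:
            inter = len(ctx[a] & ctx[b])
            union = len(ctx[a] | ctx[b])
            if union and inter / union >= threshold:
                pairs.append((a, b))
    return pairs

sents = [["the", "movie", "was", "great"], ["the", "film", "was", "great"]]
```

On these two toy sentences "movie" and "film" share identical context sets, so they surface as a candidate pair at any threshold.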
Use of NOAA-N satellites for land/water discrimination and flood monitoring
NASA Technical Reports Server (NTRS)
Tappan, G.; Horvath, N. C.; Doraiswamy, P. C.; Engman, T.; Goss, D. W. (Principal Investigator)
1983-01-01
A tool for monitoring the extent of major floods was developed using data collected by the NOAA-6 advanced very high resolution radiometer (AVHRR). A basic understanding of the spectral returns in AVHRR channels 1 and 2 for water, soil, and vegetation was reached using a large number of NOAA-6 scenes from different seasons and geographic locations. A look-up table classifier was developed based on analysis of the reflective channel relationships for each surface feature. The classifier automatically separated land from water and produced classification maps which were registered for a number of acquisitions, including coverage of a major flood on the Parana River of Argentina.
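A look-up-table-style classifier over AVHRR channels 1 (visible red) and 2 (near-infrared) can be sketched with simple band relationships: water is very dark in the NIR, vegetation shows a strong NIR excess over red, and bare soil falls in between. The thresholds below are illustrative, not the relationships derived in the study.

```python
# Illustrative land/water discrimination from two AVHRR reflectance bands.
# ch1 ≈ visible red reflectance, ch2 ≈ near-infrared reflectance (0-1).
# Threshold values are hypothetical placeholders for the study's LUT.

def classify_pixel(ch1, ch2):
    if ch2 < 0.05 and ch2 < ch1:      # very dark NIR → water
        return "water"
    if ch2 > 1.3 * ch1:               # strong NIR excess over red → vegetation
        return "vegetation"
    return "soil"

def land_water_mask(ch1_img, ch2_img):
    """True where the pixel classifies as water; input images are row lists."""
    return [[classify_pixel(a, b) == "water" for a, b in zip(r1, r2)]
            for r1, r2 in zip(ch1_img, ch2_img)]

print(classify_pixel(0.08, 0.02))  # → water
```

Applying `land_water_mask` slice by slice and registering the masks across acquisition dates is the kind of step that would reveal a flood's changing extent.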
Development of a Portable Torque Wrench Tester
NASA Astrophysics Data System (ADS)
Wang, Y.; Zhang, Q.; Gou, C.; Su, D.
2018-03-01
A portable torque wrench tester (PTWT) with a calibration range from 0.5 Nm to 60 Nm has been developed and evaluated for periodic or on-site calibration of setting type torque wrenches, indicating type torque wrenches and hand torque screwdrivers. The PTWT is easy to carry, weighing about 10 kg, offers simple and efficient operation, and saves energy through an automatic loading and calibrating system. The relative expanded uncertainty of torque realized by the PTWT was estimated to be 0.8%, with the coverage factor k=2. A comparison experiment was performed between the PTWT and a reference torque standard at our laboratory. The consistency between these two devices under the claimed uncertainties was verified.
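The quoted figure (relative expanded uncertainty 0.8% with coverage factor k = 2) implies a combined standard uncertainty of 0.4%. A GUM-style combination of independent components can be sketched as a root-sum-square followed by multiplication by k; the component values below are hypothetical, not the tester's actual budget.

```python
# Sketch of a GUM-style expanded-uncertainty calculation: combine
# independent relative standard-uncertainty components in quadrature,
# then multiply by the coverage factor k. Component values are invented.
import math

def expanded_uncertainty(components_percent, k=2.0):
    u_c = math.sqrt(sum(u * u for u in components_percent))  # combined std. unc.
    return k * u_c                                           # expanded unc.

# e.g. reference transducer, repeatability, resolution, temperature drift
components = [0.30, 0.20, 0.10, 0.15]
U = expanded_uncertainty(components)   # percent of reading, k = 2
```

A single 0.4% combined component reproduces the paper's 0.8% at k = 2, which is the sanity check a reader can do from the abstract alone.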
Stuntz, Robert; Clontz, Robert
2016-05-01
Emergency physicians are using free open access medical education (FOAM) resources at an increasing rate. The extent to which FOAM resources cover the breadth of emergency medicine core content is unknown. We hypothesize that the content of FOAM resources does not provide comprehensive or balanced coverage of the scope of knowledge necessary for emergency medicine providers. Our objective is to quantify emergency medicine core content covered by FOAM resources and identify the predominant FOAM topics. This is an institutional review board-approved, retrospective review of all English-language FOAM posts between July 1, 2013, and June 30, 2014, as aggregated on http://FOAMem.com. The topics of FOAM posts were compared with those of the emergency medicine core content, as defined by the American Board of Emergency Medicine's Model of the Clinical Practice of Emergency Medicine (MCPEM). Each FOAM post could cover more than 1 topic. Repeated posts and summaries were excluded. Review of the MCPEM yielded 915 total emergency medicine topics grouped into 20 sections. Review of 6,424 FOAM posts yielded 7,279 total topics and 654 unique topics, representing 71.5% coverage of the 915 topics outlined by the MCPEM. The procedures section was covered most often, representing 2,285 (31.4%) FOAM topics. The 4 sections with the least coverage were cutaneous disorders, hematologic disorders, nontraumatic musculoskeletal disorders, and obstetric and gynecologic disorders, each representing 0.6% of FOAM topics. Airway techniques; ECG interpretation; research, evidence-based medicine, and interpretation of the literature; resuscitation; and ultrasonography were the most overrepresented subsections, equaling 1,674 (23.0%) FOAM topics when combined. The data suggest an imbalanced and incomplete coverage of emergency medicine core content in FOAM. The study is limited by its retrospective design and use of a single referral Web site to obtain available FOAM resources. 
More comprehensive and balanced coverage of emergency medicine core content is needed if FOAM is to serve as a primary educational resource. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Use of artificial intelligence in the production of high quality minced meat
NASA Astrophysics Data System (ADS)
Kapovsky, B. R.; Pchelkina, V. A.; Plyasheshnik, P. I.; Dydykin, A. S.; Lazarev, A. A.
2017-09-01
A design is proposed for an automatic minced meat production line implementing a new production technology based on an innovative meat milling method. This method achieves the necessary degree of raw material comminution at the raw material preparation stage, intensifying production by making traditional meat mass comminution equipment unnecessary. To ensure consistent quality of the product obtained, the use of on-line automatic control of the technological process for minced meat production is envisaged. This system has been developed using artificial intelligence methods and technologies. The system is trainable during the operation process, adapts to changes in processed raw material characteristics and to external impacts that affect the system operation, and manufactures meat shavings with minimal dispersion of the typical particle size. The control system includes equipment for express analysis of the chemical composition of the minced meat and its temperature after comminution. In this case, the minced meat production process can be controlled strictly as a function of time, which excludes subjective factors in assessing the degree of finished product readiness. This will allow finished meat products of consistent, targeted high quality to be produced.
3D registration of surfaces for change detection in medical images
NASA Astrophysics Data System (ADS)
Fisher, Elizabeth; van der Stelt, Paul F.; Dunn, Stanley M.
1997-04-01
Spatial registration of data sets is essential for quantifying changes that take place over time in cases where the position of a patient with respect to the sensor has been altered. Changes within the region of interest can be problematic for automatic methods of registration. This research addresses the problem of automatic 3D registration of surfaces derived from serial, single-modality images for the purpose of quantifying changes over time. The registration algorithm utilizes motion-invariant, curvature- based geometric properties to derive an approximation to an initial rigid transformation to align two image sets. Following the initial registration, changed portions of the surface are detected and excluded before refining the transformation parameters. The performance of the algorithm was tested using simulation experiments. To quantitatively assess the registration, random noise at various levels, known rigid motion transformations, and analytically-defined volume changes were applied to the initial surface data acquired from models of teeth. These simulation experiments demonstrated that the calculated transformation parameters were accurate to within 1.2 percent of the total applied rotation and 2.9 percent of the total applied translation, even at the highest applied noise levels and simulated wear values.
Trumpp, Natalie M; Traub, Felix; Pulvermüller, Friedemann; Kiefer, Markus
2014-02-01
Classical theories of semantic memory assume that concepts are represented in a unitary amodal memory system. In challenging this classical view, pure or hybrid modality-specific theories propose that conceptual representations are grounded in the sensory-motor brain areas, which typically process sensory and action-related information. Although neuroimaging studies provided evidence for a functional-anatomical link between conceptual processing of sensory or action-related features and the sensory-motor brain systems, it has been argued that aspects of such sensory-motor activation may not directly reflect conceptual processing but rather strategic imagery or postconceptual elaboration. In the present ERP study, we investigated masked effects of acoustic and action-related conceptual features to probe unconscious automatic conceptual processing in isolation. Subliminal feature-specific ERP effects at frontocentral electrodes were observed, which differed with regard to polarity, topography, and underlying brain electrical sources in congruency with earlier findings under conscious viewing conditions. These findings suggest that conceptual acoustic and action representations can also be unconsciously accessed, thereby excluding any postconceptual strategic processes. This study therefore further substantiates a grounding of conceptual and semantic processing in action and perception.
The effort to rehabilitate workers' compensation.
Barth, P S
1976-06-01
State workers' compensation laws have been subjected to criticism since their inception; pressure to change them is now increasing. Most of the current challenges arise from dissatisfaction with the level of benefits available to disabled workers or their survivors, and, to a lesser degree, with the extent of program coverage. In response to this challenge, changes will occur that may range from reform (simply raising benefit levels and extending coverage) to program redesign, implying major structural revisions or abolishment of the system. For several reasons, including public apathy, the role of interest groups, and experience with other social insurance programs, it seems likely that basic structural shifts will not occur in the near future. While the criticism of these state laws is widespread, the problems can be dealt with in the existing framework. One area, however, could conceivably arouse sufficient public and legislative interest to upset this forecast. If it develops that the system is excluding large numbers of individuals disabled or killed by occupational diseases, workers' compensation laws could be placed in jeopardy. While evidence on this is scarce, it is clear that the current system compensates only a small number of serious cases of disability arising from occupational diseases.
Batt, Katherine; Fox-Rushby, J A; Castillo-Riquelme, Marianela
2004-09-01
Evidence-based reviews of published literature can be subject to several biases. Grey literature, however, can be of poor quality and expensive to access. Effective search strategies also vary by topic and are rarely known in advance. This paper complements a systematic review of the published literature on the costs and effects of expanding immunization services in developing countries. The quality of data on the effectiveness and cost-effectiveness of strategies to increase immunization coverage is shown to be similar across literatures, but the quality of information on costing is much lower in the grey literature. After excluding poorer quality studies from this review we found the quantity of available evidence almost doubled, particularly for more complex health-system interventions and cost or cost-effectiveness analyses. Interventions in the grey literature are more up to date and cover a different geographical spread. Consequently the conclusions of the published and grey literatures differ, although the number of papers is still too low to account for differences across types of interventions. We recommend that in future researchers consider using non-English keywords in their searches.
Sharp, Lisa K; Fisher, Edwin B; Gerber, Ben S
2015-09-01
The Society of Behavioral Medicine (SBM) recognizes that diabetes self-management (DSM) education and support are fundamental to teaching people how to manage their diabetes and decrease disease-related complications. Implementation of the Patient Protection and Affordable Care Act provides an opportunity to expand DSM education and support to many people who are currently excluded from such services due to lack of insurance coverage, current policy barriers, or simple failure of healthcare systems to provide them. Extending the range and provision of such services could translate into reduced diabetic complications, a reduction in unnecessary healthcare utilization, and significant health-related cost savings on a national level. SBM recommends that public and private insurers be required to reimburse for 12 h of DSM education and support annually for anyone with diabetes. Further, SBM recognizes that a range of modes and providers of DSM education and support have been shown effective, and that patient preferences and resources may influence choice. To address this, SBM urges health organizations to increase and diversify approaches toward DSM education and support they offer.
Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E
2018-01-01
One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
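A "95% prior interval of 0.50-2.0" for RR corresponds to a normal prior on log(RR) centered at 0 (no effect) with SD = ln(2)/1.96 ≈ 0.354. The sketch below shows that translation from a ratio-scale interval to a prior SD; it is a sketch of the standard construction, not the paper's code.

```python
# Translate a symmetric 95% prior interval on a ratio scale (RR or OR)
# into the SD of a normal prior on the log-ratio, centered at 0.
import math

def prior_sd_from_interval(upper_ratio, z=1.959964):
    """Normal prior on log-ratio with P(ratio > upper_ratio) = 2.5%."""
    return math.log(upper_ratio) / z

sd_rr = prior_sd_from_interval(2.0)    # RR interval 0.50-2.0  → SD ≈ 0.354
sd_or = prior_sd_from_interval(4.35)   # OR interval 0.23-4.35 → SD ≈ 0.750
```

The wider OR interval yields roughly double the prior SD, which is why the two recommendations in the abstract differ by scale.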
2012-01-01
Background Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study aims to present an algorithm to automatically compute symmetry and regularity indices, and to assess the minimum number of strides needed for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides, and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference, MDD, of the index. If that difference was less than MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results All Ad1 and Ad2 indices were lower in AMP than in CTRL (P < 0.0001). Excluding initial and final strides from the analysis, the minimum number of strides needed for reliable computation of step symmetry and stride regularity was about 2.2 and 3.5, respectively. Analyzing the whole signals, the minimum number of strides increased to about 15 and 20, respectively. Conclusions Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees. PMID:22316184
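Step and stride regularity indices of this kind are typically the normalized autocorrelation of the acceleration signal at the lag of one step (Ad1) and one stride (Ad2). The sketch below follows that common construction on a synthetic periodic signal; the exact normalization the paper uses may differ.

```python
# Sketch of step/stride regularity from the autocorrelation of a
# vertical-acceleration signal: Ad1 at the one-step lag, Ad2 at the
# one-stride lag (twice the step lag). The synthetic signal is illustrative.
import math

def autocorr(sig, lag):
    n = len(sig)
    m = sum(sig) / n
    num = sum((sig[i] - m) * (sig[i + lag] - m) for i in range(n - lag))
    den = sum((s - m) ** 2 for s in sig)
    return (n / (n - lag)) * num / den   # unbiased normalization

def regularity(sig, step_lag):
    ad1 = autocorr(sig, step_lag)        # step regularity / symmetry
    ad2 = autocorr(sig, 2 * step_lag)    # stride regularity
    return ad1, ad2

# perfectly periodic "gait" (period = one step): both indices approach 1
sig = [math.sin(2 * math.pi * i / 20) for i in range(200)]
ad1, ad2 = regularity(sig, step_lag=20)
```

Asymmetric or irregular gait, as in the amputee group, lowers these coefficients below 1, which matches the AMP-vs-CTRL difference reported above.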
Genotator: a disease-agnostic tool for genetic annotation of disease.
Wall, Dennis P; Pivovarov, Rimma; Tong, Mark; Jung, Jae-Yoon; Fusaro, Vincent A; DeLuca, Todd F; Tonellato, Peter J
2010-10-29
Disease-specific genetic information has been increasing at rapid rates as a consequence of recent improvements and massive cost reductions in sequencing technologies. Numerous systems designed to capture and organize this mounting sea of genetic data have emerged, but these resources differ dramatically in their disease coverage and genetic depth. With few exceptions, researchers must manually search a variety of sites to assemble a complete set of genetic evidence for a particular disease of interest, a process that is both time-consuming and error-prone. We designed a real-time aggregation tool that provides both comprehensive coverage and reliable gene-to-disease rankings for any disease. Our tool, called Genotator, automatically integrates data from 11 externally accessible clinical genetics resources and uses these data in a straightforward formula to rank genes in order of disease relevance. We tested the accuracy of coverage of Genotator in three separate diseases for which there exist specialty curated databases, Autism Spectrum Disorder, Parkinson's Disease, and Alzheimer Disease. Genotator is freely available at http://genotator.hms.harvard.edu. Genotator demonstrated that most of the 11 selected databases contain unique information about the genetic composition of disease, with 2514 genes found in only one of the 11 databases. These findings confirm that the integration of these databases provides a more complete picture than would be possible from any one database alone. Genotator successfully identified at least 75% of the top ranked genes for all three of our use cases, including a 90% concordance with the top 40 ranked candidates for Alzheimer Disease. As a meta-query engine, Genotator provides high coverage of both historical genetic research as well as recent advances in the genetic understanding of specific diseases. 
As such, Genotator provides a real-time aggregation of ranked data that remains current with the pace of research in the disease fields. Genotator's algorithm appropriately transforms query terms to match the input requirements of each targeted database and accurately resolves named synonyms to ensure full coverage of the genetic results with official nomenclature. Genotator generates an Excel-style output that is consistent across disease queries and readily importable to other applications.
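The abstract does not spell out the "straightforward formula" used for ranking; a minimal aggregation sketch scores each gene by how many constituent databases report it, which also surfaces the single-database genes mentioned above. The database contents here are invented.

```python
# Toy cross-database aggregation: score each gene by the number of source
# databases that list it (a guessed stand-in for Genotator's ranking
# formula). Gene sets below are hypothetical.
from collections import Counter

def rank_genes(db_hits):
    """db_hits: {database_name: set of gene symbols} → (ranking, counts)."""
    counts = Counter(g for genes in db_hits.values() for g in genes)
    ranking = sorted(counts, key=lambda g: (-counts[g], g))
    return ranking, counts

db_hits = {
    "db_a": {"APP", "APOE", "PSEN1"},
    "db_b": {"APOE", "PSEN1"},
    "db_c": {"APOE", "MAPT"},
}
ranking, counts = rank_genes(db_hits)
print(ranking[0])  # → APOE (reported by all three sources)
unique = sorted(g for g, n in counts.items() if n == 1)  # single-database genes
```

The `unique` list corresponds to the abstract's observation that thousands of genes appear in only one of the eleven databases, which is the argument for aggregating them.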
Automatic segmentation of tumor-laden lung volumes from the LIDC database
NASA Astrophysics Data System (ADS)
O'Dell, Walter G.
2012-03-01
The segmentation of the lung parenchyma is often a critical pre-processing step prior to application of computer-aided detection of lung nodules. Segmentation of the lung volume can dramatically decrease computation time and reduce the number of false positive detections by excluding extra-pulmonary tissue from consideration. However, while many algorithms are capable of adequately segmenting the healthy lung, none have been demonstrated to work reliably on tumor-laden lungs. A particular challenge is to preserve tumorous masses attached to the chest wall, mediastinum or major vessels. In this role, lung volume segmentation comprises an important computational step that can adversely affect the performance of the overall CAD algorithm. An automated lung volume segmentation algorithm has been developed with the goals of maximally excluding extra-pulmonary tissue while retaining all true nodules. The algorithm comprises a series of tasks including intensity thresholding, 2-D and 3-D morphological operations, 2-D and 3-D floodfilling, and snake-based clipping of nodules attached to the chest wall. It features the ability to (1) exclude trachea and bowels, (2) snip large attached nodules using snakes, (3) snip small attached nodules using dilation, (4) preserve large masses fully internal to the lung volume, (5) account for basal aspects of the lung where, in a 2-D slice, the lower sections appear to be disconnected from the main lung, and (6) achieve separation of the right and left hemi-lungs. The algorithm was developed and trained on the first 100 datasets of the LIDC image database.
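Two of the early steps in such a pipeline can be sketched on a toy 2-D slice: (1) threshold to find air-like voxels, then (2) flood-fill from the image border to remove exterior air, leaving only lung-interior air. Morphology and snake-based clipping are omitted, and the threshold is an illustrative HU value, not the algorithm's.

```python
# Toy threshold + border flood-fill lung mask on a 2-D slice of HU values.
# Air-like pixels connected to the image border are background; air pockets
# fully inside the body remain as the lung mask. Illustrative only.
from collections import deque

def lung_mask(slice_hu, air_thresh=-400):
    rows, cols = len(slice_hu), len(slice_hu[0])
    air = [[v < air_thresh for v in row] for row in slice_hu]
    outside = [[False] * cols for _ in range(rows)]
    # seed the flood fill with every air pixel touching the border
    q = deque((r, c) for r in range(rows) for c in range(cols)
              if air[r][c] and (r in (0, rows - 1) or c in (0, cols - 1)))
    for r, c in q:
        outside[r][c] = True
    while q:                                   # BFS over connected exterior air
        r, c = q.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and air[nr][nc] and not outside[nr][nc]:
                outside[nr][nc] = True
                q.append((nr, nc))
    return [[air[r][c] and not outside[r][c] for c in range(cols)]
            for r in range(rows)]
```

On a real volume the same idea in 3-D also removes the trachea once it connects to exterior air, which is step (1) of the feature list above.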
Graham, Brian W.; Tao, Yeqing; Dodge, Katie L.; Thaxton, Carly T.; Olaso, Danae; Young, Nicolas L.; Marshall, Alan G.
2016-01-01
The archaeal minichromosomal maintenance (MCM) helicase from Sulfolobus solfataricus (SsoMCM) is a model for understanding structural and mechanistic aspects of DNA unwinding. Although interactions of the encircled DNA strand within the central channel provide an accepted mode for translocation, interactions with the excluded strand on the exterior surface have mostly been ignored with regard to DNA unwinding. We have previously proposed an extension of the traditional steric exclusion model of unwinding to also include significant contributions from the excluded strand during unwinding, termed steric exclusion and wrapping (SEW). The SEW model hypothesizes that the displaced single strand tracks along paths on the exterior surface of hexameric helicases to protect single-stranded DNA (ssDNA) and stabilize the complex in a forward unwinding mode. Using hydrogen/deuterium exchange monitored by Fourier transform ion cyclotron resonance MS, we have probed the binding sites for ssDNA, using multiple substrates targeting both the encircled-strand and excluded-strand interactions. In each experiment, we have obtained >98.7% sequence coverage of SsoMCM from >650 peptides (5–30 residues in length) and are able to identify interacting residues on both the interior and exterior of SsoMCM. Based on the identified contacts, positively charged residues within the external waist region were mutated and shown to generally reduce DNA unwinding without negatively affecting ATP hydrolysis. The combined data globally identify binding sites for ssDNA during SsoMCM unwinding and validate the importance of the SEW model for hexameric helicase unwinding. PMID:27044751
Hernández-Domínguez, Laura; Ratté, Sylvie; Sierra-Martínez, Gerardo; Roche-Bergua, Andrés
2018-01-01
We present a methodology to automatically evaluate the performance of patients during picture description tasks. Transcriptions and audio recordings of the Cookie Theft picture description task were used. With 25 healthy elderly control (HC) samples and an information coverage measure, we automatically generated a population-specific referent. We then assessed 517 transcriptions (257 Alzheimer's disease [AD], 217 HC, and 43 mild cognitively impaired samples) according to their informativeness and pertinence against this referent. We extracted linguistic and phonetic metrics which previous literature correlated to early-stage AD. We trained two learners to distinguish HCs from cognitively impaired individuals. Our measures significantly (P < .001) correlated with the severity of the cognitive impairment and the Mini-Mental State Examination score. The classification sensitivity was 81% (area under the receiver operating characteristic curve = 0.79) between HCs and AD, and 85% (area under the receiver operating characteristic curve = 0.76) between HCs and the combined AD and mild cognitively impaired group. An automated assessment of a picture description task could assist clinicians in the detection of early signs of cognitive impairment and AD.
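The reported figures combine a thresholded prediction (sensitivity) with a ranking measure (area under the ROC curve). A minimal sketch of that evaluation arithmetic, on invented scores and labels rather than the study's data:

```python
# Sensitivity and a rank-based AUC from toy classifier scores.
def sensitivity(labels, preds):
    """Fraction of true positives among all actual positives."""
    tp = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 1)
    fn = sum(1 for y, p in zip(labels, preds) if y == 1 and p == 0)
    return tp / (tp + fn)

def auc(labels, scores):
    """Probability a random positive outscores a random negative (ties = 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 1, 0, 0, 0, 0]            # 1 = cognitively impaired
scores = [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]
preds = [1 if s >= 0.5 else 0 for s in scores]
print(sensitivity(labels, preds), auc(labels, scores))  # 0.75 0.875
```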
Xiao, Xiang; Zhu, Hao; Liu, Wei-Jie; Yu, Xiao-Ting; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe
2017-01-01
The International 10/20 system is an important head-surface-based positioning system for transcranial brain mapping techniques, e.g., fNIRS and TMS. As guidance for probe placement, the 10/20 system permits both proper ROI coverage and spatial consistency among multiple subjects and experiments in a MRI-free context. However, the traditional manual approach to the identification of 10/20 landmarks faces problems in reliability and time cost. In this study, we propose a semi-automatic method to address these problems. First, a novel head surface reconstruction algorithm reconstructs head geometry from a set of points uniformly and sparsely sampled on the subject's head. Second, virtual 10/20 landmarks are determined on the reconstructed head surface in computational space. Finally, a visually-guided real-time navigation system guides the experimenter to each of the identified 10/20 landmarks on the physical head of the subject. Compared with the traditional manual approach, our proposed method provides a significant improvement both in reliability and time cost and thus could contribute to improving both the effectiveness and efficiency of 10/20-guided MRI-free probe placement.
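The virtual-landmark step rests on the standard 10/20 convention of placing points at fixed fractions of head-surface arc lengths. A minimal sketch of that placement on an idealized semicircular nasion-inion arc (the geometry and function names are illustrative, not the authors' reconstruction algorithm):

```python
# Midline 10/20 landmarks by fractional arc length on a semicircular "head".
import math

def midline_1020_points(radius=9.0):
    """Return (name, x, z) for midline landmarks on a semicircular arc.

    The 10/20 convention places Fpz, Fz, Cz, Pz, Oz at 10%, 30%, 50%,
    70%, 90% of the nasion-to-inion arc length.
    """
    fractions = {"Fpz": 0.10, "Fz": 0.30, "Cz": 0.50, "Pz": 0.70, "Oz": 0.90}
    pts = []
    for name, f in fractions.items():
        theta = math.pi * f  # angle along the semicircle, nasion at theta = 0
        pts.append((name, radius * math.cos(theta), radius * math.sin(theta)))
    return pts

for name, x, z in midline_1020_points():
    print(f"{name}: x={x:+.2f}, z={z:+.2f}")
# Cz lands at the vertex (x = 0, z = radius), as expected at the 50% point.
```

On a real reconstructed surface the same fractions would be measured along geodesic paths rather than a circle.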
I-SCAD® standoff chemical agent detector overview
NASA Astrophysics Data System (ADS)
Popa, Mirela O.; Griffin, Matthew T.
2012-06-01
This paper presents a system-level description of the I-SCAD® Standoff Chemical Agent Detector, a passive Fourier Transform InfraRed (FTIR) based remote sensing system, for detecting chemical vapor threats. The passive infrared detection system automatically searches the 7 to 14 micron region of the surrounding atmosphere for agent vapor clouds. It is capable of operating while on the move to accomplish reconnaissance, surveillance, and contamination avoidance missions. Additionally, the system is designed to meet the needs for application on air and sea as well as ground mobile and fixed site platforms. The lightweight, passive, and fully automatic detection system scans the surrounding atmosphere for chemical warfare agent vapors. It provides on-the-move, 360-deg coverage from a variety of tactical and reconnaissance platforms at distances up to 5 km. The core of the system is a rugged Michelson interferometer with a flexure spring bearing mechanism and bi-directional data acquisition capability. The modular system design facilitates interfacing to many platforms. A Reduced Field of View (RFOV) variant includes novel modifications to the scanner subcomponent assembly optical design that give extended performance in detection range and detection probability without sacrificing existing radiometric sensitivity performance. This paper delivers an overview of the system.
NASA Astrophysics Data System (ADS)
Huang, Xiaomeng; Hu, Chenqi; Huang, Xing; Chu, Yang; Tseng, Yu-heng; Zhang, Guang Jun; Lin, Yanluan
2018-01-01
Mesoscale convective systems (MCSs) are important components of tropical weather systems and the climate system. Long-term data of MCS are of great significance in weather and climate research. Using long-term (1985-2008) global satellite infrared (IR) data, we developed a novel objective automatic tracking algorithm, which combines a Kalman filter (KF) with the conventional area-overlapping method, to generate a comprehensive MCS dataset. The new algorithm can effectively track small and fast-moving MCSs and thus obtain more realistic and complete tracking results than previous studies. A few examples are provided to illustrate the potential application of the dataset with a focus on the diurnal variations of MCSs over land and ocean regions. We find that the MCSs occurring over land tend to initiate in the afternoon with greater intensity, but the oceanic MCSs are more likely to initiate in the early morning with weaker intensity. A double peak in the maximum spatial coverage is noted over the western Pacific, especially over the southwestern Pacific during the austral summer. Oceanic MCSs also persist for approximately 1 h longer than their continental counterparts.
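A Kalman filter of the kind described combines a motion prediction with each new centroid observation. The sketch below assumes a constant-velocity state model and invented noise levels; it illustrates the filtering step, not the authors' tracker:

```python
# Constant-velocity Kalman filter predicting a cloud system's next centroid
# before area-overlap matching. Matrices and noise levels are illustrative.
import numpy as np

def kf_predict(x, P, F, Q):
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0  # one IR image per hour
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)
R = 0.10 * np.eye(2)

# State: [lon, lat, d(lon)/dt, d(lat)/dt]; an MCS drifting east at 0.5 deg/h.
x, P = np.array([100.0, 10.0, 0.5, 0.0]), np.eye(4)
for z in ([100.5, 10.0], [101.0, 10.1], [101.5, 10.1]):
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, np.array(z), H, R)
print(np.round(x[:2], 2))  # estimated centroid stays near the observed track
```

The predicted position can then be compared against candidate clouds in the next image, which is what lets the method keep hold of small, fast-moving systems that pure area overlap would lose.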
Ontology-based automatic identification of public health-related Turkish tweets.
Küçük, Emine Ela; Yapar, Kürşad; Küçük, Dilek; Küçük, Doğan
2017-04-01
Social media analysis, such as the analysis of tweets, is a promising research topic for tracking public health concerns including epidemics. In this paper, we present an ontology-based approach to automatically identify public health-related Turkish tweets. The system is based on a public health ontology that we have constructed through a semi-automated procedure. The ontology concepts are expanded through a linguistically motivated relaxation scheme as the last stage of ontology development, before being integrated into our system to increase its coverage. The resulting lexical resource, which includes the terms corresponding to the ontology concepts, is used to filter the Twitter stream so that a plausible tweet subset, consisting mostly of public health-related tweets, can be obtained. Experiments are carried out on two million genuine tweets and promising precision rates are obtained. A Web-based interface for tracking the results of this identification system, intended for use by the related public health staff, was also implemented in the course of the current study. Hence, the current social media analysis study has both technical and practical contributions to the significant domain of public health. Copyright © 2017 Elsevier Ltd. All rights reserved.
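The filtering stage described above reduces, at its core, to matching tweets against a lexicon derived from the ontology concepts. A minimal sketch, with an invented miniature lexicon and example tweets rather than the study's actual resources:

```python
# Lexicon-based filtering of a tweet stream: a tweet passes the filter if it
# contains any term derived from the ontology concepts. Terms and tweets here
# are invented examples.
import re

health_lexicon = {"grip", "influenza", "aşı", "salgın"}  # flu, vaccine, epidemic

def matches_lexicon(tweet, lexicon):
    tokens = re.findall(r"\w+", tweet.lower())
    return any(tok in lexicon for tok in tokens)

stream = [
    "Bu hafta grip oldum, evdeyim",  # "I caught the flu this week, I'm home"
    "Maç harikaydı!",                # "The match was great!" (not health-related)
]
kept = [t for t in stream if matches_lexicon(t, health_lexicon)]
print(len(kept))  # 1
```

A production system would add Turkish-specific case folding and morphological normalization on top of this token match.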
NASA Technical Reports Server (NTRS)
Hueschen, R. M.
1984-01-01
The Digital Integrated Automatic Landing System (DIALS) is discussed. The DIALS is a modern control theory design performing all the maneuver modes associated with current autoland systems: localizer capture and track, glideslope capture and track, decrab, and flare. The DIALS is an integrated full-state feedback system which was designed using direct-digital methods. The DIALS uses standard aircraft sensors and the digital Microwave Landing System (MLS) signals as measurements. It consists of separately designed longitudinal and lateral channels, although some cross-coupling variables are fed between channels for improved state estimates and trajectory commands. The DIALS was implemented within the 16-bit fixed-point flight computers of the ATOPS research aircraft, a small twin-jet commercial transport outfitted with a second research cockpit and a fly-by-wire system. The DIALS became the first modern control theory design to be successfully flight tested on a commercial-type aircraft. Flight tests were conducted in late 1981 using a wide coverage MLS on Runway 22 at Wallops Flight Center. All the modes were exercised, including the capture and track of steep glideslopes up to 5 degrees.
Ramanujam, Nedunchelian; Kaliappan, Manivannan
2016-01-01
Nowadays, automatic multidocument text summarization systems can successfully retrieve the summary sentences from the input documents. However, they have many limitations such as inaccurate extraction of essential sentences, low coverage, poor coherence among the sentences, and redundancy. This paper introduces a new concept of a timestamp approach combined with a Naïve Bayesian classification approach for multidocument text summarization. The timestamp gives the summary an ordered look, which achieves a coherent-looking summary. It extracts the more relevant information from the multiple documents. Here, a scoring strategy is also used to calculate the score for the words to obtain the word frequency. The higher linguistic quality is estimated in terms of readability and comprehensibility. In order to show the efficiency of the proposed method, this paper presents a comparison between the proposed method and the existing MEAD algorithm. The timestamp procedure is also applied to the MEAD algorithm and the results are examined against those of the proposed method. The results show that the proposed method requires less time than the existing MEAD algorithm to execute the summarization process. Moreover, the proposed method achieves better precision, recall, and F-score than the existing clustering with lexical chaining approach. PMID:27034971
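The combination of word-frequency scoring and timestamp ordering can be sketched as follows; the scoring rule and toy documents are illustrative reconstructions, not the authors' implementation:

```python
# Score sentences by their words' corpus frequency, pick the top-k, then
# re-order the picks by timestamp so the summary reads coherently.
from collections import Counter
import re

def summarize(docs, k=2):
    """docs: list of (timestamp, sentence) pairs drawn from multiple documents."""
    words = lambda s: re.findall(r"\w+", s.lower())
    freq = Counter(w for _, s in docs for w in words(s))
    # Rank sentences by total frequency of their words (a crude relevance score).
    scored = sorted(docs, key=lambda ts_s: -sum(freq[w] for w in words(ts_s[1])))
    picked = scored[:k]
    picked.sort(key=lambda ts_s: ts_s[0])  # timestamp order -> coherent summary
    return [s for _, s in picked]

docs = [
    (2, "The storm caused major flooding in the city."),
    (1, "A powerful storm hit the coast on Monday."),
    (3, "Cleanup crews worked through the weekend."),
]
print(summarize(docs, k=2))
```

In the paper's setting a Naïve Bayes classifier supplies the sentence relevance score; the frequency sum above merely stands in for it.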
Comprehensive Software Eases Air Traffic Management
NASA Technical Reports Server (NTRS)
2007-01-01
To help air traffic control centers improve the safety and the efficiency of the National Airspace System, Ames Research Center developed the Future Air Traffic Management Concepts Evaluation Tool (FACET) software, which won NASA's 2006 "Software of the Year" competition. In 2005, Ames licensed FACET to Flight Explorer Inc., for integration into its Flight Explorer (version 6.0) software. The primary FACET features incorporated in the Flight Explorer software system alert airspace users to forecasted demand and capacity imbalances. Advance access to this information helps dispatchers anticipate congested sectors (airspace) and delays at airports, and decide if they need to reroute flights. FACET is now a fully integrated feature in the Flight Explorer Professional Edition (version 7.0). Flight Explorer Professional offers end-users other benefits, including ease of operation; automatic alerts to inform users of important events such as weather conditions and potential airport delays; and international, real-time flight coverage over Canada, the United Kingdom, New Zealand, and sections of the Atlantic and Pacific Oceans. Flight Explorer Inc. recently broadened coverage by partnering with Honeywell International Inc.'s Global Data Center, Blue Sky Network, Sky Connect LLC, SITA, ARINC Incorporated, Latitude Technologies Corporation, and Wingspeed Corporation, to track their aircraft anywhere in the world.
Prostate Brachytherapy Seed Reconstruction with Gaussian Blurring and Optimal Coverage Cost
Lee, Junghoon; Liu, Xiaofeng; Jain, Ameet K.; Song, Danny Y.; Burdette, E. Clif; Prince, Jerry L.; Fichtinger, Gabor
2009-01-01
Intraoperative dosimetry in prostate brachytherapy requires localization of the implanted radioactive seeds. A tomosynthesis-based seed reconstruction method is proposed. A three-dimensional volume is reconstructed from Gaussian-blurred projection images and candidate seed locations are computed from the reconstructed volume. A false positive seed removal process, formulated as an optimal coverage problem, iteratively removes “ghost” seeds that are created by tomosynthesis reconstruction. In an effort to minimize pose errors that are common in conventional C-arms, initial pose parameter estimates are iteratively corrected by using the detected candidate seeds as fiducials, which automatically “focuses” the collected images and improves successive reconstructed volumes. Simulation results imply that the implanted seed locations can be estimated with a detection rate of ≥ 97.9% and ≥ 99.3% from three and four images, respectively, when the C-arm is calibrated and the pose of the C-arm is known. The algorithm was also validated on phantom data sets successfully localizing the implanted seeds from four or five images. In a Phase-1 clinical trial, we were able to localize the implanted seeds from five intraoperative fluoroscopy images with 98.8% (STD=1.6) overall detection rate. PMID:19605321
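The false-positive removal step is framed as an optimal coverage problem: keep the smallest candidate-seed subset whose projections explain every detected 2-D seed shadow, so that "ghost" candidates whose shadows are already explained get discarded. A greedy sketch of that idea with toy candidate-to-shadow sets (the authors' exact formulation and solver may differ):

```python
# Greedy coverage: keep candidates until every 2-D seed shadow is explained.
def greedy_cover(candidates, shadows):
    """candidates: {seed_id: set of shadow ids its projections land on}."""
    kept, uncovered = [], set(shadows)
    while uncovered:
        best = max(candidates, key=lambda s: len(candidates[s] & uncovered))
        if not candidates[best] & uncovered:
            break  # remaining shadows unexplainable by any candidate
        kept.append(best)
        uncovered -= candidates[best]
    return kept

# Three real seeds plus one "ghost" whose shadows are already explained.
candidates = {
    "s1": {"imgA_1", "imgB_1"},
    "s2": {"imgA_2", "imgB_2"},
    "s3": {"imgA_3", "imgB_3"},
    "ghost": {"imgA_1", "imgB_2"},  # re-uses shadows of s1 and s2
}
shadows = {sh for cov in candidates.values() for sh in cov}
print(sorted(greedy_cover(candidates, shadows)))  # ['s1', 's2', 's3']
```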
Marčan, Marija; Pavliha, Denis; Kos, Bor; Forjanič, Tadeja; Miklavčič, Damijan
2015-01-01
Treatments based on electroporation are a new and promising approach to treating tumors, especially non-resectable ones. The success of the treatment is, however, heavily dependent on coverage of the entire tumor volume with a sufficiently high electric field. Ensuring complete coverage in the case of deep-seated tumors is not trivial and is best achieved through patient-specific treatment planning. The basis of the treatment planning process consists of two complex tasks: medical image segmentation, and numerical modeling and optimization. In addition to previously developed segmentation algorithms for several tissues (human liver, hepatic vessels, bone tissue and canine brain) and the algorithms for numerical modeling and optimization of treatment parameters, we developed a web-based tool to facilitate the translation of the algorithms and their application in the clinic. The developed web-based tool automatically builds a 3D model of the target tissue from the medical images uploaded by the user and then uses this 3D model to optimize treatment parameters. The tool enables the user to validate the results of the automatic segmentation and make corrections if necessary before delivering the final treatment plan. Evaluation of the tool was performed by five independent experts from four different institutions. During the evaluation, we gathered data concerning user experience and measured performance times for different components of the tool. Both user reports and performance times show a significant reduction in treatment-planning complexity and time consumption, from 1-2 days to a few hours. The presented web-based tool is intended to facilitate the treatment planning process and reduce the time needed for it. It is crucial for facilitating expansion of electroporation-based treatments in the clinic and ensuring reliable treatment for the patients.
The additional value of the tool is the possibility of easy upgrade and integration of modules with new functionalities as they are developed.
Automatic Road Gap Detection Using Fuzzy Inference System
NASA Astrophysics Data System (ADS)
Hashemi, S.; Valadan Zoej, M. J.; Mokhtarzadeh, M.
2011-09-01
Automatic feature extraction from aerial and satellite images is a high-level data processing task which is still one of the most important research topics of the field. In this area, most research is focused on the early step of road detection, where road tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are some of the mature methods in this field. Although most research is focused on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, aimed at refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of the following main steps: 1) Short gap coverage: in this step, a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: in this step, the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. For this purpose, a knowledge base consisting of expert rules is designed and fired on gap candidates from the road detection results. 3) Long gap coverage: in this stage, detected long gaps are compensated by two strategies, linear and polynomial fitting: shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: in order to evaluate the obtained results, some accuracy assessment criteria are proposed. These criteria are obtained by comparing the obtained results with correctly compensated ones produced by a human expert. The complete evaluation of the obtained results with their technical discussion is the material of the full paper.
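Step 1, hierarchical short-gap coverage, can be illustrated with binary closing at growing scales on a 1-D road profile; the pure-Python operators and toy mask below are illustrative stand-ins, not the authors' implementation:

```python
# Hierarchical short-gap coverage via binary closing (dilation then erosion)
# with growing structuring elements: closing at half-width k fills gaps of
# length <= 2k, so small gaps close first and long gaps survive for the
# fuzzy-inference stage.
def dilate(mask, k):
    n = len(mask)
    return [any(mask[max(0, i - k):min(n, i + k + 1)]) for i in range(n)]

def erode(mask, k):
    n = len(mask)
    return [all(mask[max(0, i - k):min(n, i + k + 1)]) for i in range(n)]

def close_gaps(mask, max_half_width):
    for k in range(1, max_half_width + 1):  # small gaps first, then larger
        mask = erode(dilate(mask, k), k)
    return mask

road = [1] * 5 + [0] * 2 + [1] * 6 + [0] * 8 + [1] * 4  # a 2-px and an 8-px gap
closed = close_gaps([bool(v) for v in road], max_half_width=2)
print(sum(closed))  # 17: the 2-px gap is filled, the 8-px gap is left open
```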
Celiac Node Failure Patterns After Definitive Chemoradiation for Esophageal Cancer in the Modern Era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amini, Arya; UC Irvine School of Medicine, Irvine, California; Xiao Lianchun
2012-06-01
Purpose: The celiac lymph node axis acts as a gateway for metastatic systemic spread. The need for prophylactic celiac nodal coverage in chemoradiation therapy for esophageal cancer is controversial. Given the improved ability to evaluate lymph node status before treatment via positron emission tomography (PET) and endoscopic ultrasound, we hypothesized that prophylactic celiac node irradiation may not be needed for patients with localized esophageal carcinoma. Methods and Materials: We reviewed the radiation treatment volumes for 131 patients who underwent definitive chemoradiation for esophageal cancer. Patients with celiac lymph node involvement at baseline were excluded. Median radiation dose was 50.4 Gy. The location of all celiac node failures was compared with the radiation treatment plan to determine whether the failures occurred within or outside the radiation treatment field. Results: At a median follow-up time of 52.6 months (95% CI 46.1-56.7 months), 6 of 60 patients (10%) without celiac node coverage had celiac nodal failure; in 5 of these patients, the failures represented the first site of recurrence. Of the 71 patients who had celiac coverage, only 5 patients (7%) had celiac region relapse. In multivariate analyses, having a pretreatment-to-post-treatment change in standardized uptake value on PET >52% (odds ratio [OR] 0.198, p = 0.0327) and having failure in the clinical target volume (OR 10.72, p = 0.001) were associated with risk of celiac region relapse. Of those without celiac coverage, the 6 patients that later developed celiac failure had a worse median overall survival time compared with the other 54 patients who did not fail (median overall survival time: 16.5 months vs. 31.5 months, p = 0.041). Acute and late toxicities were similar in both groups.
Conclusions: Although celiac lymph node failures occur in approximately 1 of 10 patients, the lack of effective salvage treatments and the subsequent low morbidity may justify prophylactic treatment in distal esophageal cancer patients.
Choudhry, Niteesh K.; Brennan, Troyen; Toscano, Michele; Spettell, Claire; Glynn, Robert J.; Rubino, Mark; Schneeweiss, Sebastian; Brookhart, Alan M.; Fernandes, Joaquim; Mathew, Susan; Christiansen, Blake; Antman, Elliott M.; Avorn, Jerry; Shrank, William H.
2009-01-01
Background Medication nonadherence is a major public health problem, especially for patients with coronary artery disease. The cost of prescription drugs is a central reason for nonadherence, even for patients with drug insurance. Removing patient out-of-pocket drug costs may increase adherence, improve clinical outcomes, and even reduce overall health costs for high-risk patients. The existing data are inadequate to assess whether this strategy is effective. Trial Design The Post-Myocardial Infarction Free Rx and Economic Evaluation (Post-MI FREEE) trial aims to evaluate the effect of providing full prescription drug coverage (ie, no copays, coinsurance, or deductibles) for statins, β-blockers, angiotensin-converting enzyme inhibitors, and angiotensin II receptor blockers to patients after being recently discharged from the hospital. Potentially eligible patients will be those individuals who receive their health and pharmacy benefits through Aetna, Inc. Patients enrolled in a Health Savings Account plan, who are ≥65 years of age, whose plan sponsor (ie, the employer, union, government, or association that sponsors the particular benefits package) has opted out of participating in the study, and who do not receive both medical services and pharmacy coverage through Aetna will be excluded. The plan sponsor of each eligible patient will be block randomized to either full drug coverage or current levels of pharmacy benefit, and all subsequently eligible patients of that same plan sponsor will be assigned to the same benefits group. The primary outcome of the trial is a composite clinical outcome of readmission for acute MI, unstable angina, stroke, congestive heart failure, revascularization, or inhospital cardiovascular death. Secondary outcomes include medication adherence and health care costs. All patients will be followed up for a minimum of 1 year. 
Conclusion The Post-MI FREEE trial will be the first randomized study to evaluate the impact of reducing cost-sharing for essential cardiac medications in high-risk patients on clinical and economic outcomes. PMID:18585494
Conan, Anne; Geerdes, Joy A C; Akerele, Oluyemisi A; Reininghaus, Bjorn; Simpson, Gregory J G; Knobel, Darryn
2017-09-22
Dogs (Canis familiaris) are often free-roaming in sub-Saharan African countries. Rabies virus circulates in many of these populations and presents a public health issue. Mass vaccination of dog populations is the recommended method to decrease the number of dog and human rabies cases. We describe and compare four populations of dogs and their vaccination coverage in four different villages (Hluvukani, Athol, Utah and Dixie) in Bushbuckridge Municipality, Mpumalanga province, South Africa. Cross-sectional surveys were conducted in the villages of Athol, Utah and Dixie, while data from a Health and Demographic Surveillance System were used to describe the dog population in Hluvukani village. All households of the villages were visited to obtain information on the number, sex, age and rabies vaccination status of dogs. From May to October 2013, 2969 households were visited in the four villages and 942 owned dogs were reported. The populations were all young and skewed towards males. No differences were observed in the sex and age distributions (puppies 0-3 months excluded) among the villages. Athol had a higher proportion of dog-owning households than Hluvukani and Utah. Vaccination coverages were all above the 20% - 40% threshold required for herd immunity to rabies (38% in Hluvukani, 51% in Athol, 65% in Dixie and 74% in Utah). For the preparation of vaccination campaigns, we recommend the use of the relatively stable dog:human ratio (between 1:12 and 1:16) to estimate the number of dogs per village in Bushbuckridge Municipality.
NASA Technical Reports Server (NTRS)
Hiser, H. W.; Senn, H. V.; Bukkapatnam, S. T.; Akyuzlu, K.
1977-01-01
The use of cloud images in the visual spectrum from the SMS/GOES geostationary satellites to determine the hourly distribution of sunshine on a mesoscale in the continental United States excluding Alaska is presented. Cloud coverage and density as a function of time of day and season are evaluated through the use of digital data processing techniques. Low density cirrus clouds are less detrimental to solar energy collection than other types; and clouds in the morning and evening are less detrimental than those during midday hours of maximum insolation. Seasonal geographic distributions of cloud cover/sunshine are converted to langleys of solar radiation received at the earth's surface through relationships developed from long term measurements at six widely distributed stations.
Robson, J; Falshaw, M
1995-01-01
BACKGROUND. Reliable comparison of the results of audit between general practices and over time requires standard definitions of numerators and denominators. This is particularly relevant in areas of high population turnover and practice list inflation. Without simple validation to remove supernumeraries, population coverage and professional activity may be underestimated. AIM. This audit study aimed to define a standard denominator, the 'active patient' denominator, to enable comparison of professional activity and population coverage for preventive activities between general practices and over time. It also aimed to document the extent to which computers were used for recording such activities. METHOD. A random sample of people in the age group 30-64 years was drawn from the computerized general practice registers of the 16 inner London general practices that participated in the 'healthy eastenders project'. A validation procedure excluded those patients who were likely to have died or moved away, or who for administrative reasons were unable to contribute to the numerator; this allowed the creation of the active patient denominator. An audit of preventive activities with numerators drawn from both paper and computerized medical records was carried out and results were presented so that practices could compare their results with those of their peers and over time. RESULTS. Of the original sample of 2331 people, 25% (practice range 13%-37%) were excluded as a result of the validation procedure. A denominator based on the complete, unexpurgated practice register rather than the validated active patient denominator would have reduced the proportion of people with blood pressure recorded within the preceding five years from 77% to 61%, recording of smoking status from 68% to 53% and recording of cervical smears from 80% to 66%. 
Only 53% of the last recordings, within the preceding five years, of blood pressure and only 54% of those of smoking status were recorded on the practice computer. In contrast, 82% of recorded cervical smears were recorded on computer. CONCLUSION. The active patient denominator produces a more accurate estimate of population coverage and professional activity, both of which are underestimated by the complete, unexpurgated practice register. A standard definition of the denominator also allows comparisons to be made between practices and over time. As only half of the recordings of some preventive activities were recorded on computer, it is doubtful whether it is advisable to rely on computers for audit where paper records are also maintained. PMID:7546868
NASA Astrophysics Data System (ADS)
De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco
2016-04-01
The microwave remote sensing scenario is rapidly evolving through the development of new sensor technology for Earth Observation (EO). In particular, Sentinel-1A (S1A) is the first of a constellation of sensors designed to provide a satellite data stream for the Copernicus European program. Sentinel-1A has been specifically designed to provide, over land, Differential Interferometric Synthetic Aperture Radar (DInSAR) products to analyze and investigate Earth's surface displacements. Key S1A characteristics include wide ground coverage (250 km swath), C-band operational frequency and short revisit time (which will drop from 12 to 6 days when the twin system Sentinel-1B is placed in orbit during 2016). Such characteristics, together with the global coverage acquisition policy, make the Sentinel-1 constellation extremely suitable for studying and monitoring volcanic and seismic areas worldwide, thus allowing the generation of ground displacement information with increasing rapidity as well as new geological understanding. The main acquisition mode over land is the so-called Interferometric Wide Swath (IWS) mode, which is based on the Terrain Observation by Progressive Scans (TOPS) technique and which guarantees the large coverage mentioned above at the expense of non-trivial interferometric processing. Moreover, the satellite spatial coverage and the reduced revisit time will lead to an exponential increase of the data archives which, after the launch of Sentinel-1B, will reach about 3 TB per day. Therefore, the EO scientific community needs, on the one hand, automated and effective DInSAR tools able to address the S1A processing complexity and, on the other hand, the computing and storage capacity to handle the expected large amount of data. It is thus becoming crucial to move processors and tools close to the satellite archives, since the approach of downloading and processing data with in-house computing facilities is no longer efficient.
To address these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at bringing together data, processing tools and results to make them accessible to the EO scientific community, with particular emphasis on the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of a well-known DInSAR algorithm, the Small BAseline Subset (P-SBAS), within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, for automatically processing S1A data to generate SBAS displacement time series. Its main characteristics, as well as a number of experimental results obtained with the implemented web tool, will also be shown. This work is partially supported by the RITMARE project of the Italian MIUR, the DPC-CNR agreement and the ESA GEP project.
NASA Astrophysics Data System (ADS)
de Azevedo, Samara C.; Singh, Ramesh P.; da Silva, Erivaldo A.
2017-04-01
The finer spatial resolution now available over urban areas means that tall objects cast intense shadows, which lead to erroneous information in urban mapping. Because of these shadows, automatically detecting objects (such as buildings, trees, structures and towers) and estimating surface coverage from high-spatial-resolution imagery is difficult. Automatic shadow detection is therefore a necessary first preprocessing step to improve the outcome of many remote sensing applications, particularly for high-spatial-resolution images. Efforts have been made to exploit spatial and spectral information to evaluate such shadows. In this paper, we use morphological attribute filtering to extract contextual relations in an efficient multilevel approach for high-resolution images. The attribute selected for the filtering was the area estimated from the shadow spectral feature using the Normalized Saturation-Value Difference Index (NSVDI) derived from pan-sharpened images. To assess the quality of the fusion products and their influence on the shadow detection algorithm, we evaluated three pan-sharpening methods - Intensity-Hue-Saturation (IHS), Principal Components (PC) and Gram-Schmidt (GS) - through the image quality measures Correlation Coefficient (CC), Root Mean Square Error (RMSE), Relative Dimensionless Global Error in Synthesis (ERGAS) and Universal Image Quality Index (UIQI). Experimental results over a WorldView-2 scene of São Paulo city (Brazil) show that the GS method provides good correlation with the original multispectral bands and no radiometric or contrast distortion. The automatic method using GS fusion for NSVDI generation provides a clear distinction between shadow and non-shadow pixels, with an overall accuracy of more than 90%. The experimental results confirm the effectiveness of the proposed approach, which could be used for subsequent shadow removal and is reliable for object recognition, land-cover mapping, 3D reconstruction, etc., especially in developing countries where land use and land cover are changing rapidly as tall objects rise within urban areas.
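As a rough illustration of the shadow index used above, the NSVDI can be computed per pixel from the HSV transform of the (pan-sharpened) RGB image; the zero threshold below is a hypothetical choice for illustration, since the paper derives the segmentation via morphological attribute filtering rather than a fixed cut-off:

```python
import colorsys

def nsvdi(rgb):
    """Normalized Saturation-Value Difference Index for one RGB pixel.
    rgb: floats in [0, 1]. NSVDI = (S - V) / (S + V); shadow pixels tend
    to have high saturation and low value, giving values near +1."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return (s - v) / (s + v) if (s + v) > 0 else 0.0

def shadow_mask(pixels, threshold=0.0):
    # threshold = 0.0 is a hypothetical cut-off for illustration only
    return [nsvdi(p) > threshold for p in pixels]
```

For example, a dark but saturated pixel is flagged as shadow while a bright one is not.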
NASA Astrophysics Data System (ADS)
Zhou, X.; Zhou, Z.; Apple, M. E.; Spangler, L.
2016-12-01
To extract methane from unminable coal seams in the Powder River Basin of Montana and Wyoming, coalbed methane (CBM) water has to be pumped and kept in retention ponds rather than discharged to the vadose zone, where it would mix with the ground water. The areal water coverage of these ponds changes due to evaporation and repeated refilling. The water quality also changes due to the growth of microalgae (unicellular or filamentous, including green algae and diatoms), evaporation, and refilling. Estimating water coverage changes and monitoring water quality are therefore important for managing CBM water retention ponds and for the timely handling of newly pumped CBM water. Conventional methods, such as the various water indices based on multi-spectral satellite data such as Landsat, do not work here because of the small pond size (~100 m x 100 m scale) relative to the low spatial resolution (~30 m scale) of the satellite data. In this study we present new methods to estimate water coverage and water quality changes using Google Earth images and images collected from an unmanned aircraft system (UAS) (Phantom 2 plus). Because these images have only visible bands (red, green, and blue), the conventional water index methods that involve near-infrared bands do not apply. We designed a new method based on the visible bands alone to automatically extract water pixels, using the intensity of each water pixel as a proxy for water quality, after a series of image processing steps such as georeferencing, resampling, and filtering. Differential GPS positions along the water edges were collected on the same day as the UAS images. The water area was calculated from the GPS positions and used to validate the method. Because of the very high resolution (~10-30 cm scale), the areal water coverage and water quality distribution can be accurately estimated. Since the UAS can be flown at any time, water area and quality information can be collected in a timely fashion.
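The abstract does not specify the visible-band rule used to separate water from land, so the sketch below is only one plausible heuristic (blue/green dominance plus a brightness cap), not the authors' actual algorithm; the threshold values are assumptions:

```python
def is_water(r, g, b):
    """Heuristic visible-band water test (illustrative only; the study's
    actual decision rule is not given in the abstract). Water in RGB
    imagery often shows blue >= green >= red and moderate brightness."""
    brightness = (r + g + b) / 3.0
    return b >= g >= r and brightness < 0.6

def water_coverage(pixels):
    # fraction of pixels classified as water: a proxy for areal coverage
    flags = [is_water(*p) for p in pixels]
    return sum(flags) / len(flags)
```

On real imagery this would run after the georeferencing, resampling and filtering steps the abstract describes.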
Machiels, Mélanie; Jin, Peng; van Gurp, Christianne H; van Hooft, Jeanin E; Alderliesten, Tanja; Hulshof, Maarten C C M
2018-03-21
To investigate the feasibility and geometric accuracy of carina-based registration for CBCT-guided setup verification in esophageal cancer IGRT, compared with the current practice of bony anatomy-based registration. Included were 24 esophageal cancer patients with 65 implanted fiducial markers, visible on planning CTs and follow-up CBCTs. All available CBCT scans (n = 236) were rigidly registered to the planning CT with respect to the bony anatomy and the carina. Target coverage was visually inspected and marker position variation was quantified relative to both registration approaches; the variation of systematic (Σ) and random errors (σ) was estimated. Automatic carina-based registration was feasible in 94.9% of the CBCT scans, with adequate target coverage in 91.1%, compared to 100% after bony anatomy-based registration. Overall, Σ (σ) in the LR/CC/AP direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm using the bony anatomy registration compared to 3.3(3.0)/3.6(2.6)/3.9(3.1) mm for the carina. Mid-thoracic markers showed a non-significant but smaller Σ in the CC and AP directions when using the carina-based registration. Compared with bony anatomy-based registration, carina-based registration for esophageal cancer IGRT results in inadequate target coverage in 8.9% of cases. Furthermore, larger Σ and σ, requiring larger anisotropic margins, were seen after carina-based registration. Only for tumors entirely confined to the mid-thoracic region might carina-based registration be slightly favorable.
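For readers unfamiliar with the Σ/σ notation above, a minimal sketch of how group systematic and random setup errors are commonly estimated from per-patient marker offsets (the standard decomposition; the paper's exact pipeline is not described in the abstract):

```python
import statistics

def setup_errors(per_patient_offsets):
    """Estimate the group systematic error Sigma (SD of per-patient mean
    offsets) and the random error sigma (root mean square of per-patient
    SDs) for one direction, e.g. CC. Input: one list of offsets (mm) per
    patient, each with at least two fractions."""
    means = [statistics.mean(p) for p in per_patient_offsets]
    sds = [statistics.stdev(p) for p in per_patient_offsets]
    Sigma = statistics.stdev(means)
    sigma = (sum(s * s for s in sds) / len(sds)) ** 0.5
    return Sigma, sigma
```

With these two numbers, margin recipes can then translate Σ and σ into the anisotropic margins mentioned in the conclusion.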
Ultrasonic ranging and data telemetry system
Brashear, Hugh R.; Blair, Michael S.; Phelps, James E.; Bauer, Martin L.; Nowlin, Charles H.
1990-01-01
An ultrasonic ranging and data telemetry system determines a surveyor's position and automatically links it with other simultaneously taken survey data. Ultrasonic and radio-frequency (rf) transmitters are carried by the surveyor in a backpack. The surveyor's position is determined by calculations that use the measured transmission times of an airborne ultrasonic pulse transmitted from the backpack to two or more prepositioned ultrasonic transceivers. Once a second, rf communications are used both to synchronize the ultrasonic pulse transmission-time measurements and to transmit other simultaneously taken survey data. The rf communications are interpreted by a portable receiver and microcomputer which are brought to the property site. A video display attached to the computer provides real-time visual monitoring of the survey progress and site coverage.
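The position calculation from measured pulse transmission times amounts to two-circle trilateration. A sketch under assumed conditions: a 2-D layout, a nominal speed of sound, and an arbitrary choice between the two mirror-image solutions (a real deployment resolves the ambiguity with a third transceiver or prior knowledge):

```python
import math

def locate(p1, p2, t1, t2, c=343.0):
    """Estimate a 2-D surveyor position from ultrasonic pulse travel
    times t1, t2 (seconds) to two transceivers at known points p1, p2;
    c is the assumed speed of sound in air (m/s)."""
    r1, r2 = c * t1, c * t2                   # ranges from travel times
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)                    # baseline length
    a = (r1**2 - r2**2 + d**2) / (2 * d)      # distance along baseline
    h = math.sqrt(max(r1**2 - a**2, 0.0))     # offset from baseline
    xm, ym = p1[0] + a * dx / d, p1[1] + a * dy / d
    # pick one of the two mirror solutions (sign choice is arbitrary here)
    return (xm - h * dy / d, ym + h * dx / d)
```

For instance, with transceivers at (0, 0) and (10, 0) and travel times corresponding to ranges of 5 m and sqrt(65) m, the recovered position is (3, 4).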
A rule-based software test data generator
NASA Technical Reports Server (NTRS)
Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II
1991-01-01
Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even this primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
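To illustrate why rule-derived inputs can beat random ones on coverage metrics, consider a toy procedure instrumented for branch coverage; the "rule" shown (for each equality predicate, generate inputs making it true and false) is illustrative only and is not taken from the paper's Ada prototype:

```python
def branches_hit(x, y):
    """Toy procedure instrumented to report which branch outcomes it takes."""
    hit = set()
    if x > 0:
        hit.add("x_pos")
    else:
        hit.add("x_nonpos")
    if x == y:
        hit.add("eq")       # rarely hit by random inputs over a wide domain
    else:
        hit.add("neq")
    return hit

def coverage(test_cases):
    """Fraction of the 4 branch outcomes exercised by a test suite."""
    covered = set()
    for x, y in test_cases:
        covered |= branches_hit(x, y)
    return len(covered) / 4.0

# A rule targeting each predicate yields full branch coverage with 3 cases,
# whereas random pairs over a large domain almost never satisfy x == y.
rule_based = [(1, 1), (1, 2), (-1, 0)]
```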
Modelling Aṣṭādhyāyī: An Approach Based on the Methodology of Ancillary Disciplines (Vedāṅga)
NASA Astrophysics Data System (ADS)
Mishra, Anand
This article proposes a general model based on the common methodological approach of the ancillary disciplines (Vedāṅga) associated with the Vedas, taking examples from Śikṣā, Chandas, Vyākaraṇa and Prātiśākhya texts. It develops and elaborates this model further to represent the contents and processes of Aṣṭādhyāyī. Certain key features are added to my earlier modelling of the Pāṇinian system of Sanskrit grammar. These include broader coverage of the Pāṇinian meta-language, a mechanism for automatic application of rules, and the positioning of the grammatical system within the procedural complexes of the ancillary disciplines.
Genotyping in the cloud with Crossbow.
Gurtowski, James; Schatz, Michael C; Langmead, Ben
2012-09-01
Crossbow is a scalable, portable, and automatic cloud computing tool for identifying SNPs from high-coverage, short-read resequencing data. It is built on Apache Hadoop, an implementation of the MapReduce software framework. Hadoop allows Crossbow to distribute read alignment and SNP calling subtasks over a cluster of commodity computers. Two robust tools, Bowtie and SOAPsnp, implement the fundamental alignment and variant calling operations, respectively; within Crossbow they have demonstrated the capability of analyzing approximately one billion short reads per hour on a commodity Hadoop cluster with 320 cores. Through protocol examples, this unit demonstrates the use of Crossbow for identifying variations in three different operating modes: on a Hadoop cluster, on a single computer, and on the Amazon Elastic MapReduce cloud computing service.
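Conceptually, Crossbow's division of labor follows the MapReduce pattern: the map step aligns reads (Bowtie in the real tool) and the reduce step calls variants per reference position (SOAPsnp in the real tool). The toy sketch below imitates only that structure, with a hypothetical exact-match aligner and a majority-vote base caller; none of it is Crossbow's actual code:

```python
from collections import defaultdict

def map_align(read, ref="ACGTACGT"):
    """Map step: align one read (toy exact-match aligner; Crossbow uses
    Bowtie) and emit (position, base) pairs for the pileup."""
    pos = ref.find(read["seq"])
    return [(pos + i, b) for i, b in enumerate(read["seq"])] if pos >= 0 else []

def reduce_call(pileups):
    """Reduce step: call a consensus base per position from the pileup
    (Crossbow delegates this to SOAPsnp; majority vote is a stand-in)."""
    return {pos: max(set(bases), key=bases.count)
            for pos, bases in pileups.items()}

def run(reads):
    pileups = defaultdict(list)
    for read in reads:                 # in Hadoop, this loop is distributed
        for pos, base in map_align(read):
            pileups[pos].append(base)
    return reduce_call(pileups)
```

Hadoop's contribution is that the map and reduce phases run in parallel over the cluster with the shuffle grouping pileup entries by position.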
Time coded distribution via broadcasting stations
NASA Technical Reports Server (NTRS)
Leschiutta, S.; Pettiti, V.; Detoma, E.
1979-01-01
The distribution of standard time signals via AM and FM broadcasting stations presents the distinct advantages to offer a wide area coverage and to allow the use of inexpensive receivers, but the signals are radiated a limited number of times per day, are not usually available during the night, and no full and automatic synchronization of a remote clock is possible. As an attempt to overcome some of these problems, a time coded signal with a complete date information is diffused by the IEN via the national broadcasting networks in Italy. These signals are radiated by some 120 AM and about 3000 FM and TV transmitters around the country. In such a way, a time ordered system with an accuracy of a couple of milliseconds is easily achieved.
Towards Semantic Web Services on Large, Multi-Dimensional Coverages
NASA Astrophysics Data System (ADS)
Baumann, P.
2009-04-01
Observed and simulated data in the Earth Sciences often come as coverages, the general term for space-time varying phenomena as set forth by standardization bodies like the Open GeoSpatial Consortium (OGC) and ISO. Among such data are 1-D time series, 2-D surface data, 3-D surface data time series as well as x/y/z geophysical and oceanographic data, and 4-D metocean simulation results. With increasing dimensionality the data sizes grow exponentially, up to Petabyte object sizes. Open standards for exploiting coverage archives over the Web are available to a varying extent. The OGC Web Coverage Service (WCS) standard defines basic extraction operations: spatio-temporal and band subsetting, scaling, reprojection, and data format encoding of the result - a simple interoperable interface for coverage access. More processing functionality is available with products like Matlab, Grid-type interfaces, and the OGC Web Processing Service (WPS). However, these often lack properties known as advantageous from databases: declarativeness (describe results rather than the algorithms), safety in evaluation (no request can keep a server busy infinitely), and optimizability (enable the server to rearrange the request so as to produce the same result faster). WPS defines a geo-enabled SOAP interface for remote procedure calls. This makes it possible to webify any program, but does not provide semantic interoperability: a function is identified only by its name and parameters, while the semantics is encoded in the (only human-readable) title and abstract. Hence, another desirable property is missing, namely an explicit semantics which allows for machine-machine communication and reasoning a la the Semantic Web. The OGC Web Coverage Processing Service (WCPS) language, which was adopted as an international standard by OGC in December 2008, defines a flexible interface for the navigation, extraction, and ad-hoc analysis of large, multi-dimensional raster coverages.
It is abstract in that it does not anticipate any particular protocol. One such protocol is given by the OGC Web Coverage Service (WCS) Processing Extension standard, which ties WCPS into WCS. Another protocol, which makes WCPS an OGC Web Processing Service (WPS) Profile, is under preparation. Thereby, WCPS bridges WCS and WPS. The conceptual model of WCPS relies on the coverage model of WCS, which in turn is based on ISO 19123. WCS currently addresses raster-type coverages, where a coverage is seen as a function mapping points from a spatio-temporal extent (its domain) into values of some cell type (its range). A retrievable coverage has an associated identifier, plus the supported CRSs and, for each range field (aka band, channel), the applicable interpolation methods. The WCPS language offers access to one or several such coverages via a functional, side-effect free language. The following example, which derives the NDVI (Normalized Difference Vegetation Index) from given coverages C1, C2, and C3 within the regions identified by the binary mask R, illustrates the language concept: for c in ( C1, C2, C3 ), r in ( R ) return encode( (char) (c.nir - c.red) / (c.nir + c.red), "HDF-EOS" ) The result is a list of three HDF-EOS encoded images containing masked NDVI values. Note that the same request can operate on coverages of any dimensionality. The expressive power of WCPS includes statistics, image, and signal processing up to recursion, so as to maintain safe evaluation. As both the syntax and the semantics of any WCPS expression are well defined, the language is Semantic Web ready: clients can construct WCPS requests on the fly, and servers can optimize such requests (this has been investigated extensively with the rasdaman raster database system) and automatically distribute them for processing in a WCPS-enabled computing cloud. The WCPS Reference Implementation is being finalized now that the standard is stable; it will be released in open source once ready.
Among the future tasks is to extend WCPS to general meshes, in synchronization with the WCS standard. In this talk WCPS is presented in the context of OGC standardization. The author is co-chair of OGC's WCS Working Group (WG) and Coverages WG.
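The NDVI expression in the WCPS example can be reproduced in a few lines of NumPy which, like WCPS, apply the same formula regardless of array dimensionality (a sketch of the formula only, not of WCPS evaluation or encoding):

```python
import numpy as np

def ndvi(nir, red):
    """Per-pixel NDVI as in the WCPS example: (nir - red) / (nir + red).
    Accepts arrays of any dimensionality; pixels with zero total
    reflectance are mapped to 0 to avoid division by zero."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(nir + red > 0, (nir - red) / (nir + red), 0.0)
```

A masked variant, as in the WCPS request, would simply multiply the result by the binary mask array.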
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borot de Battisti, M; Maenhout, M; Lagendijk, J J W
Purpose: To develop adaptive planning with feedback for MRI-guided focal HDR prostate brachytherapy with a single divergent needle robotic implant device. After each needle insertion, the dwell positions for that needle are calculated, and the positioning of the remaining needles and the dosimetry are both updated based on MR imaging. Methods: Errors in needle positioning may occur due to inaccurate needle insertion (caused by, e.g., bending of the needle) and unpredictable changes in patient anatomy. Consequently, the dose plan quality might decrease dramatically compared to the preplan. In this study, a procedure was developed to re-optimize, after each needle insertion, the remaining needle angulations, source positions and dwell times in order to obtain optimal coverage (D95% PTV > 19 Gy) without exceeding the constraints of the organs at risk (OAR) (D10% urethra < 21 Gy, D1cc bladder < 12 Gy and D1cc rectum < 12 Gy). Complete HDR procedures with 6 needle insertions were simulated for a patient MR-image set with the PTV, prostate, urethra, bladder and rectum delineated. Random angulation errors, modeled by a Gaussian distribution (standard deviation of 3 mm at the needle's tip), were generated for each needle insertion. We compared the final dose parameters for the situations (I) without re-optimization and (II) with the automatic feedback. Results: The computation time of replanning was below 100 seconds on a current desktop computer. For the patient tested, a clinically acceptable dose plan was achieved while applying the automatic feedback (median (range) in Gy, D95% PTV: 19.9 (19.3-20.3), D10% urethra: 13.4 (11.9-18.0), D1cc rectum: 11.0 (10.7-11.6), D1cc bladder: 4.9 (3.6-6.8)). This was not the case without re-optimization (median (range) in Gy, D95% PTV: 19.4 (14.9-21.3), D10% urethra: 12.6 (11.0-15.7), D1cc rectum: 10.9 (8.9-14.1), D1cc bladder: 4.8 (4.4-5.2)).
Conclusion: An automatic guidance strategy for HDR prostate brachytherapy was developed to compensate for errors in needle positioning and improve the dose distribution. Without re-optimization, target coverage and OAR constraints may not be achieved. M. Borot de Battisti is funded by Philips Medical Systems Nederland B.V.; M. Moerland is principal investigator on a contract funded by Philips Medical Systems Nederland B.V.; G. Hautvast and D. Binnekamp are full-time employees of Philips Medical Systems Nederland B.V.
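The D95%/D10%/D1cc figures above are dose-volume metrics. A minimal sketch of how such a metric is read off a flat list of voxel doses, and how the abstract's PTV and urethra constraints would be checked (the voxel-dose representation and function names are illustrative, not the paper's optimizer):

```python
def dose_at_volume(doses, volume_pct):
    """D_v%: the minimum dose (Gy) received by the hottest v% of a
    structure, from a flat list of equal-size voxel doses. E.g. D95%
    is the dose covering 95% of the PTV."""
    s = sorted(doses, reverse=True)
    n = max(1, int(round(len(s) * volume_pct / 100.0)))
    return s[n - 1]

def plan_acceptable(ptv_doses, urethra_doses):
    # two of the abstract's constraints: D95% PTV > 19 Gy, D10% urethra < 21 Gy
    return (dose_at_volume(ptv_doses, 95) > 19.0
            and dose_at_volume(urethra_doses, 10) < 21.0)
```

The re-optimization loop in the paper searches needle angulations and dwell times until checks like these pass.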
Lu, Peng-Jun; Byrd, Kathy K; Murphy, Trudy V
2013-05-01
Since 1996, hepatitis A vaccine (HepA) has been recommended for adults at increased risk for infection including travelers to high or intermediate hepatitis A endemic countries. In 2009, travel outside the United States and Canada was the most common exposure nationally reported for persons with hepatitis A virus (HAV) infection. To assess HepA vaccination coverage among adults 18-49 years traveling to a country of high or intermediate endemicity in the United States. We analyzed data from the 2010 National Health Interview Survey (NHIS), to determine self-reported HepA vaccination coverage (≥1 dose) and series completion (≥2 dose) among persons 18-49 years who traveled, since 1995, to a country of high or intermediate HAV endemicity. Multivariable logistic regression and predictive marginal analyses were conducted to identify factors independently associated with HepA vaccine receipt. In 2010, approximately 36.6% of adults 18-49 years reported traveling to high or intermediate hepatitis A endemic countries; among this group unadjusted HepA vaccination coverage was 26.6% compared to 12.7% among non-travelers (P-values<0.001) and series completion were 16.9% and 7.6%, respectively (P-values<0.001). On multivariable analysis among all respondents, travel status was an independent predictor of HepA coverage and series completion (both P-values<0.001). Among travelers, HepA coverage and series completion (≥2 doses) were higher for travelers 18-25 years (prevalence ratios 2.3, 2.8, respectively, P-values<0.001) and for travelers 26-39 years (prevalence ratios 1.5, 1.5, respectively, P-value<0.001, P-value=0.002, respectively) compared to travelers 40-49 years. 
Other characteristics independently associated with a higher likelihood of HepA receipt among travelers included Asian race/ethnicity, male sex, never having been married, having a high school or higher education, living in the western United States, and having a greater number of physician contacts or receipt of influenza vaccination in the previous year. HepB vaccination was excluded from the model because the significant correlation between receipt of HepA vaccination and HepB vaccination could distort the model. Although travel to a country of high or intermediate hepatitis A endemicity was associated with a higher likelihood of HepA vaccination in 2010 among adults 18-49 years, self-reported HepA vaccination coverage was low among adult travelers to these areas. Healthcare providers should ask about their patients' upcoming travel plans and recommend and offer travel-related vaccinations to their patients. Published by Elsevier Ltd.
Lu, Peng-jun; Byrd, Kathy K.; Murphy, Trudy V.
2018-01-01
Background Since 1996, hepatitis A vaccine (HepA) has been recommended for adults at increased risk for infection including travelers to high or intermediate hepatitis A endemic countries. In 2009, travel outside the United States and Canada was the most common exposure nationally reported for persons with hepatitis A virus (HAV) infection. Objective To assess HepA vaccination coverage among adults 18–49 years traveling to a country of high or intermediate endemicity in the United States. Methods We analyzed data from the 2010 National Health Interview Survey (NHIS), to determine self-reported HepA vaccination coverage (≥1 dose) and series completion (≥2 dose) among persons 18–49 years who traveled, since 1995, to a country of high or intermediate HAV endemicity. Multivariable logistic regression and predictive marginal analyses were conducted to identify factors independently associated with HepA vaccine receipt. Results In 2010, approximately 36.6% of adults 18–49 years reported traveling to high or intermediate hepatitis A endemic countries; among this group unadjusted HepA vaccination coverage was 26.6% compared to 12.7% among non-travelers (P-values < 0.001) and series completion were 16.9% and 7.6%, respectively (P-values < 0.001). On multivariable analysis among all respondents, travel status was an independent predictor of HepA coverage and series completion (both P-values < 0.001). Among travelers, HepA coverage and series completion (≥2 doses) were higher for travelers 18–25 years (prevalence ratios 2.3, 2.8, respectively, P-values < 0.001) and for travelers 26–39 years (prevalence ratios 1.5, 1.5, respectively, P-value < 0.001, P-value = 0.002, respectively) compared to travelers 40–49 years. 
Other characteristics independently associated with a higher likelihood of HepA receipt among travelers included Asian race/ethnicity, male sex, never having been married, having a high school or higher education, living in the western United States, and having a greater number of physician contacts or receipt of influenza vaccination in the previous year. HepB vaccination was excluded from the model because the significant correlation between receipt of HepA vaccination and HepB vaccination could distort the model. Conclusions Although travel to a country of high or intermediate hepatitis A endemicity was associated with a higher likelihood of HepA vaccination in 2010 among adults 18–49 years, self-reported HepA vaccination coverage was low among adult travelers to these areas. Healthcare providers should ask about their patients' upcoming travel plans and recommend and offer travel-related vaccinations to their patients. PMID:23523408
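As a reminder of what an unadjusted prevalence ratio is (the ratios reported above are adjusted via predictive margins from the logistic model), a minimal calculation with hypothetical counts:

```python
def prevalence_ratio(group_pos, group_n, ref_pos, ref_n):
    """Unadjusted prevalence ratio: prevalence of vaccination in a group
    (e.g. travelers 18-25 y) divided by prevalence in the reference group
    (e.g. travelers 40-49 y). All counts below are hypothetical."""
    return (group_pos / group_n) / (ref_pos / ref_n)
```

For example, 46 vaccinated of 100 in one group against 20 of 100 in the reference group gives a ratio of 2.3, matching the scale of the ratios reported in the abstract.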
Aguzzi, Jacopo; Costa, Corrado; Robert, Katleen; Matabos, Marjolaine; Antonucci, Francesca; Juniper, S. Kim; Menesatti, Paolo
2011-01-01
The development and deployment of sensors for undersea cabled observatories is presently biased toward the measurement of habitat variables, while sensor technologies for biological community characterization through species identification and individual counting are less common. The VENUS cabled multisensory network (Vancouver Island, Canada) deploys seafloor camera systems at several sites. Our objective in this study was to implement new automated image analysis protocols for the recognition and counting of benthic decapods (i.e., the galatheid squat lobster, Munida quadrispina), as well as for the evaluation of changes in bacterial mat coverage (i.e., Beggiatoa spp.), using a camera deployed in Saanich Inlet (103 m depth). For the counting of Munida we remotely acquired 100 digital photos at hourly intervals from 2 to 6 December 2009. For the bacterial mat coverage estimation, images were taken from 2 to 8 December 2009 at the same frequency. The automated image analysis protocols for both study cases were created in MatLab 7.1. Automation for Munida counting combined filtering and background correction (Median and Top-Hat Filters) with Euclidean Distances (ED) on Red-Green-Blue (RGB) channels. The Scale-Invariant Feature Transform (SIFT) features and Fourier Descriptors (FD) of tracked objects were then extracted. Animal classification was carried out with the tools of multivariate morphometric statistics (i.e., Partial Least Squares Discriminant Analysis; PLSDA) on the mean RGB value (RGBv) for each object and Fourier Descriptor (RGBv+FD) matrices, plus SIFT and ED. The SIFT approach returned the best results: higher percentages of images were correctly classified, and fewer misclassification errors (an animal is present but not detected) occurred. In contrast, RGBv+FD and ED resulted in a high incidence of records being generated for animals that were not present.
Bacterial mat coverage was estimated in terms of Percent Coverage and Fractal Dimension. A constant Region of Interest (ROI) was defined and background extraction by a Gaussian Blurring Filter was performed. Image subtraction within ROI was followed by the sum of the RGB channels matrices. Percent Coverage was calculated on the resulting image. Fractal Dimension was estimated using the box-counting method. The images were then resized to a dimension in pixels equal to a power of 2, allowing subdivision into sub-multiple quadrants. In comparisons of manual and automated Percent Coverage and Fractal Dimension estimates, the former showed an overestimation tendency for both parameters. The primary limitations on the automatic analysis of benthic images were habitat variations in sediment texture and water column turbidity. The application of filters for background corrections is a required preliminary step for the efficient recognition of animals and bacterial mat patches. PMID:22346657
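The box-counting estimate of fractal dimension described above can be sketched as follows, assuming a binary mask whose side length is a power of 2 (as produced by the resizing step): count the boxes of side s that contain any mat pixel, then take the log-log slope of N(s) against 1/s.

```python
import math

def box_count_dimension(mask, sizes=(1, 2, 4, 8)):
    """Box-counting fractal dimension of a square binary 2-D mask
    (list of lists of 0/1, side length a power of 2). Returns the
    least-squares slope of log N(s) versus log(1/s)."""
    n = len(mask)
    xs, ys = [], []
    for s in sizes:
        count = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                # box is occupied if it contains any mat pixel
                if any(mask[a][b] for a in range(i, i + s)
                                  for b in range(j, j + s)):
                    count += 1
        xs.append(math.log(1.0 / s))
        ys.append(math.log(count))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

A fully covered mask recovers dimension 2, the expected value for a filled plane region; sparser, more convoluted mat boundaries give intermediate values.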
De Angelis, Chiara; Sardanelli, Francesco; Perego, Matteo; Alì, Marco; Casilli, Francesco; Inglese, Luigi; Mauri, Giovanni
2017-11-01
To assess the feasibility, efficacy and safety of carbon dioxide (CO2) digital subtraction angiography (DSA) to guide endovascular aneurysm repair (EVAR) in a cohort of patients with chronic kidney disease (CKD). After Ethical Committee approval, the records of 13 patients (all male, mean age 74.6 ± 8.0 years) with CKD, who underwent EVAR to exclude an abdominal aortic aneurysm (AAA) under CO2 angiography guidance, were reviewed. The AAA to be excluded had a mean diameter of 52.0 ± 8.0 mm. CO2 angiography was performed by automatic (n = 7) or hand (n = 6) injection. The endograft was correctly placed and the AAA was excluded in all cases, without any surgical conversion. Two patients (15.4%) had an endoleak: one type Ia, detected by CO2-DSA and effectively treated with prosthesis dilatation; one type III, detected by CO2-DSA, confirmed using 10 ml of ICM, and conservatively managed. In one patient, the CO2 angiograms were considered of too low quality to guide the procedure, and 200 ml of ICM were administered. Overall, 11 patients (84.6%) underwent a successful EVAR under the guidance of CO2 angiography alone. No patient suffered major complications, including those typically related to CO2. Two patients suffered abdominal pain during the procedure secondary to a transient reduction in splanchnic perfusion due to CO2, and one patient had a worsening of renal function, probably caused by cholesterol embolization during the procedure. In patients with CKD, EVAR under CO2 angiography guidance is feasible, effective, and safe.
Graham, Brian W; Tao, Yeqing; Dodge, Katie L; Thaxton, Carly T; Olaso, Danae; Young, Nicolas L; Marshall, Alan G; Trakselis, Michael A
2016-06-10
The archaeal minichromosome maintenance (MCM) helicase from Sulfolobus solfataricus (SsoMCM) is a model for understanding structural and mechanistic aspects of DNA unwinding. Although interactions of the encircled DNA strand within the central channel provide an accepted mode for translocation, interactions with the excluded strand on the exterior surface have mostly been ignored with regard to DNA unwinding. We have previously proposed an extension of the traditional steric exclusion model of unwinding to also include significant contributions from the excluded strand during unwinding, termed steric exclusion and wrapping (SEW). The SEW model hypothesizes that the displaced single strand tracks along paths on the exterior surface of hexameric helicases to protect single-stranded DNA (ssDNA) and stabilize the complex in a forward unwinding mode. Using hydrogen/deuterium exchange monitored by Fourier transform ion cyclotron resonance MS, we probed the binding sites for ssDNA using multiple substrates targeting both the encircled and excluded strand interactions. In each experiment, we obtained >98.7% sequence coverage of SsoMCM from >650 peptides (5-30 residues in length) and were able to identify interacting residues on both the interior and exterior of SsoMCM. Based on the identified contacts, positively charged residues within the external waist region were mutated and shown generally to lower DNA unwinding without negatively affecting ATP hydrolysis. The combined data globally identify binding sites for ssDNA during SsoMCM unwinding and validate the importance of the SEW model for hexameric helicase unwinding. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Methicillin-Resistant Staphylococcus aureus in Foot Osteomyelitis.
Ashong, Chester N; Raheem, Shazia A; Hunter, Andrew S; Mindru, Cezarina; Barshes, Neal R
Conflicting studies exist regarding the impact of methicillin-resistant Staphylococcus aureus (MRSA) on increased time to wound healing, future need for surgical procedures, and likelihood of treatment failure in patients with diabetic foot osteomyelitis. The purpose of this study is to determine the overall significance of MRSA in predicting treatment failure in bone infections of the foot and to determine an appropriate pre-operative and empiric post-operative antibiotic regimen. Patients presenting with an initial episode of "probable" or "definite" foot osteomyelitis were included for review and analysis if the following criteria were met: (1) Osteomyelitis occurred in the foot (i.e., distal to the malleoli of the ankle); episodes occurring above the ankle were excluded. (2) Patients received either no antibiotics or only oral antibiotics for long-term treatment; episodes managed with long-term parenteral antibiotics were excluded. (3) The infection was managed initially with medical therapy or conservative surgical therapy; episodes managed with major (above-ankle) amputation as the initial treatment were excluded. The primary objective of this study was to assess whether episodes of foot osteomyelitis associated with MRSA resulted in treatment failure more frequently. Of 178 episodes included in the study, 50 (28.1%) had treatment failure. Median time to treatment failure was 60 days (range 7-598 days). MRSA was present in 28.1% (9/32) of episodes in which treatment failure occurred and in 39.0% (41/105) of episodes in which no treatment failure occurred. The presence of MRSA was not significantly associated with treatment failure (p = 0.99). The presence of MRSA in bone culture, and whether antibiotic use had anti-MRSA activity, was not associated with increased treatment failure of diabetic foot osteomyelitis in our institution. Empiric antibiotic coverage of MRSA may not be necessary for many patients presenting with foot osteomyelitis.
Automated Intelligibility Assessment of Pathological Speech Using Phonological Features
NASA Astrophysics Data System (ADS)
Middag, Catherine; Martens, Jean-Pierre; Van Nuffelen, Gwen; De Bodt, Marc
2009-12-01
It is commonly acknowledged that word or phoneme intelligibility is an important criterion in the assessment of the communication efficiency of a pathological speaker. Considerable effort has therefore been put into the design of perceptual intelligibility rating tests. These tests usually have the drawback that they employ unnatural speech material (e.g., nonsense words) and that they cannot fully exclude errors due to listener bias. Therefore, there is a growing interest in the application of objective automatic speech recognition technology to automate the intelligibility assessment. Current research is headed towards the design of automated methods which can be shown to produce ratings that correspond well with those emerging from a well-designed and well-performed perceptual test. In this paper, a novel methodology that builds on previous work (Middag et al., 2008) is presented. It utilizes phonological features, automatic speech alignment based on acoustic models that were trained on normal speech, context-dependent speaker feature extraction, and intelligibility prediction based on a small model that can be trained on pathological speech samples. The experimental evaluation of the new system reveals that the root mean squared error of the discrepancies between perceived and computed intelligibilities can be as low as 8 on a scale of 0 to 100.
Electro-oculography-based detection of sleep-wake in sleep apnea patients.
Virkkala, Jussi; Toppila, Jussi; Maasilta, Paula; Bachour, Adel
2015-09-01
Recently, we have developed a simple method that uses two electro-oculography (EOG) electrodes for the automatic scoring of sleep-wake in normal subjects. In this study, we investigated the usefulness of this method in 284 consecutive patients referred for suspicion of sleep apnea who underwent polysomnography (PSG). We applied the AASM 2007 scoring rules. A simple automatic sleep-wake classification algorithm based on 18-45 Hz beta power was applied to the calculated bipolar EOG channel and compared to standard polysomnography. Epoch-by-epoch agreement was evaluated. Eighteen patients were excluded due to poor EOG quality. One hundred fifty-eight males and 108 females were studied; their mean age was 48 (range 17-89) years, apnea-hypopnea index 13 (range 0-96)/h, BMI 29 (range 17-52) kg/m², and sleep efficiency 78 (range 0-98)%. The mean agreement in sleep-wake states between EOG and PSG was 85%, with a Cohen's kappa of 0.56. Overall epoch-by-epoch agreement was 85%, with a Cohen's kappa of 0.57, a positive predictive value of 91%, and a negative predictive value of 65%. The EOG method can be applied to patients referred for suspicion of sleep apnea to indicate the sleep-wake state.
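The epoch classifier described above can be sketched as a band-power threshold rule. The sampling rate, threshold, and synthetic signals below are illustrative assumptions, not the study's calibrated values:

```python
import numpy as np

def beta_band_power(epoch, fs, band=(18.0, 45.0)):
    """Power of a 1-D bipolar EOG epoch in the given frequency band, via FFT."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / len(epoch)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum()

def score_epoch(epoch, fs, threshold):
    """Label a 30-s epoch 'wake' when its beta power exceeds the threshold."""
    return "wake" if beta_band_power(epoch, fs) > threshold else "sleep"

# Synthetic demo: a slow "sleep" epoch vs. the same epoch with added beta activity.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
sleep_like = np.sin(2 * np.pi * 1.0 * t)                   # 1 Hz slow activity
wake_like = sleep_like + 0.5 * np.sin(2 * np.pi * 25 * t)  # plus 25 Hz beta
threshold = beta_band_power(sleep_like, fs) + 1.0          # hypothetical cut-off
```

The real algorithm operates on measured EOG with a calibrated threshold; this only shows the shape of the computation.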
Schenck, C H; Mahowald, M W
1995-11-01
A case of childhood-onset somnambulism is reported in which a 43-year-old man presented with repeated sleep-related injuries incurred during violent nocturnal activity, which included frenzied running, throwing punches and wielding knives. He had also driven an automobile a long distance during a presumed somnambulistic state. His wife had been repeatedly injured, and she felt that her life was threatened by his nocturnal violence 2-3 times yearly. Polysomnography (PSG) documented multiple episodes of complex and violent behaviors arising exclusively from stage 3/4 sleep, thus confirming the diagnosis of somnambulism. Other causes of sleep-related violence were excluded. The patient responded promptly to treatment with bedtime clonazepam, and benefit was maintained at 5-year follow-up. Although this strictly clinical case did not have any legal repercussions, it does carry forensic implications, particularly when placed in the context of the published medical literature on PSG-documented parasomnias (somnambulism, rapid eye movement sleep behavior disorder) containing explicit examples of recurrent violence, at times life-threatening, directed toward the bed partner and others. Thus, a new medical-legal concept is proposed, consisting of "parasomnia with continuing danger" as a noninsane automatism. Treatment guidelines, within the context of forensic medicine, are presented.
Targonski, Paul V; Poland, Gregory A
2004-01-01
Although influenza vaccine delivery strategies have improved coverage rates to unprecedented levels nationally among persons aged 65 years and older, influenza remains one of the greatest vaccine-preventable threats to public health among the elderly in the US. A new, intranasal live attenuated influenza vaccine (LAIV) was recently approved by the US FDA for use in persons aged 5-49 years, which excludes the elderly population. Limitations of the immune response to inactivated influenza vaccine (IAIV) and the effectiveness of current influenza vaccination strategies among the elderly suggest that a combined approach using LAIV and/or the IAIV in various permutations might benefit this group. We explore characteristics of the LAIV, data regarding its utility in protecting against influenza in the elderly, and challenges and opportunities regarding potential combined inactivated/live attenuated vaccination strategies for the elderly. Although LAIV appears to hold promise either alone or in combination with IAIV, large, well-conducted randomised trials are necessary to further define the role of LAIV in preventing influenza morbidity and mortality among the elderly. We also suggest that innovative vaccine coverage strategies designed to optimise prevention and control of influenza and minimise viral transmission in the community must accompany, in parallel, the acquisition of clinical trials data to best combat morbidity and mortality from influenza.
Iranian Household Financial Protection against Catastrophic Health Care Expenditures
Moghadam, M Nekoei; Banshi, M; Javar, M Akbari; Amiresmaili, M; Ganjavi, S
2012-01-01
Background: Protecting households against financial risk is one of the objectives of any health system. In this regard, Article 90 of Iran’s fourth five-year development plan called for reducing households’ exposure to catastrophic health expenditure to one percent. Hence, this study aimed to measure the percentage of Iranian households exposed to catastrophic health expenditures and to explore its determinants. Methods: The present descriptive-analytical study was carried out retrospectively. Households whose financial contributions to the health system exceeded 40% of disposable income were considered exposed to catastrophic healthcare expenditures. Influential factors were examined by logistic regression and the chi-square test. Results: Of 39,088 households, 80 were excluded because food expenditure data were absent. Overall, 2.8% of households were exposed to catastrophic health expenditures. Influential factors were the use of ambulatory, hospital, and drug addiction cessation services as well as pharmaceutical consumption. Socioeconomic characteristics such as health insurance coverage, household size, and economic status were other determinants of exposure. Conclusion: The Iranian health system has not achieved the objective of reducing catastrophic healthcare expenditure to one percent. Inefficient health insurance coverage, different fee schedules practiced by private and public providers, and failure of the referral system are probable barriers to decreasing households’ exposure to catastrophic healthcare expenditures. PMID:23193508
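The 40%-of-disposable-income criterion translates directly into code; a minimal sketch over hypothetical household records (the field names and figures are invented for illustration):

```python
def catastrophic_share(households, threshold=0.40):
    """Fraction of households whose health payments exceed `threshold`
    of their capacity to pay (disposable, non-food income)."""
    flags = [h["health_spend"] > threshold * h["capacity_to_pay"]
             for h in households]
    return sum(flags) / len(flags)

# Hypothetical sample: only the second household crosses the 40% line.
sample = [
    {"health_spend": 50,  "capacity_to_pay": 1000},
    {"health_spend": 450, "capacity_to_pay": 1000},  # 45% of capacity
    {"health_spend": 100, "capacity_to_pay": 800},
    {"health_spend": 0,   "capacity_to_pay": 600},
]
```

The study applies this flag per household and then models its determinants with logistic regression.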
Epigraph: A Vaccine Design Tool Applied to an HIV Therapeutic Vaccine and a Pan-Filovirus Vaccine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theiler, James; Yoon, Hyejin; Yusim, Karina
Epigraph is an efficient graph-based algorithm for designing vaccine antigens to optimize potential T-cell epitope (PTE) coverage. Functionally, epigraph vaccine antigens are similar to Mosaic vaccines, which have demonstrated effectiveness in preliminary HIV non-human primate studies. In contrast to the Mosaic algorithm, Epigraph is substantially faster, and in restricted cases, provides a mathematically optimal solution. Furthermore, Epigraph has new features that enable enhanced vaccine design flexibility. These features include the ability to exclude rare epitopes from a design, to optimize population coverage based on inexact epitope matches, and to apply the code to both aligned and unaligned input sequences. Epigraph was developed to provide practical design solutions for two outstanding vaccine problems. The first of these is a personalized approach to a therapeutic T-cell HIV vaccine that would provide antigens with an excellent match to an individual’s infecting strain, intended to contain or clear a chronic infection. The second is a pan-filovirus vaccine, with the potential to protect against all known viruses in the Filoviridae family, including ebolaviruses. A web-based interface to run the Epigraph tool suite is available (http://www.hiv.lanl.gov/content/sequence/EPIGRAPH/epigraph.html).
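The coverage objective described above can be made concrete with a toy scorer: the fraction of all k-mers (potential T-cell epitopes) found in a population of sequences that also occur in a candidate antigen. This is a simplified sketch of the metric, not the Epigraph implementation; the mock sequences and the k=3 setting are illustrative (real PTEs are typically 9-mers):

```python
def pte_coverage(antigen, population, k=9):
    """Fraction of k-mers (potential T-cell epitopes) across a population
    of sequences that also occur in the candidate antigen sequence."""
    antigen_kmers = {antigen[i:i + k] for i in range(len(antigen) - k + 1)}
    hits = total = 0
    for seq in population:
        for i in range(len(seq) - k + 1):
            total += 1
            hits += seq[i:i + k] in antigen_kmers
    return hits / total

# Toy example with k=3: the antigen matches all 3-mers of the first
# sequence and one of the three 3-mers of the second.
score = pte_coverage("ABCDE", ["ABCDE", "ABCXE"], k=3)
```

Epigraph's contribution is the fast graph-based search for the antigen (or antigen cocktail) maximizing this kind of score over large sequence populations.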
Sequence Analysis of the Genome of Carnation (Dianthus caryophyllus L.)
Yagi, Masafumi; Kosugi, Shunichi; Hirakawa, Hideki; Ohmiya, Akemi; Tanase, Koji; Harada, Taro; Kishimoto, Kyutaro; Nakayama, Masayoshi; Ichimura, Kazuo; Onozaki, Takashi; Yamaguchi, Hiroyasu; Sasaki, Nobuhiro; Miyahara, Taira; Nishizaki, Yuzo; Ozeki, Yoshihiro; Nakamura, Noriko; Suzuki, Takamasa; Tanaka, Yoshikazu; Sato, Shusei; Shirasawa, Kenta; Isobe, Sachiko; Miyamura, Yoshinori; Watanabe, Akiko; Nakayama, Shinobu; Kishida, Yoshie; Kohara, Mitsuyo; Tabata, Satoshi
2014-01-01
The whole-genome sequence of carnation (Dianthus caryophyllus L.) cv. ‘Francesco’ was determined using a combination of different new-generation multiplex sequencing platforms. The total length of the non-redundant sequences was 568 887 315 bp, consisting of 45 088 scaffolds, which covered 91% of the 622 Mb carnation genome estimated by k-mer analysis. The N50 values of contigs and scaffolds were 16 644 bp and 60 737 bp, respectively, and the longest scaffold was 1 287 144 bp. The average GC content of the contig sequences was 36%. A total of 1050, 13, 92 and 143 genes for tRNAs, rRNAs, snoRNAs and miRNAs, respectively, were identified in the assembled genomic sequences. For protein-encoding genes, 43 266 complete and partial gene structures excluding those in transposable elements were deduced. Gene coverage was ∼98%, as deduced from the coverage of the core eukaryotic genes. Intensive characterization of the assigned carnation genes and comparison with those of other plant species revealed characteristic features of the carnation genome. The results of this study will serve as a valuable resource for fundamental and applied research of carnation, especially for breeding new carnation varieties. Further information on the genomic sequences is available at http://carnation.kazusa.or.jp. PMID:24344172
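The N50 values quoted above follow the standard definition: the length L such that pieces of length at least L together cover half the assembly. A minimal sketch with toy lengths (not the carnation data):

```python
def n50(lengths):
    """N50: the largest length L such that contigs/scaffolds of length >= L
    together cover at least half of the total assembled length."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if 2 * running >= total:
            return length

# Toy assembly totalling 100 bp: 40 + 25 = 65 passes the halfway mark,
# so the N50 is 25.
toy_n50 = n50([40, 25, 20, 10, 5])
```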
Schumann, Julia; Kröhnert, Jutta; Frei, Elias; ...
2017-08-28
Carbon monoxide was applied as a probe molecule to compare the surfaces of a ZnO-containing (Cu/ZnO:Al) and a ZnO-free (Cu/MgO) methanol synthesis catalyst (copper content 70 atomic %) after reduction in hydrogen at 523 K by DRIFT spectroscopy. Nano-structured, mainly metallic copper was detected on the surface of the Cu/MgO catalyst. In contrast, the high energy of the main peak in the spectrum of CO adsorbed on reduced Cu/ZnO:Al (2125 cm⁻¹) proves that metallic copper is largely absent on the surface of this catalyst. The band is assigned to Znδ+–CO. The presence of not completely reduced Cuδ+–CO species cannot be excluded. The results are interpreted in terms of a partial coverage of the copper nano-particles in the Cu/ZnO:Al catalyst by a thin layer of metastable, defective zinc oxide. Minor contributions in the spectrum at 2090 and 2112 cm⁻¹, due to nano-structured Cu⁰–CO and CO adsorbed on highly defective Cu⁰, respectively, indicate that the coverage of metallic copper is not complete.
González Block, Miguel Angel; Vargas Bustamante, Arturo; de la Sierra, Luz Angélica; Martínez Cardoso, Aresha
2014-04-01
The 12.4 million Mexican migrants in the United States (US) face considerable barriers to accessing health care, with 45% of them uninsured. The Affordable Care Act (ACA) does not address lack of insurance for some immigrants, and the excluded groups make up a large proportion of the Mexican-American community. To redress this, innovative forms of health insurance coverage have to be explored. This study analyses factors associated with willingness to pay for cross-border, bi-national health insurance (BHI) among Mexican immigrants in the US. Surveys were administered to 1,335 Mexican migrants at the Mexican Consulate of Los Angeles to assess their health status, healthcare utilization, and willingness to purchase BHI. Logistic regression was used to identify predictors of willingness to pay for BHI. Having a job, not having health insurance in the US, and having relatives in Mexico who attend public health services were significant predictors of willingness to pay for BHI. In addition, individuals identified quality as the most important factor when considering BHI. In spite of interest in BHI among 54% of the sampled population, our study concludes that this type of coverage is unlikely to solve the access-to-care challenges that ACA eligibility rules create for different Mexican immigrant populations.
Vaccine-preventable diseases, vaccines and Guillain-Barré syndrome.
Principi, Nicola; Esposito, Susanna
2018-06-04
Guillain-Barré syndrome (GBS) is an acute, immune-mediated polyradiculoneuropathy. Infections and vaccines have been hypothesized to play a role in triggering GBS development. These beliefs can play a role in reducing vaccination coverage. In this report, data concerning this hypothesis are discussed. It is shown that an association between vaccine administration and GBS has never been proven for most of the debated vaccines, although it cannot be definitively excluded. The only exception is the influenza vaccine, at least for the preparation used in 1976. For some vaccines, such as measles/mumps/rubella, human papillomavirus, tetravalent conjugated meningococcal vaccine, and influenza, the debate between supporters and opponents of vaccination remains robust, and perception of vaccines' low safety remains a barrier to achieving adequate vaccination coverage. Less than 1 case of GBS per million immunized persons might occur for these vaccines. However, in some cases immunization actually reduces the risk of GBS development. In addition, the benefits of vaccination are clearly demonstrated by the eradication or enormous decline in the incidence of many vaccine-preventable diseases. These data highlight that the hypothesized risks of adverse events, such as GBS, cannot be considered a valid reason to avoid the administration of currently recommended vaccines. Copyright © 2018 Elsevier Ltd. All rights reserved.
[Stigma of "madness" from fate to recovery].
Bonsack, C; Morandi, S; Favrod, J; Conus, P
2013-03-13
Stigma is a "natural" social reaction, partly unconscious and automatic towards "different" and "vulnerable" populations. Suspicion of danger, unemployment, excluded from society, locked in hospital, assaulted or killed are the possible consequences of mental disorders' stigma. Despite advances in psychiatric treatments, the stigma of the "madness" remains a barrier to access to recovery. The stigmatization process is more complex than simple labeling, and leads to discrimination and loss of social power. Understanding the mechanisms of stigmatization can determine targets for effective interventions to fight stigma at the individual, institutional and political levels. The roles of patient and family associations, as well as the recovery model for the professionals, are essential. The aim of this article is to review the various aspects of mental disorders' stigma and to examine ways to cope with them.
Digital discrimination: Political bias in Internet service provision across ethnic groups.
Weidmann, Nils B; Benitez-Baleato, Suso; Hunziker, Philipp; Glatz, Eduard; Dimitropoulos, Xenofontas
2016-09-09
The global expansion of the Internet is frequently associated with increased government transparency, political rights, and democracy. However, this assumption depends on marginalized groups getting access in the first place. Here we document a strong and persistent political bias in the allocation of Internet coverage across ethnic groups worldwide. Using estimates of Internet penetration obtained through network measurements, we show that politically excluded groups suffer from significantly lower Internet penetration rates compared with those in power, an effect that cannot be explained by economic or geographic factors. Our findings underline one of the central impediments to "liberation technology," which is that governments still play a key role in the allocation of the Internet and can, intentionally or not, sabotage its liberating effects. Copyright © 2016, American Association for the Advancement of Science.
Pyle, Angela; Hudson, Gavin; Wilson, Ian J; Coxhead, Jonathan; Smertenko, Tania; Herbert, Mary; Santibanez-Koref, Mauro; Chinnery, Patrick F
2015-05-01
Recent reports have questioned the accepted dogma that mammalian mitochondrial DNA (mtDNA) is strictly maternally inherited. In humans, the argument hinges on detecting a signature of inter-molecular recombination in mtDNA sequences sampled at the population level, inferring a paternal source for the mixed haplotypes. However, interpreting these data is fraught with difficulty, and direct experimental evidence is lacking. Using extreme-high depth mtDNA re-sequencing up to ~1.2 million-fold coverage, we find no evidence that paternal mtDNA haplotypes are transmitted to offspring in humans, thus excluding a simple dilution mechanism for uniparental transmission of mtDNA present in all healthy individuals. Our findings indicate that an active mechanism eliminates paternal mtDNA which likely acts at the molecular level.
GLD100 - Lunar topography from LROC WAC stereo
NASA Astrophysics Data System (ADS)
Scholten, F.; Oberst, J.; Robinson, M. S.
2011-10-01
The LROC WAC instrument on the LRO mission provides substantial stereo image data from adjacent orbits. Multiple coverage of the entire surface of the Moon at a mean ground scale of 75 m/pxl has already been achieved within the first two years of the mission. We applied photogrammetric stereo processing methods for the derivation of a 100 m raster DTM (digital terrain model), called GLD100, from several tens of thousands of stereo models. The GLD100 covers the lunar surface between 80° northern and southern latitude. Polar regions are excluded because of poor illumination and stereo conditions. Vertical differences between the GLD100 and altimetry data from the LRO LOLA instrument are small; the mean deviation is typically about 20 m, without systematic lateral or vertical offsets.
Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico
2017-04-01
Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique to assess tissue vascularity, and it can be particularly useful in early detection and grading of arthritis. In a recent study we have shown that a Gamma-variate can accurately quantify synovial perfusion and is flexible enough to describe many heterogeneous patterns. However, in some cases the heterogeneity of the kinetics can be such that even the Gamma model does not properly describe the curve, with a high number of outliers. In this work we apply to CEUS data the single compartment recirculation model (SCR), which explicitly takes into account the trapping of the microbubble contrast agent by adding to the single Gamma-variate model its integral. The SCR model, originally proposed for dynamic-susceptibility magnetic resonance imaging, is solved here at the pixel level within a Bayesian framework using Variational Bayes (VB). We also include the automatic relevance determination (ARD) algorithm to automatically infer the model complexity (SCR vs. Gamma model) from the data. We demonstrate that the inclusion of trapping best describes the CEUS patterns in 50% of the pixels, with the other 50% best fitted by a single Gamma. Such results highlight the necessity of using ARD to automatically exclude the irreversible component where it is not supported by the data. VB with ARD returns precise estimates for the majority of the kinetics (88% of pixels) in a limited computational time (on average, 3.6 min per subject). Moreover, the impact of the additional trapping component has been evaluated for the differentiation of rheumatoid and non-rheumatoid patients by means of a support vector machine classifier with backward feature selection. The results show that the trapping parameter is always present in the selected feature set and improves the classification.
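The SCR idea, a Gamma-variate first pass plus a scaled running integral of that curve for the trapped-microbubble component, can be sketched as below. Parameter values are illustrative assumptions; the study estimates them per pixel with Variational Bayes rather than fixing them:

```python
import numpy as np

def gamma_variate(t, A, t0, alpha, beta):
    """Gamma-variate bolus curve, zero before the arrival time t0."""
    s = np.clip(t - t0, 0.0, None)
    return A * s**alpha * np.exp(-s / beta)

def scr_model(t, A, t0, alpha, beta, k_trap):
    """Single compartment recirculation: the Gamma-variate first pass plus
    k_trap times its running integral (trapped microbubbles accumulate)."""
    g = gamma_variate(t, A, t0, alpha, beta)
    trapped = k_trap * np.cumsum(g) * (t[1] - t[0])  # simple Riemann sum
    return g + trapped

t = np.linspace(0.0, 60.0, 601)  # 60 s acquisition sampled every 0.1 s
curve = scr_model(t, A=1.0, t0=5.0, alpha=2.0, beta=4.0, k_trap=0.05)
```

Unlike the plain Gamma-variate, the SCR curve does not return to baseline: the integral term leaves a late plateau, which is the trapping signature that ARD keeps or drops per pixel.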
Wilhelm, Konrad; Miernik, Arkadiusz; Hein, Simon; Schlager, Daniel; Adams, Fabian; Benndorf, Matthias; Fritz, Benjamin; Langer, Mathias; Hesse, Albrecht; Schoenthaler, Martin; Neubauer, Jakob
2018-06-02
To validate the AutoMated UroLithiasis Evaluation Tool (AMULET) software for kidney stone volumetry and compare its performance to standard clinical practice. Maximum diameter and volume of 96 urinary stones were measured as the reference standard by three independent urologists. The same stones were positioned in an anthropomorphic phantom and CT scans acquired with standard settings. Three independent radiologists blinded to the reference values took manual measurements of the maximum diameter and automatic measurements of maximum diameter and volume. An "expected volume" was calculated from the manual diameter measurements using the formula V = 4/3 πr³. 96 stones were analyzed in the study; we had initially aimed to assess 100, but nine were replaced during data acquisition due to crumbling and 4 had to be excluded because the automated measurement did not work. Mean reference maximum diameter was 13.3 mm (5.2-32.1 mm). Correlation coefficients among all measured outcomes were compared. The correlation of the manual and automatic diameter measurements to the reference was 0.98 and 0.91, respectively (p<0.001). Mean reference volume was 1200 mm³ (10-9000 mm³). The correlation of the "expected volume" and the automatically measured volume to the reference was 0.95 and 0.99, respectively (p<0.001). Patients' kidney stone burden is usually assessed according to maximum diameter. However, as most stones are not spherical, this entails a potential bias. Automated stone volumetry is possible and significantly more accurate than diameter-based volumetric calculations. To avoid bias in clinical trials, size should be measured as volume. However, automated diameter measurements are not as accurate as manual measurements.
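The "expected volume" derivation is a direct application of the sphere formula to the manually measured maximum diameter; a minimal sketch:

```python
import math

def expected_volume(diameter_mm):
    """Sphere-equivalent stone volume from a maximum-diameter measurement,
    V = 4/3 * pi * r^3 with r = d / 2, returned in mm^3."""
    r = diameter_mm / 2.0
    return 4.0 / 3.0 * math.pi * r**3

# The study's mean maximum diameter of 13.3 mm maps to roughly 1.2e3 mm^3,
# close to the mean reference volume of 1200 mm^3.
```

Because most stones are not spherical, this sphere-equivalent value carries exactly the bias that direct automated volumetry avoids.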
Anderson, Tavis K; Laegreid, William W; Cerutti, Francesco; Osorio, Fernando A; Nelson, Eric A; Christopher-Hennings, Jane; Goldberg, Tony L
2012-06-15
The extraordinary genetic and antigenic variability of RNA viruses is arguably the greatest challenge to the development of broadly effective vaccines. No single viral variant can induce sufficiently broad immunity, and incorporating all known naturally circulating variants into one multivalent vaccine is not feasible. Furthermore, no objective strategies currently exist to select actual viral variants that should be included or excluded in polyvalent vaccines. To address this problem, we demonstrate a method based on graph theory that quantifies the relative importance of viral variants. We demonstrate our method through application to the envelope glycoprotein gene of a particularly diverse RNA virus of pigs: porcine reproductive and respiratory syndrome virus (PRRSV). Using distance matrices derived from sequence nucleotide difference, amino acid difference and evolutionary distance, we constructed viral networks and used common network statistics to assign each sequence an objective ranking of relative 'importance'. To validate our approach, we use an independent published algorithm to score our top-ranked wild-type variants for coverage of putative T-cell epitopes across the 9383 sequences in our dataset. Top-ranked viruses achieve significantly higher coverage than low-ranked viruses, and top-ranked viruses achieve nearly equal coverage as a synthetic mosaic protein constructed in silico from the same set of 9383 sequences. Our approach relies on the network structure of PRRSV but applies to any diverse RNA virus because it identifies subsets of viral variants that are most important to overall viral diversity. We suggest that this method, through the objective quantification of variant importance, provides criteria for choosing viral variants for further characterization, diagnostics, surveillance and ultimately polyvalent vaccine development.
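The ranking step can be sketched with one common network statistic, degree centrality, on a toy distance matrix. The cutoff and matrix values are illustrative assumptions; the study evaluates several distance measures and network statistics:

```python
import numpy as np

def rank_variants(dist, cutoff):
    """Rank sequences by degree centrality in a network that links any two
    variants whose pairwise distance is below `cutoff`; variants connected
    to more of the observed diversity rank higher."""
    n = len(dist)
    adj = (dist < cutoff) & ~np.eye(n, dtype=bool)  # no self-loops
    degree = adj.sum(axis=1)
    return np.argsort(degree)[::-1]  # indices, most central first

# Toy 4-variant matrix: variant 0 is within 0.5 of every other variant,
# so it ranks first.
d = np.array([
    [0.0, 0.1, 0.2, 0.2],
    [0.1, 0.0, 0.6, 0.7],
    [0.2, 0.6, 0.0, 0.8],
    [0.2, 0.7, 0.8, 0.0],
])
ranking = rank_variants(d, cutoff=0.5)
```

Top-ranked variants under such a scheme are the candidates the study proposes for diagnostics, surveillance, and polyvalent vaccine design.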
Mars Global Digital Dune Database: MC2-MC29
Hayward, Rosalyn K.; Mullins, Kevin F.; Fenton, L.K.; Hare, T.M.; Titus, T.N.; Bourke, M.C.; Colaprete, Anthony; Christensen, P.R.
2007-01-01
The Mars Global Digital Dune Database presents data and describes the methodology used in creating the database. The database provides a comprehensive and quantitative view of the geographic distribution of moderate- to large-size dune fields from 65° N to 65° S latitude and encompasses ~550 dune fields. The database will be expanded to cover the entire planet in later versions. Although we have attempted to include all dune fields between 65° N and 65° S, some have likely been excluded for two reasons: 1) incomplete THEMIS IR (daytime) coverage may have caused us to exclude some moderate- to large-size dune fields, or 2) the resolution of THEMIS IR coverage (100 m/pixel) certainly caused us to exclude smaller dune fields. The smallest dune fields in the database are ~1 km² in area. While the moderate to large dune fields are likely to constitute the largest compilation of sediment on the planet, smaller stores of dune sediment are likely to be found elsewhere via higher-resolution data. Thus, it should be noted that our database excludes all small dune fields and some moderate to large dune fields as well. Therefore the absence of mapped dune fields does not mean that such dune fields do not exist, and is not intended to imply a lack of saltating sand in other areas. Where the availability and quality of THEMIS visible (VIS) or Mars Orbiter Camera narrow angle (MOC NA) images allowed, we classified dunes and included dune slipface measurements, which were derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. For dunes located within craters, the azimuth from crater centroid to dune field centroid was calculated. Output from a general circulation model (GCM) is also included.
In addition to polygons locating dune fields, the database includes over 1800 selected Thermal Emission Imaging System (THEMIS) infrared (IR), THEMIS visible (VIS) and Mars Orbiter Camera Narrow Angle (MOC NA) images that were used to build the database. The database is presented in a variety of formats. It is presented as a series of ArcReader projects which can be opened using the free ArcReader software. The latest version of ArcReader can be downloaded at http://www.esri.com/software/arcgis/arcreader/download.html. The database is also presented in ArcMap projects. The ArcMap projects allow fuller use of the data, but require ESRI ArcMap software. Multiple projects were required to accommodate the large number of images needed. A fuller description of the projects can be found in the Dunes_ReadMe file and the ReadMe_GIS file in the Documentation folder. For users who prefer to create their own projects, the data is available in ESRI shapefile and geodatabase formats, as well as the open Geographic Markup Language (GML) format. A printable map of the dunes and craters in the database is available as a Portable Document Format (PDF) document. The map is also included as a JPEG file. ReadMe files are available in PDF and ASCII (.txt) files. Tables are available in both Excel (.xls) and ASCII formats.
Automatic panoramic thermal integrated sensor
NASA Astrophysics Data System (ADS)
Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.
2005-05-01
Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperView™ high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defensive and offensive operations, as well as to serve as a sensor node in tactical Intelligence, Surveillance, and Reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640×480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
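GID3* and O-BTree themselves are not described in the abstract, but the greedy criterion that ID3-family decision-tree inducers maximise when choosing a test, information gain, is standard. A minimal stdlib sketch of that criterion (the "star"/"galaxy" labels and the boolean split are invented for illustration):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def info_gain(labels, split):
    """Information gain of a boolean split: the quantity an ID3-style
    decision-tree inducer maximises when selecting the next test node."""
    left = [lab for lab, s in zip(labels, split) if s]
    right = [lab for lab, s in zip(labels, split) if not s]
    n = len(labels)
    gain = entropy(labels)
    if left:
        gain -= (len(left) / n) * entropy(left)
    if right:
        gain -= (len(right) / n) * entropy(right)
    return gain
```

A perfectly separating split on a balanced two-class sample yields the maximum gain of 1 bit.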
Shared control on lunar spacecraft teleoperation rendezvous operations with large time delay
NASA Astrophysics Data System (ADS)
Ya-kun, Zhang; Hai-yang, Li; Rui-xue, Huang; Jiang-hui, Liu
2017-08-01
Teleoperation could be used in space on-orbit servicing missions, such as object deorbiting, spacecraft approaches, and automatic rendezvous and docking back-up systems. Teleoperation rendezvous and docking in lunar orbit may encounter bottlenecks due to the inherent time delay in the communication link and the limited measurement accuracy of sensors. Moreover, human intervention is unsuitable in view of the partial communication coverage problem. To solve these problems, a shared control strategy for teleoperation rendezvous and docking is detailed. This paper discusses control authority in lunar orbital maneuvers involving two spacecraft during the final rendezvous and docking phase. A predictive display model based on the relative dynamic equations is established to overcome the influence of the large time delay in the communication link. We discuss, and attempt to prove via consistent ground-based simulations, the relative merits of a fully autonomous control mode (i.e., onboard computer-based), a fully manual control mode (i.e., human-driven at the ground station), and a shared control mode. The simulation experiments were conducted on a nine-degrees-of-freedom teleoperation rendezvous and docking simulation platform. Simulation results indicated that the shared control method can overcome the influence of time delay effects. In addition, the docking success probability of the shared control method was enhanced compared with the automatic and manual modes.
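The abstract does not specify which relative dynamic equations drive the predictive display; for close-proximity operations about a target in a near-circular orbit, the usual choice is the Clohessy–Wiltshire (Hill) equations, written here in the target's local frame (x radial, y along-track, z cross-track) with n the target's orbital mean motion:

```latex
\ddot{x} - 2n\dot{y} - 3n^{2}x = 0, \qquad
\ddot{y} + 2n\dot{x} = 0, \qquad
\ddot{z} + n^{2}z = 0
```

Because these linear equations admit a closed-form solution, a ground station can propagate the chaser's relative state forward by the round-trip delay and display the predicted, rather than the stale, configuration.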
Urban Density Indices Using Mean Shift-Based Upsampled Elevation Data
NASA Astrophysics Data System (ADS)
Charou, E.; Gyftakis, S.; Bratsolis, E.; Tsenoglou, T.; Papadopoulou, Th. D.; Vassilas, N.
2015-04-01
Urban density is an important factor in several fields, e.g., urban design, planning, and land management. Modern remote sensors deliver ample information for the estimation of specific urban land classification classes (2D indicators) and the height of urban land classification objects (3D indicators) within an Area of Interest (AOI). In this research, two of these indicators, the Building Coverage Ratio (BCR) and the Floor Area Ratio (FAR), are numerically and automatically derived from high-resolution airborne RGB orthophotos and LiDAR data. In the pre-processing step, the low-resolution elevation data are fused with the high-resolution optical data through a mean-shift-based discontinuity-preserving smoothing algorithm. The outcome is an improved normalized digital surface model (nDSM): upsampled elevation data with considerable improvement regarding region filling and the "straightness" of elevation discontinuities. In a following step, a Multilayer Feedforward Neural Network (MFNN) is used to classify all pixels of the AOI into building or non-building categories. For the total surface of the block and the buildings, we consider the number of their pixels and the surface of the unit pixel. Comparisons of the automatically derived BCR and FAR indicators with manually derived ones show the applicability and effectiveness of the proposed methodology.
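Concretely, BCR is built-up footprint area over block area, and FAR additionally weights the footprint by the number of floors, which can be estimated from the nDSM height. A minimal sketch of that arithmetic from pixel counts; the 3 m storey height and all input values are illustrative assumptions, not figures from the paper:

```python
def bcr_far(building_px, block_px, px_area, mean_height, floor_height=3.0):
    """Building Coverage Ratio and Floor Area Ratio from classified pixels.

    building_px  -- pixels labeled 'building' inside the block
    block_px     -- total pixels in the block
    px_area      -- ground area of one pixel (m^2)
    mean_height  -- mean nDSM height over building pixels (m)
    floor_height -- assumed storey height (m); an assumption for illustration
    """
    building_area = building_px * px_area
    block_area = block_px * px_area
    bcr = building_area / block_area            # footprint fraction of the block
    floors = max(1, round(mean_height / floor_height))
    far = bcr * floors                          # floor area per unit block area
    return bcr, far
```

For example, 250 building pixels out of 1000 at 1 m² per pixel with a 9 m mean height give BCR 0.25 and FAR 0.75.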
A 3000 TNOs Survey Project at ESO La Silla
NASA Astrophysics Data System (ADS)
Boehnhardt, H.; Hainaut, O.
We propose a wide-shallow TNO search to be done with the Wide Field Imager (WFI) instrument at the 2.2 m MPG/ESO telescope in La Silla, Chile. The WFI is a half-degree camera equipped with an 8k×8k CCD (0.24 arcsec/pixel). The telescope can support excellent seeing quality, down to 0.5 arcsec FWHM. A TNO search pilot project was run with the 2.2 m + WFI in 1999: images with just 1.6 square degrees of sky coverage and a typical limiting brightness of 24 mag revealed 6 new TNOs when processed with our new automatic detection program MOVIE. The project is now continued on a somewhat larger scale in order to find more TNOs and to fine-tune the operational environment for fully automatic on-line detection, astrometry, and photometry of the objects at the telescope. The future goal is to perform, with the 2.2 m + WFI and in an international collaboration, an even larger TNO survey over a major part of the sky (typically 2000 square degrees in and out of the Ecliptic) down to 24 mag. Follow-up astrometry and photometry of the expected more than 3000 discovered objects will secure their orbital and physical characterisation for synoptic dynamical and taxonomic studies of the Transneptunian population.
Motion analysis for duplicate frame removal in wireless capsule endoscope
NASA Astrophysics Data System (ADS)
Lee, Hyun-Gyu; Choi, Min-Kook; Lee, Sang-Chul
2011-03-01
Wireless capsule endoscopy (WCE) has been intensively researched recently due to its convenience for diagnosis and its extended detection coverage of some diseases. Typically, a full recording covering the entire human digestive system requires about 8 to 12 hours for a patient carrying a capsule endoscope and a portable image receiver/recorder unit, producing 120,000 image frames on average. In spite of the benefits of close examination, WCE-based tests have a barrier to quick diagnosis: a trained diagnostician must examine a huge number of images for close investigation, normally for over 2 hours. The main purpose of our work is to present a novel machine vision approach that reduces diagnosis time by automatically detecting duplicated recordings in the small intestine, which are caused by backward camera movement and typically contain redundant information. The developed technique could be integrated with a visualization tool which supports intelligent inspection methods such as automatic play-speed control. Our experimental results show the high accuracy of the technique, which detected 989 duplicate image frames out of 10,000, equivalent to a 9.9% data reduction, in a WCE video from a real human subject. With some selected parameters, we achieved a correct-detection ratio of 92.85% and a false-detection ratio of 13.57%.
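The two reported figures can be read as recall- and precision-style ratios over the set of truly duplicated frames versus the set of frames the detector flags. The abstract does not give the exact formulas, so the definitions below are a plausible reading, not the paper's:

```python
def detection_ratios(true_dups, detected):
    """Correct- and false-detection ratios for duplicate-frame detection.

    true_dups -- set of frame indices that really are duplicates (ground truth)
    detected  -- set of frame indices the detector flagged as duplicates

    Assumed definitions (not stated in the abstract):
      correct ratio = true positives / all true duplicates   (recall-style)
      false ratio   = false positives / all detections       (1 - precision)
    """
    tp = len(true_dups & detected)
    correct = tp / len(true_dups)
    false = len(detected - true_dups) / len(detected)
    return correct, false
```

With ground truth {1, 2, 3, 4} and detections {1, 2, 3, 5}, this yields a 0.75 correct-detection ratio and a 0.25 false-detection ratio.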
Hervás, Marcos; Alsina-Pagès, Rosa Ma; Alías, Francesc; Salvador, Martí
2017-01-01
Fast environmental variations due to climate change can cause mass decline or even extinction of species, having a dramatic impact on the future of biodiversity. During the last decade, different approaches have been proposed to track and monitor endangered species, generally based on costly semi-automatic systems that require human supervision, which adds limitations in coverage and time. However, the recent emergence of Wireless Acoustic Sensor Networks (WASN) has allowed non-intrusive remote monitoring of endangered species in real time through the automatic identification of the sounds they emit. In this work, an FPGA-based WASN centralized architecture is proposed and validated in a simulated operation environment. The feasibility of the architecture is evaluated in a case study designed to detect the threatened Botaurus stellaris among 19 other cohabiting bird species in the Parc Natural dels Aiguamolls de l’Empordà, showing an averaged recognition accuracy of 91% over 2 h 55 min of representative data. The FPGA-based feature extraction implementation allows the system to process data from 30 acoustic sensors in real time at an affordable cost. Finally, several open questions derived from this research are discussed for consideration in future works. PMID:28594373
Tool Support for Parametric Analysis of Large Software Simulation Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony
2008-01-01
The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
On-the-fly data assessment for high-throughput x-ray diffraction measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Fang; Pandolfi, Ronald; Van Campen, Douglas
Investment in brighter sources and larger and faster detectors has accelerated the speed of data acquisition at national user facilities. The accelerated data acquisition offers many opportunities for the discovery of new materials, but it also presents a daunting challenge. The rate of data acquisition far exceeds the current speed of data quality assessment, resulting in less than optimal data and data coverage, which in extreme cases forces recollection of data. Herein, we show how this challenge can be addressed through the development of an approach that makes routine data assessment automatic and instantaneous. By extracting and visualizing customized attributes in real time, data quality and coverage, as well as other scientifically relevant information contained in large data sets, are highlighted. Deployment of such an approach not only improves the quality of data but also helps optimize the usage of expensive characterization resources by prioritizing measurements of the highest scientific impact. We anticipate our approach will become a starting point for a sophisticated decision-tree that optimizes data quality and maximizes scientific content in real time through automation. Finally, with these efforts to integrate more automation in data collection and analysis, we can truly take advantage of the accelerating speed of data acquisition.
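The abstract does not name the extracted attributes. As one plausible illustration of the idea, cheap per-pattern statistics, here integrated intensity and a naive peak count, can be computed in a single pass over each 1-D diffraction trace as it arrives; both the attribute choice and the prominence threshold are assumptions, not those of the paper:

```python
def quick_attributes(intensity, prominence=5.0):
    """Cheap per-pattern attributes for on-the-fly screening of a 1-D
    diffraction trace: total integrated intensity plus a naive count of
    local maxima that rise at least `prominence` above a neighbor.
    Illustrative only; real pipelines use more robust peak finding."""
    total = sum(intensity)
    peaks = 0
    for i in range(1, len(intensity) - 1):
        left, right = intensity[i - 1], intensity[i + 1]
        if (intensity[i] > left and intensity[i] > right
                and intensity[i] - min(left, right) > prominence):
            peaks += 1
    return total, peaks
```

Plotting such attributes live against scan position gives an immediate view of data quality and coverage without waiting for full reduction.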
Multi-UAV Routing for Area Coverage and Remote Sensing with Minimum Time
Avellar, Gustavo S. C.; Pereira, Guilherme A. S.; Pimenta, Luciano C. A.; Iscold, Paulo
2015-01-01
This paper presents a solution for the problem of minimum time coverage of ground areas using a group of unmanned air vehicles (UAVs) equipped with image sensors. The solution is divided into two parts: (i) the task modeling as a graph whose vertices are geographic coordinates determined in such a way that a single UAV would cover the area in minimum time; and (ii) the solution of a mixed integer linear programming problem, formulated according to the graph variables defined in the first part, to route the team of UAVs over the area. The main contribution of the proposed methodology, when compared with the traditional vehicle routing problem’s (VRP) solutions, is the fact that our method solves some practical problems only encountered during the execution of the task with actual UAVs. In this line, one of the main contributions of the paper is that the number of UAVs used to cover the area is automatically selected by solving the optimization problem. The number of UAVs is influenced by the vehicles’ maximum flight time and by the setup time, which is the time needed to prepare and launch a UAV. To illustrate the methodology, the paper presents experimental results obtained with two hand-launched, fixed-wing UAVs. PMID:26540055
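The trade-off that drives the automatic fleet sizing, longer per-UAV routes versus extra sequential launch (setup) delays, can be sketched with a back-of-envelope model. This is not the paper's MILP; the uniform split of flight time across vehicles and sequential launches are our simplifying assumptions:

```python
def best_fleet_size(total_time, endurance, setup, k_max=10):
    """Pick the fleet size k minimizing mission makespan when the total
    coverage flight time is split evenly across k UAVs that are launched
    one after another, each launch costing `setup` time.  Routes longer
    than the vehicle endurance are infeasible.  A heuristic sketch only."""
    best_k, best_makespan = None, float("inf")
    for k in range(1, k_max + 1):
        per_uav = total_time / k
        if per_uav > endurance:        # route exceeds flight endurance
            continue
        makespan = (k - 1) * setup + per_uav   # last launch offset + its flight
        if makespan < best_makespan:
            best_k, best_makespan = k, makespan
    return best_k, best_makespan
```

For a 60-minute coverage task, 25-minute endurance, and 5-minute setup, a single UAV is infeasible and three UAVs minimize the makespan at 30 minutes, mirroring how endurance and setup time jointly determine fleet size in the paper's formulation.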
Acoustic Sensor Planning for Gunshot Location in National Parks: A Pareto Front Approach
González-Castaño, Francisco Javier; Alonso, Javier Vales; Costa-Montenegro, Enrique; López-Matencio, Pablo; Vicente-Carrasco, Francisco; Parrado-García, Francisco J.; Gil-Castiñeira, Felipe; Costas-Rodríguez, Sergio
2009-01-01
In this paper, we propose a solution for gunshot location in national parks. In Spain there are agencies such as SEPRONA that fight against poaching with considerable success. The DiANa project, which is endorsed by Cabaneros National Park and the SEPRONA service, proposes a system to automatically detect and locate gunshots. This work presents its technical aspects related to network design and planning. The system consists of a network of acoustic sensors that locate gunshots by hyperbolic multi-lateration estimation. The differences in sound arrival times allow the computation of a low-error estimate of the gunshot location. The accuracy of this method depends on tight sensor clock synchronization, which an ad-hoc time synchronization protocol provides. On the other hand, since the areas under surveillance are wide, and electric power is scarce, it is necessary to maximize detection coverage and minimize system cost at the same time. Therefore, sensor network planning has two targets, i.e., coverage and cost. We model planning as an unconstrained problem with two objective functions. We determine a set of candidate solutions of interest by combining a derivative-free descent method we have recently proposed with a Pareto front approach. The results are clearly superior to random seeding in a realistic simulation scenario. PMID:22303135
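The hyperbolic multi-lateration idea can be illustrated with a deliberately simple estimator: choose the point whose predicted time differences of arrival (TDOA), relative to a reference sensor, best match the measured ones. An operational system would use a closed-form or least-squares solver; the brute-force grid search, the square sensor layout, and the constant 343 m/s sound speed below are all illustrative assumptions:

```python
import itertools
import math

C_SOUND = 343.0  # speed of sound in m/s; assumed constant

def locate_gunshot(sensors, toas, lo=0.0, hi=100.0, step=5.0):
    """TDOA grid-search estimator: return the grid point whose predicted
    time differences of arrival (vs. sensor 0) best match the measured
    ones.  Each TDOA measurement constrains the source to a hyperbola;
    minimizing the summed squared mismatch intersects them."""
    xs = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    best, best_err = None, float("inf")
    for x, y in itertools.product(xs, xs):
        d0 = math.hypot(x - sensors[0][0], y - sensors[0][1])
        err = 0.0
        for (sx, sy), t in zip(sensors[1:], toas[1:]):
            predicted = (math.hypot(x - sx, y - sy) - d0) / C_SOUND
            err += (predicted - (t - toas[0])) ** 2
        if err < best_err:
            best, best_err = (x, y), err
    return best
```

With four corner sensors and noiseless arrival times, the search recovers the true source position; with noisy times, the residual surface is exactly what clock-synchronization accuracy degrades.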
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2006-01-01
A computer program automatically builds large, full-resolution mosaics of multispectral images of Earth landmasses from images acquired by Landsat 7, complete with matching of colors and blending between adjacent scenes. While the code has been used extensively for Landsat, it could also be used for other data sources. A single mosaic of as many as 8,000 scenes, represented by more than 5 terabytes of data and the largest set produced in this work, demonstrated what the code could do to provide global coverage. The program first statistically analyzes input images to determine areas of coverage and data-value distributions. It then transforms the input images from their original universal transverse Mercator coordinates to other geographical coordinates, with scaling. It applies a first-order polynomial brightness correction to each band in each scene. It uses a data-mask image for selecting data and blending of input scenes. Under control by a user, the program can be made to operate on small parts of the output image space, with check-point and restart capabilities. The program runs on SGI IRIX computers. It is capable of parallel processing using shared-memory code, large memories, and tens of central processing units. It can retrieve input data and store output data at locations remote from the processors on which it is executed.
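A first-order polynomial brightness correction amounts to a per-band gain and offset. One common way to choose them is to match each scene's mean and standard deviation to a mosaic-wide reference; this is a sketch of that idea under our own assumptions, not the NASA program's actual fitting code:

```python
def linear_match(values, target_mean, target_std):
    """First-order (gain/offset) brightness correction for one band of one
    scene: scale and shift the pixel values so their mean and standard
    deviation match the mosaic-wide reference statistics."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    if std == 0.0:
        std = 1.0                      # flat scene: apply offset only
    gain = target_std / std
    offset = target_mean - gain * mean
    return [gain * v + offset for v in values]
```

Applying the same correction per band before blending keeps adjacent scenes radiometrically consistent across seams.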
Observatorio Astrofísico de Javalambre: observation scheduler and sequencer
NASA Astrophysics Data System (ADS)
Ederoclite, A.; Cristóbal-Hornillos, D.; Moles, M.; Cenarro, A. J.; Marín-Franch, A.; Yanes Díaz, A.; Gruel, N.; Varela, J.; Chueca, S.; Rueda-Teruel, F.; Rueda-Teruel, S.; Luis-Simoes, R.; Hernández-Fuertes, J.; López-Sainz, A.; Chioare Díaz-Martín, M.
2013-05-01
Observational strategy is a critical path in any large survey. The planning of a night requires knowledge of the fields already observed, the quality of the data already secured, and the fields still to be observed, in order to optimize scientific returns. In addition, each field's maximum altitude, its sky distance/brightness during the night, and meteorological data (cloud coverage and seeing) have to be taken into account to increase the chance of a successful observation. To support the execution of the J-PAS project at the Javalambre Astrophysical Observatory, we have prepared a scheduler and a sequencer (SCH/SQ) which take into account all the relevant parameters mentioned. The scheduler first selects the fields which can be observed during the night and orders them on the basis of their figure of merit. It takes into account the quality and spectral coverage of the existing observations, as well as the possibility of getting a good observation during the night. The sequencer takes into account the meteorological variables in order to prepare the observation queue for the night. During the commissioning of the telescopes at the OAJ, we expect to improve our figures of merit and eventually arrive at a system which can function semi-automatically. This poster describes the design of this software.
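The actual J-PAS figure of merit is not given in the abstract; as a purely illustrative sketch of the scheduler's ranking step, a score can combine the factors the abstract lists, altitude, sky conditions, and missing spectral coverage, with invented weights. Every field name, term, and weight here is an assumption:

```python
def figure_of_merit(field, w_alt=0.4, w_moon=0.2, w_need=0.4):
    """Toy figure of merit for ranking survey fields.  All terms and
    weights are illustrative assumptions, not the J-PAS merit function:
      altitude_frac  -- how close the field is to its maximum altitude (0..1)
      moon_dist_frac -- normalized angular distance from the Moon (0..1)
      done_frac      -- fraction of spectral coverage already secured (0..1)
    """
    return (w_alt * field["altitude_frac"]
            + w_moon * field["moon_dist_frac"]
            + w_need * (1.0 - field["done_frac"]))

def plan_night(fields):
    """Scheduler step: order observable fields by descending merit."""
    return sorted(fields, key=figure_of_merit, reverse=True)
```

The sequencer would then consume this ordered queue, dropping or postponing entries as clouds and seeing evolve during the night.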
Automated recognition and tracking of aerosol threat plumes with an IR camera pod
NASA Astrophysics Data System (ADS)
Fauth, Ryan; Powell, Christopher; Gruber, Thomas; Clapp, Dan
2012-06-01
Protection of fixed sites from chemical, biological, or radiological aerosol plume attacks depends on early warning so that there is time to take mitigating actions. Early warning requires continuous, autonomous, and rapid coverage of large surrounding areas; however, this must be done at an affordable cost. Once a potential threat plume is detected, a different type of sensor (e.g., a more expensive, slower sensor) may be cued for identification purposes, but the problem is to quickly identify all of the potential threats around the fixed site of interest. To address this problem of low-cost, persistent, wide-area surveillance, an IR camera pod and multi-image stitching and processing algorithms have been developed for automatic recognition and tracking of aerosol plumes. A rugged, modular, static pod design, which accommodates as many as four micro-bolometer IR cameras for 45° to 180° of azimuth coverage, is presented. Various OpenCV-based image-processing algorithms, including stitching of multiple adjacent FOVs, recognition of aerosol plume objects, and tracking of aerosol plumes, are presented using process block diagrams and sample field test results, including chemical and biological simulant plumes. Methods for dealing with background removal, brightness equalization between images, and focus quality for optimal plume tracking are also discussed.
NASA Astrophysics Data System (ADS)
Blaser, S.; Nebiker, S.; Cavegn, S.
2017-05-01
Image-based mobile mapping systems enable the efficient acquisition of georeferenced image sequences, which can later be exploited in cloud-based 3D geoinformation services. In order to provide 360° coverage with accurate 3D measuring capabilities, we present a novel 360° stereo panoramic camera configuration. By using two 360° panorama cameras tilted forward and backward in combination with conventional forward- and backward-looking stereo camera systems, we achieve full 360° multi-stereo coverage. We furthermore developed a fully operational new mobile mapping system based on our proposed approach, which fulfils our high accuracy requirements. We successfully implemented a rigorous sensor and system calibration procedure, which allows calibrating all stereo systems with a superior accuracy compared to that of previous work. Our study delivered absolute 3D point accuracies in the range of 4 to 6 cm and relative accuracies of 3D distances in the range of 1 to 3 cm. These results were achieved in a challenging urban area. Furthermore, we automatically reconstructed a 3D city model of our study area by employing all captured and georeferenced mobile mapping imagery. The result is a highly detailed and almost complete 3D city model of the street environment.
Aircraft monitoring by the fusion of satellite and ground ADS-B data
NASA Astrophysics Data System (ADS)
Zhang, Xuan; Zhang, Jingjing; Wu, Shufan; Cheng, Qian; Zhu, Rui
2018-02-01
The Automatic Dependent Surveillance-Broadcast (ADS-B) system is today standard equipment on civil aircraft, periodically transmitting data packets containing key information such as aircraft ID, position, altitude, and intention. It is designed for terrestrial ground stations to monitor air traffic flow in certain regions. Space-based ADS-B is the idea of placing sensitive receivers on board satellites in orbit, which can receive ADS-B packets and relay them to the relevant ground stations. The terrestrial ADS-B receiver has been widely applied in airport information systems, helping to monitor and control traffic flow; however, its coverage is strongly limited by sea or mountain conditions. This paper first introduces the CubeSat mission, then discusses the integrated application of ADS-B data received from ground stations and from satellites, analyzes their characteristics with statistical comparisons, and explores the technologies for fusing these two different data resources in an integrated application. The satellite data come from a Chinese CubeSat, STU-2C, launched into space on 25 September 2015. The ADS-B data received from the two different resources have shown good complementarity, such as increasing the spatial coverage of air traffic monitoring and observing the whole airspace in a more complete way.
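One simple way to realize the complementarity the abstract describes is gap filling: keep the (lower-latency) terrestrial reports and add satellite reports only where no ground report for the same aircraft exists nearby in time. This merge strategy is our assumption for illustration, not the paper's fusion algorithm:

```python
def fuse_reports(ground, satellite, window=5.0):
    """Merge ground and space-based ADS-B position reports per aircraft.
    Each report is a tuple (icao_address, time_s, position).  Assumed
    strategy: keep all ground reports; add a satellite report only when no
    ground report for the same ICAO address lies within `window` seconds,
    i.e. use the satellite link to fill terrestrial coverage gaps over
    sea and mountains."""
    fused = list(ground)
    for icao, t, pos in satellite:
        covered = any(g_icao == icao and abs(g_t - t) <= window
                      for g_icao, g_t, _ in ground)
        if not covered:
            fused.append((icao, t, pos))
    return sorted(fused, key=lambda r: (r[0], r[1]))
```

A real system would additionally cross-validate positions between the two feeds and weight them by reception quality before merging tracks.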
Cognition and Take-up of Subsidized Drug Benefits by Medicare Beneficiaries
Kuye, Ifedayo O.; Frank, Richard G.; McWilliams, J. Michael
2013-01-01
Importance Take-up of the Medicare Part D low-income subsidy (LIS) by eligible beneficiaries has been low despite the attractive drug coverage it offers at no cost to beneficiaries and outreach efforts by the Social Security Administration. Objective To examine the role of beneficiaries’ cognitive abilities in explaining this puzzle. Design and Setting Analysis of survey data from the nationally representative Health and Retirement Study. Participants Elderly Medicare beneficiaries who were likely eligible for the LIS, excluding Medicaid and Supplemental Security Income recipients, who automatically receive the subsidy without applying. Main Outcomes and Measures Using survey assessments of overall cognition and numeracy from 2006–2010, we examined how cognitive abilities were associated with self-reported Part D enrollment, awareness of the LIS, and application for the LIS. We also compared out-of-pocket drug spending and premium costs between LIS-eligible beneficiaries who did and did not report receipt of the LIS. Analyses were adjusted for sociodemographic characteristics, household income and assets, health status, and presence of chronic conditions. Results Compared with LIS-eligible beneficiaries in the top quartile of overall cognition, those in the bottom quartile were significantly less likely to report Part D enrollment (adjusted rate, 63.5% vs. 52.0%; P=0.002), LIS awareness (58.3% vs. 33.3%; P=0.001), and LIS application (25.5% vs. 12.7%; P<0.001). Lower numeracy was also associated with lower rates of Part D enrollment (P=0.03) and LIS application (P=0.002). Reported receipt of the LIS was associated with significantly lower annual out-of-pocket drug spending (adjusted mean difference, −$256; P=0.02) and premium costs (−$273; P=0.02). Conclusions and Relevance Among Medicare beneficiaries likely eligible for the Part D LIS, poorer cognition and numeracy were associated with lower reported take-up. 
Current educational and outreach efforts encouraging LIS applications may not be sufficient for beneficiaries with limited abilities to process and respond to information. Additional policies may be needed to extend the financial protection conferred by the LIS to all eligible seniors. PMID:23649604
A consistent and uniform research earthquake catalog for the AlpArray region: preliminary results.
NASA Astrophysics Data System (ADS)
Molinari, I.; Bagagli, M.; Kissling, E. H.; Diehl, T.; Clinton, J. F.; Giardini, D.; Wiemer, S.
2017-12-01
The AlpArray initiative (www.alparray.ethz.ch) is a large-scale European collaboration (about 50 institutes involved) to study the entire Alpine orogen at high resolution with a variety of geoscientific methods. AlpArray provides unprecedentedly uniform station coverage for the region, with more than 650 broadband seismic stations, 300 of which are temporary. The AlpArray Seismic Network (AASN) is a joint effort of 25 institutes from 10 nations; it has operated since January 2016 and is expected to continue until the end of 2018. In this study, we establish a uniform earthquake catalog for the Greater Alpine region during the operation period of the AASN, with a target completeness of M2.5. The catalog has two main goals: 1) to calculate consistent and precise hypocenter locations, and 2) to provide preliminary but uniform magnitude calculations across the region. The procedure is based on automatic high-quality P- and S-wave pickers, providing consistent phase arrival times in combination with a picking quality assessment. First, we detect all events in the region in 2016/2017 using an STA/LTA-based detector. Among the detected events, we select 50 geographically homogeneously distributed events with magnitudes ≥2.5, representative of the entire catalog. We manually pick the selected events to establish a consistent P- and S-phase reference data set, including arrival-time uncertainties. The reference data are used to adjust the automatic pickers and to assess their performance. In a first iteration, a simple P-picker algorithm is applied to the entire dataset, providing initial picks for the advanced MannekenPix (MPX) algorithm. In a second iteration, the MPX picker provides consistent and reliable automatic first-arrival P picks together with a pick-quality estimate. The derived automatic P picks are then used as initial values for a multi-component S-phase picking algorithm.
Subsequently, automatic picks of all well-locatable earthquakes will be used to compute final minimum 1D P- and S-wave velocity models for the region with appropriate station corrections. Finally, all events are relocated with the NonLinLoc algorithm in combination with the updated 1D models. The proposed procedure represents the first step towards a uniform earthquake catalogue for the entire Greater Alpine region using the AASN.
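The STA/LTA detection step described above can be sketched as follows; the window lengths, threshold, and function names are illustrative choices, not the catalogue's actual configuration:

```python
import numpy as np

def sta_lta_trigger(trace, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Classic STA/LTA event detector: declare an onset where the ratio of a
    short-term average to a long-term average of signal energy crosses a
    threshold. Returns onset sample indices and the ratio trace."""
    nsta = int(sta_win * fs)
    nlta = int(lta_win * fs)
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.cumsum(energy)
    # running means via cumulative sums; window ends at the current sample
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = len(lta)
    ratio = sta[nlta - nsta:nlta - nsta + n] / np.maximum(lta[:n], 1e-12)
    # upward threshold crossings, mapped back to absolute sample indices
    onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold)) + nlta + 1
    return onsets, ratio
```

Real seismic pipelines (e.g., the recursive STA/LTA variants in common toolkits) add detrending, filtering, and coincidence logic across stations on top of this core idea.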
Yushkevich, Paul A.; Pluta, John B.; Wang, Hongzhi; Xie, Long; Ding, Song-Lin; Gertje, E. C.; Mancuso, Lauren; Kliot, Daria; Das, Sandhitsu R.; Wolk, David A.
2014-01-01
We evaluate a fully automatic technique for labeling hippocampal subfields and cortical subregions in the medial temporal lobe (MTL) in in vivo 3 Tesla MRI. The method performs segmentation on a T2-weighted MRI scan with 0.4 × 0.4 × 2.0 mm³ resolution, partial brain coverage, and oblique orientation. Hippocampal subfields, entorhinal cortex, and perirhinal cortex are labeled using a pipeline that combines multi-atlas label fusion and learning-based error correction. In contrast to earlier work on automatic subfield segmentation in T2-weighted MRI (Yushkevich et al., 2010), our approach requires no manual initialization, labels hippocampal subfields over a greater anterior-posterior extent, and labels the perirhinal cortex, which is further subdivided into Brodmann areas 35 and 36. The accuracy of the automatic segmentation relative to manual segmentation is measured using cross-validation in 29 subjects from a study of amnestic Mild Cognitive Impairment (aMCI), and is highest for the dentate gyrus (Dice coefficient 0.823), CA1 (0.803), perirhinal cortex (0.797) and entorhinal cortex (0.786) labels. A larger cohort of 83 subjects is used to examine the effects of aMCI in the hippocampal region using both subfield volume and regional subfield thickness maps. Most significant differences between aMCI and healthy aging are observed bilaterally in the CA1 subfield and in the left Brodmann area 35. Thickness analysis results are consistent with volumetry, but provide additional regional specificity and suggest non-uniformity in the effects of aMCI on hippocampal subfields and MTL cortical subregions. PMID:25181316
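The Dice coefficient used above to report overlap accuracy has a compact definition; this sketch (names illustrative) computes it for one label between an automatic and a manual segmentation:

```python
import numpy as np

def dice_coefficient(auto_seg, manual_seg, label):
    """Dice overlap for one label: 2|A ∩ B| / (|A| + |B|),
    where A and B are the voxel sets assigned that label
    in the automatic and manual segmentations."""
    a = np.asarray(auto_seg) == label
    b = np.asarray(manual_seg) == label
    denom = int(a.sum() + b.sum())
    return 2.0 * int(np.logical_and(a, b).sum()) / denom if denom else 1.0
```

A value of 1.0 means perfect overlap; the reported 0.78-0.82 range is typical for small MTL structures.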
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time-domain surveys pushing to fainter limiting magnitudes, the demand for follow-up of transient alerts far exceeds the available follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project that has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view than surveys, so the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select the events with the highest planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time-domain surveys.
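A minimal illustration of magnification-based prioritization: the standard point-source point-lens magnification is A(u) = (u² + 2) / (u √(u² + 4)), and ranking events by their current magnification is a rough proxy for detection-zone size. The actual RoboTAP metric is more elaborate; the event fields and ranking rule here are assumptions for the sketch.

```python
import math

def pspl_magnification(u):
    """Point-source point-lens magnification at impact parameter u (Einstein radii)."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def separation(t, t0, tE, u0):
    """Lens-source separation for a standard Paczynski light curve:
    u(t) = sqrt(u0^2 + ((t - t0)/tE)^2)."""
    return math.hypot(u0, (t - t0) / tE)

def rank_events(events, now):
    """Toy priority: sort events by present magnification, highest first."""
    return sorted(
        events,
        key=lambda e: pspl_magnification(separation(now, e["t0"], e["tE"], e["u0"])),
        reverse=True)
```

In practice a broker also weights anomaly alerts, survey cadence over the field, and telescope overheads, as the abstract notes.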
Peña, José Manuel; Torres-Sánchez, Jorge; de Castro, Ana Isabel; Kelly, Maggi; López-Granados, Francisca
2013-01-01
The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which has not been possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images, using a six-band multispectral camera (visible and near-infrared range), with the ultimate objective of generating a weed map of an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship of estimated versus observed weed densities had a coefficient of determination of r² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with 86% overall accuracy. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicated a high potential for reducing herbicide application or other weed operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance. PMID:24146963
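The accuracy figures above (r² = 0.89, RMSE = 0.02) are standard regression diagnostics; a minimal sketch of their computation (the paper's exact fitting procedure may differ):

```python
import numpy as np

def r2_rmse(observed, estimated):
    """Coefficient of determination (1 - SS_res/SS_tot) and root mean
    square error between observed and estimated values."""
    obs = np.asarray(observed, float)
    est = np.asarray(estimated, float)
    resid = obs - est
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((obs - obs.mean()) ** 2))
    return 1.0 - ss_res / ss_tot, float(np.sqrt(np.mean(resid ** 2)))
```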
NASA Astrophysics Data System (ADS)
Ogawa, Kenta; Konno, Yukiko; Yamamoto, Satoru; Matsunaga, Tsuneo; Tachikawa, Tetsushi; Komoda, Mako
2017-09-01
Hyperspectral Imager Suite (HISUI) is a future Japanese space-borne hyperspectral instrument being developed by the Ministry of Economy, Trade, and Industry (METI). HISUI will be launched in 2019 or later with the International Space Station (ISS) as its platform. HISUI has 185 spectral bands from 0.4 to 2.5 μm, a spatial resolution of 20 by 30 m, and a swath of 20 km. Although the swath is thus limited, observations over continental-scale areas are requested within the HISUI mission lifetime of three years. We are therefore developing a scheduling algorithm to generate effective observation plans. The HISUI scheduling algorithm generates observation plans automatically based on the platform orbit, observation area maps (called DAR, "Data Acquisition Requests", in the HISUI project), their priorities, and the available resources and limitations of the HISUI system, such as instrument operation time per orbit and data transfer capability. We also need to set adequate DAR before the start of HISUI observations, because years of observations are needed to cover continental-scale areas, and the requests are difficult to change after the mission has started. To address these issues, we have developed an observation simulator. The simulator's key inputs are the DAR, the ISS orbit, HISUI limitations on observation minutes per orbit and data storage, and past cloud coverage data for the term of HISUI observations (3 years). The outputs of the simulator are coverage maps for each day; areas with cloud-free imagery are accumulated over the term of observation, up to three years. We have successfully tested the simulator on tentative DAR and found that it is possible to estimate coverage for each request over the mission lifetime.
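The day-by-day coverage accumulation such a simulator performs can be sketched on a gridded request map; the input shapes and the per-cell Bernoulli cloud model below are simplifying assumptions, not the actual HISUI simulator design:

```python
import numpy as np

def simulate_coverage(requested, passes, cloud_prob, days, seed=0):
    """Toy acquisition simulator: a cell of the requested area counts as
    covered once at least one cloud-free scene has been acquired over it.
    `passes[d]` is the boolean swath mask for day d (cycled), and
    `cloud_prob` is the per-cell climatological cloud fraction.
    Returns the final covered mask and the daily covered fraction."""
    rng = np.random.default_rng(seed)
    covered = np.zeros_like(requested, dtype=bool)
    history = []
    for d in range(days):
        clear = rng.random(requested.shape) >= cloud_prob
        covered |= requested & passes[d % len(passes)] & clear
        history.append(covered.sum() / max(requested.sum(), 1))
    return covered, history
```

Running this over candidate DAR sets lets one compare, before launch, how quickly each request would be satisfied under historical cloud statistics.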
ECOD: new developments in the evolutionary classification of domains
Schaeffer, R. Dustin; Liao, Yuxing; Cheng, Hua; Grishin, Nick V.
2017-01-01
Evolutionary Classification Of protein Domains (ECOD) (http://prodata.swmed.edu/ecod) comprehensively classifies proteins with known spatial structures maintained by the Protein Data Bank (PDB) into evolutionary groups of protein domains. ECOD relies on a combination of automatic and manual weekly updates to achieve its high accuracy and coverage with a short update cycle. ECOD classifies the approximately 120 000 depositions of the PDB into more than 500 000 domains in ∼3400 homologous groups. We show the performance of the weekly update pipeline since the release of ECOD, describe improvements to the ECOD website and available search options, and discuss novel structures and homologous groups that have been classified in the recent updates. Finally, we discuss the future directions of ECOD and further improvements planned for the hierarchy and update process. PMID:27899594
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crain, Steven P.; Yang, Shuang-Hong; Zha, Hongyuan
Access to health information by consumers is hampered by a fundamental language gap. Current attempts to close the gap leverage consumer-oriented health information, which does not, however, have good coverage of slang medical terminology. In this paper, we present a Bayesian model to automatically align documents with different dialects (slang, common and technical) while extracting their semantic topics. The proposed diaTM model enables effective information retrieval, even when the query contains slang words, by explicitly modeling the mixtures of dialects in documents and the joint influence of dialects and topics on word selection. Simulations using consumer questions to retrieve medical information from a corpus of medical documents show that diaTM achieves a 25% improvement in information retrieval relevance by nDCG@5 over an LDA baseline.
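nDCG@5, the relevance metric cited above, discounts gains logarithmically by rank and normalizes by the ideal ordering. A minimal version using linear gains (the paper may use the exponential-gain variant):

```python
import math

def ndcg_at_k(relevances, k=5):
    """Normalized discounted cumulative gain at rank k.
    `relevances` are graded relevance judgments in ranked order."""
    def dcg(rels):
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0
```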
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study of the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown, and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. NASA therefore commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from a wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Treatment Planning and Image Guidance for Radiofrequency Ablations of Large Tumors
Ren, Hongliang; Campos-Nanez, Enrique; Yaniv, Ziv; Banovac, Filip; Abeledo, Hernan; Hata, Nobuhiko; Cleary, Kevin
2014-01-01
This article addresses the two key challenges in computer-assisted percutaneous tumor ablation: planning multiple overlapping ablations for large tumors while avoiding critical structures, and executing the prescribed plan. Towards semi-automatic treatment planning for image-guided surgical interventions, we develop a systematic approach to the needle-based ablation placement task, ranging from pre-operative planning algorithms to an intra-operative execution platform. The planning system incorporates clinical constraints on ablations and trajectories using a multiple objective optimization formulation, which consists of optimal path selection and ablation coverage optimization based on integer programming. The system implementation is presented and validated in phantom studies and on an animal model. The presented system can potentially be further extended for other ablation techniques such as cryotherapy. PMID:24235279
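The coverage-optimization step is formulated as an integer program in the paper; a common, much simpler stand-in is greedy set cover over candidate sphere centers. The following sketch illustrates the idea only; the function names, the candidate-generation rule, and the absence of trajectory and critical-structure constraints are all simplifications.

```python
import numpy as np

def greedy_ablation_plan(tumor_mask, spacing, radius, max_ablations=20):
    """Greedily place spherical ablations: each step picks the candidate
    center (here simply an uncovered tumor voxel) that covers the most
    still-uncovered tumor voxels within `radius`."""
    pts = np.argwhere(tumor_mask) * np.asarray(spacing, float)
    uncovered = np.ones(len(pts), dtype=bool)
    centers, r2 = [], radius * radius
    while uncovered.any() and len(centers) < max_ablations:
        cand = pts[uncovered]
        # pairwise squared distances among uncovered voxels
        d2 = ((cand[:, None, :] - cand[None, :, :]) ** 2).sum(-1)
        best = cand[int(np.argmax((d2 <= r2).sum(axis=1)))]
        centers.append(best)
        uncovered &= ((pts - best) ** 2).sum(-1) > r2
    return centers
```

Integer programming, as used in the paper, can trade off ablation count, margin, and healthy-tissue damage globally, which greedy placement cannot guarantee.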
Do we need a geoelectric index?
NASA Technical Reports Server (NTRS)
Holzworth, R.; Volland, H.
1986-01-01
The need for a geoelectric index (GI) measuring the global level of atmospheric electrical activity for a given time is assessed, and methods for defining a GI are compared. Current problems in atmospheric and space electrodynamics (the global circuit, solar-terrestrial coupling, lightning effects on the ionosphere/magnetosphere, and mesospheric generators), atmospheric chemistry (the stratospheric ozone cycle and atmospheric gravity waves), and meteorology (fog forecasting) are reviewed to illustrate the usefulness of a GI. Derivations of a GI from in situ electrical measurements and from ground or satellite remote sensing of source properties are described, and a system based on ground measurement of the intensity of the Schumann resonance lines (as proposed by Polk, 1982) is found to be the most practical, requiring as few as three (automatically operated) stations for global coverage.
Mobile GPU-based implementation of automatic analysis method for long-term ECG.
Fan, Xiaomao; Yao, Qihang; Li, Ye; Chen, Runge; Cai, Yunpeng
2018-05-03
Long-term electrocardiogram (ECG) is one of the important diagnostic assistant approaches for capturing intermittent cardiac arrhythmias. The combination of miniaturized wearable holters and healthcare platforms enables people to have their cardiac condition monitored at home. The high computational burden created by concurrent processing of numerous holter recordings poses a serious challenge to the healthcare platform. An alternative solution is to shift the analysis tasks from healthcare platforms to mobile computing devices. However, long-term ECG data processing is quite time consuming due to the limited computation power of the mobile central processing unit (CPU). This paper proposes a novel parallel automatic ECG analysis algorithm that exploits the mobile graphics processing unit (GPU) to reduce the response time for processing long-term ECG data. By studying the architecture of the sequential automatic ECG analysis algorithm, we parallelized the time-consuming parts and reorganized the entire pipeline in the parallel algorithm to fully utilize the heterogeneous computing resources of the CPU and GPU. The experimental results showed that the average execution time of the proposed algorithm on a clinical long-term ECG dataset (duration 23.0 ± 1.0 h per signal) is 1.215 ± 0.140 s, achieving an average speedup of 5.81 ± 0.39× over the sequential algorithm without compromising analysis accuracy. Meanwhile, the battery energy consumption of the automatic ECG analysis algorithm was reduced by 64.16%. Excluding energy consumption from data loading, 79.44% of the energy consumption could be saved, which alleviates the problem of limited battery working hours for mobile devices.
The reduction of response time and battery energy consumption in ECG analysis not only brings a better quality of experience to holter users, but also makes it possible to use mobile devices as ECG terminals for healthcare professionals such as physicians and health advisers, enabling them to inspect patient ECG recordings onsite efficiently without the need for a high-quality wide-area network environment.
Rogalska, Justyna
2015-01-01
Since 1998, Poland has been actively participating in the Measles Elimination Program coordinated by the World Health Organization (WHO). It requires achieving and maintaining very high vaccine coverage (>95%), recording all cases and suspected cases of measles, and laboratory testing of all suspected measles cases in a WHO Reference Laboratory; in Poland, this is the Laboratory of the Department of Virology, NIPH-NIH. To confirm or exclude a case of measles, measles-specific IgM antibodies should be measured using an ELISA test, or molecular testing (PCR) should be performed to detect the presence of measles virus RNA in biological material. The aim was to assess the epidemiological situation of measles in Poland in 2013, including vaccination coverage in the Polish population, and the implementation status of the Measles Elimination Program. The descriptive analysis was based on data retrieved from the routine mandatory surveillance system and published in the annual bulletins "Infectious diseases and poisonings in Poland in 2013" and "Vaccinations in Poland in 2013", and on measles case-based reports from 2013 sent to the Department of Epidemiology NIPH-NIH by Sanitary-Epidemiological Stations. In total, 84 measles cases were registered in Poland in 2013 (incidence 0.22 per 100,000). The highest incidence rates were observed among infants (2.18 per 100,000) and children aged 1 year (1.27 per 100,000). In 2013, 56 cases (66.7%) were hospitalized due to measles. No deaths from measles were reported. Vaccination coverage of children and youth aged 2-11 years ranged from 82.8% to 99.5% (primary vaccination in children born in 2007-2012) and from 73.6% to 93.2% (booster dose in children born in 2001-2004). In 2013, 127 measles-compatible cases were reported (67% of expected reports). Sixty-seven cases (80%) were confirmed by IgM ELISA test. In 2013, the epidemiological situation of measles deteriorated in comparison to the preceding year.
The sensitivity of measles surveillance has improved but is still insufficient. It is necessary to further promote the Measles Elimination Program in Poland, to improve the measles surveillance system, and to maintain high immunisation coverage.
Multiple imputation of missing fMRI data in whole brain analysis
Vaden, Kenneth I.; Gebregziabher, Mulugeta; Kuchinsky, Stefanie E.; Eckert, Mark A.
2012-01-01
Whole brain fMRI analyses rarely include the entire brain because of missing data that result from data acquisition limits and susceptibility artifact, in particular. This missing data problem is typically addressed by omitting voxels from analysis, which may exclude brain regions that are of theoretical interest and increase the potential for Type II error at cortical boundaries or Type I error when spatial thresholds are used to establish significance. Imputation could significantly expand statistical map coverage, increase power, and enhance interpretations of fMRI results. We examined multiple imputation for group level analyses of missing fMRI data using methods that leverage the spatial information in fMRI datasets for both real and simulated data. Available case analysis, neighbor replacement, and regression based imputation approaches were compared in a general linear model framework to determine the extent to which these methods quantitatively (effect size) and qualitatively (spatial coverage) increased the sensitivity of group analyses. In both real and simulated data analysis, multiple imputation provided 1) variance that was most similar to estimates for voxels with no missing data, 2) fewer false positive errors in comparison to mean replacement, and 3) fewer false negative errors in comparison to available case analysis. Compared to the standard analysis approach of omitting voxels with missing data, imputation methods increased brain coverage in this study by 35% (from 33,323 to 45,071 voxels). In addition, multiple imputation increased the size of significant clusters by 58% and number of significant clusters across statistical thresholds, compared to the standard voxel omission approach. While neighbor replacement produced similar results, we recommend multiple imputation because it uses an informed sampling distribution to deal with missing data across subjects that can include neighbor values and other predictors. 
Multiple imputation is anticipated to be particularly useful for 1) large fMRI data sets with inconsistent missing voxels across subjects and 2) addressing the problem of increased artifact at ultra-high field, which significantly limits the extent of whole brain coverage and the interpretation of results. PMID:22500925
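As a concrete illustration of the simplest comparator discussed above, neighbor replacement can be sketched as a one-pass fill from 6-connected neighbors; multiple imputation itself, which the authors recommend, additionally draws repeated samples from an informed distribution and pools the analyses. Names are illustrative, and `np.roll` wraps at volume edges, a simplification a real implementation would handle:

```python
import numpy as np

def neighbor_impute(volume, mask):
    """Fill each missing voxel (mask == False) with the mean of its
    available 6-connected neighbors; voxels with no valid neighbor
    are left unchanged."""
    filled = volume.copy()
    acc = np.zeros(volume.shape)
    cnt = np.zeros(volume.shape)
    for axis in range(volume.ndim):
        for shift in (1, -1):
            v = np.roll(volume, shift, axis=axis)
            m = np.roll(mask, shift, axis=axis)
            acc += np.where(m, v, 0.0)   # sum of valid neighbor values
            cnt += m                     # count of valid neighbors
    ok = ~mask & (cnt > 0)
    filled[ok] = acc[ok] / cnt[ok]
    return filled
```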
Allan, James R; Kormos, Cyril; Jaeger, Tilman; Venter, Oscar; Bertzky, Bastian; Shi, Yichuan; Mackey, Brendan; van Merm, Remco; Osipova, Elena; Watson, James E M
2018-02-01
Wilderness areas are ecologically intact landscapes predominantly free of human uses, especially industrial-scale activities that result in substantial biophysical disturbance. This definition does not exclude land and resource use by local communities who depend on such areas for subsistence and bio-cultural connections. Wilderness areas are important for biodiversity conservation and sustain key ecological processes and ecosystem services that underpin planetary life-support systems. Despite these widely recognized benefits and values of wilderness, they are insufficiently protected and are consequently being rapidly eroded. There are increasing calls for multilateral environmental agreements to make a greater and more systematic contribution to wilderness conservation before it is too late. We created a global map of remaining terrestrial wilderness following the established last-of-the-wild method, which identifies the 10% of areas with the lowest human pressure within each of Earth's 62 biogeographic realms and identifies the 10 largest contiguous areas and all contiguous areas >10,000 km². We used our map to assess wilderness coverage by the World Heritage Convention and to identify gaps in coverage. We then identified large nationally designated protected areas with good wilderness coverage within these gaps. One-quarter of natural and mixed (i.e., sites of both natural and cultural value) World Heritage Sites (WHS) contained wilderness (total of 545,307 km²), which is approximately 1.8% of the world's wilderness extent. Many WHS had excellent wilderness coverage, for example, the Okavango Delta in Botswana (11,914 km²) and the Central Suriname Nature Reserve (16,029 km²). However, 22 (35%) of the world's terrestrial biorealms had no wilderness representation within WHS. We identified 840 protected areas of >500 km² that were predominantly wilderness (>50% of their area) and represented 18 of the 22 missing biorealms.
These areas offer a starting point for assessing the potential for the designation of new WHSs that could help increase wilderness representation on the World Heritage list. We urge the World Heritage Convention to ensure that the ecological integrity and outstanding universal value of existing WHS with wilderness values are preserved. © 2017 Society for Conservation Biology.
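The per-realm 10% selection at the heart of the last-of-the-wild method reduces to a quantile threshold within each realm; the contiguity and minimum-size filtering (the 10 largest contiguous areas, >10,000 km²) are omitted from this sketch:

```python
import numpy as np

def last_of_the_wild(pressure, realm_ids):
    """Flag, within each biogeographic realm, the 10% of cells with the
    lowest human pressure. `pressure` and `realm_ids` are aligned arrays."""
    wild = np.zeros(pressure.shape, dtype=bool)
    for realm in np.unique(realm_ids):
        cells = realm_ids == realm
        cutoff = np.quantile(pressure[cells], 0.10)
        wild |= cells & (pressure <= cutoff)
    return wild
```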
Time trends in educational inequalities in cancer mortality in Colombia, 1998–2012
Arroyave, Ivan; Pardo, Constanza
2016-01-01
Objectives To evaluate trends in premature cancer mortality in Colombia by educational level in three periods: 1998–2002, with low healthcare insurance coverage; 2003–2007, with rapidly increasing coverage; and 2008–2012, with almost universal coverage. Setting Colombian population-based, national secondary mortality data. Participants We included all (n=188 091) cancer deaths occurring in the age group 20–64 years between 1998 and 2012, excluding only cases with low levels of quality of registration (n=2902, 1.5%). Primary and secondary outcome measures In this descriptive study, we linked mortality data of ages 20–64 years to census data to obtain age-standardised cancer mortality rates by educational level. Using Poisson regression, we modelled premature mortality by educational level, estimating rate ratios (RR), the relative index of inequality (RII) and the slope index of inequality (SII). Results Relative measures showed increased risks of dying among the lower educated compared with the highest educated; this tendency was stronger in women (RR primary 1.49; RR secondary 1.22, both p<0.0001) than in men (RR primary 1.35; RR secondary 1.11, both p<0.0001). In absolute terms (SII), cancer mortality differed between the lowest and highest educated by 20.5 deaths per 100 000 in males and 28.5 in females. RII was significantly higher among women and in the younger age categories. RII decreased between the first and second periods; afterwards (2008–2012), it increased significantly back to its previous level. Among women, no significant increases or declines in cancer mortality over time were observed in recent periods in the lowest educated group, whereas strong recent declines were observed in those with secondary education or higher. Conclusions Educational inequalities in cancer mortality in Colombia are increasing in absolute and relative terms, and are concentrated in young age categories.
This trend was not curbed by increases in healthcare insurance coverage. Policymakers should focus on improving equal access to prevention, early detection, diagnostic and treatment facilities. PMID:27048630
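For orientation, the building block of such inequality measures is the education-specific mortality rate and its ratio to the highest-educated reference group. A crude (non-age-standardized) sketch with hypothetical counts; the paper's RR, RII and SII come from Poisson regression on age-standardized data:

```python
def mortality_rates(deaths, person_years, ref="tertiary", per=100_000):
    """Crude mortality rate per 100,000 by education level and the rate
    ratio of each level versus the reference (highest-educated) group."""
    rates = {k: per * deaths[k] / person_years[k] for k in deaths}
    ratios = {k: r / rates[ref] for k, r in rates.items()}
    return rates, ratios
```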
NASA Astrophysics Data System (ADS)
Notti, Davide; Calò, Fabiana; Cigna, Francesca; Manunta, Michele; Herrera, Gerardo; Berti, Matteo; Meisina, Claudia; Tapete, Deodato; Zucca, Francesco
2015-11-01
Recent advances in multi-temporal Differential Synthetic Aperture Radar (SAR) Interferometry (DInSAR) have greatly improved our capability to monitor geological processes. Ground motion studies using DInSAR require both the availability of good quality input data and rigorous approaches to exploit the retrieved Time Series (TS) at their full potential. In this work we present a methodology for DInSAR TS analysis, with particular focus on landslides and subsidence phenomena. The proposed methodology consists of three main steps: (1) pre-processing, i.e., assessment of a SAR Dataset Quality Index (SDQI); (2) post-processing, i.e., application of empirical/stochastic methods to improve the TS quality; and (3) trend analysis, i.e., comparative implementation of methodologies for automatic TS analysis. Tests were carried out on TS datasets retrieved from processing of SAR imagery acquired by different radar sensors (i.e., ERS-1/2 SAR, RADARSAT-1, ENVISAT ASAR, ALOS PALSAR, TerraSAR-X, COSMO-SkyMed) using advanced DInSAR techniques (i.e., SqueeSAR™, PSInSAR™, SPN and SBAS). The obtained values of SDQI are discussed against the technical parameters of each data stack (e.g., radar band, number of SAR scenes, temporal coverage, revisiting time), the retrieved coverage of the DInSAR results, and the constraints related to the characterization of the investigated geological processes. Empirical and stochastic approaches were used to demonstrate how the quality of the TS can be improved after the SAR processing, and examples are discussed to mitigate phase unwrapping errors, and remove regional trends, noise and anomalies. Performance assessment of recently developed methods of trend analysis (i.e., PS-Time, Deviation Index and velocity TS) was conducted on two selected study areas in Northern Italy affected by land subsidence and landslides.
Results show that the automatic detection of motion trends enhances the interpretation of DInSAR data, since it provides an objective picture of the deformation behaviour recorded through TS and therefore contributes to the understanding of the on-going geological processes.
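A basic building block of such automatic trend analysis is a least-squares linear fit of each displacement time series, with the residual level flagging departures from steady motion. A minimal sketch; thresholds and the actual classification logic (e.g., PS-Time's trend typology) are left out:

```python
import numpy as np

def linear_velocity(times, displacements):
    """Least-squares linear trend of a displacement time series:
    returns (velocity, intercept, rmse of residuals). Large residuals
    relative to measurement noise suggest nonlinear motion."""
    t = np.asarray(times, float)
    d = np.asarray(displacements, float)
    A = np.vstack([t, np.ones_like(t)]).T
    (v, c), *_ = np.linalg.lstsq(A, d, rcond=None)
    resid = d - (v * t + c)
    return v, c, float(np.sqrt(np.mean(resid ** 2)))
```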
Detection of benign prostatic hyperplasia nodules in T2W MR images using fuzzy decision forest
NASA Astrophysics Data System (ADS)
Lay, Nathan; Freeman, Sabrina; Turkbey, Baris; Summers, Ronald M.
2016-03-01
Prostate cancer is the second leading cause of cancer-related death in men. MRI has proven useful for detecting prostate cancer, and computer-aided diagnosis (CAD) may further improve detection. One source of false positives in prostate CAD is the presence of benign prostatic hyperplasia (BPH) nodules. These nodules have a distinct appearance, with a pseudo-capsule, on T2-weighted MR images, but can also resemble cancerous lesions in other sequences such as ADC or high-b-value images. Describing their appearance with hand-crafted heuristics (features) that also exclude the appearance of cancerous lesions is challenging. This work develops a method based on fuzzy decision forests to automatically learn discriminative features for BPH nodule detection in T2-weighted images, with the aim of improving prostate CAD systems.
Storm, Lance; Tressoldi, Patrizio E; Utts, Jessica
2013-01-01
Rouder, Morey, and Province (2013) stated that (a) the evidence-based case for psi in Storm, Tressoldi, and Di Risio's (2010) meta-analysis is supported only by a number of studies that used manual randomization, and (b) when these studies are excluded so that only investigations using automatic randomization are evaluated (and some additional studies previously omitted by Storm et al., 2010, are included), the evidence for psi is "unpersuasive." Rouder et al. used a Bayesian approach, and we adopted the same methodology, finding that our case is upheld. Because of recent updates and corrections, we reassessed the free-response databases of Storm et al. using a frequentist approach. We discuss and critique the assumptions and findings of Rouder et al. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Universality in quantum chaos and the one-parameter scaling theory.
García-García, Antonio M; Wang, Jiao
2008-02-22
The one-parameter scaling theory is adapted to the context of quantum chaos. We define a generalized dimensionless conductance, g, semiclassically and then study Anderson localization corrections by renormalization group techniques. This analysis permits a characterization of the universality classes associated with a metal (g → ∞), an insulator (g → 0), and the metal-insulator transition (g → g_c) in quantum chaos, provided that the classical phase space is not mixed. According to our results, the universality class related to the metallic limit includes all systems in which the Bohigas-Giannoni-Schmit conjecture holds but automatically excludes those in which dynamical localization effects are important. The universality class related to the metal-insulator transition is characterized by classical superdiffusion or a fractal spectrum in low dimensions (d ≤ 2). Several examples are discussed in detail.
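For reference, the one-parameter scaling hypothesis that the abstract adapts is usually stated through the beta function of the dimensionless conductance g at system size L:

```latex
\beta(g) = \frac{d\ln g}{d\ln L}, \qquad
\begin{cases}
\beta(g) > 0 & \text{metallic limit, } g \to \infty,\\
\beta(g) < 0 & \text{insulating limit, } g \to 0,\\
\beta(g_c) = 0 & \text{metal-insulator transition.}
\end{cases}
```

The three regimes in this piecewise description correspond directly to the three universality classes characterized in the abstract.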
Towards Direct Manipulation and Remixing of Massive Data: The EarthServer Approach
NASA Astrophysics Data System (ADS)
Baumann, P.
2012-04-01
Complex analytics on "big data" is one of the core challenges of current Earth science, generating strong requirements for on-demand processing and filtering of massive data sets. Issues under discussion include flexibility, performance, scalability, and the heterogeneity of the information types involved. In other domains, high-level query languages (such as those offered by database systems) have proven successful in the quest for flexible, scalable data access interfaces to massive amounts of data. However, due to the lack of support for many of the Earth science data structures, database systems are only used for registries and catalogs, but not for the bulk of spatio-temporal data. One core information category in this field is coverage data. ISO 19123 defines coverages, in simplified terms, as representations of a "space-time varying phenomenon". This model can express a large class of Earth science data structures, including rectified and non-rectified rasters, curvilinear grids, point clouds, TINs, general meshes, trajectories, surfaces, and solids. This abstract definition, which is too high-level to establish interoperability, is concretized by the OGC GML 3.2.1 Application Schema for Coverages Standard into an interoperable representation. The OGC Web Coverage Processing Service (WCPS) Standard defines a declarative query language on multi-dimensional raster-type coverages, such as 1D in-situ sensor timeseries, 2D EO imagery, 3D x/y/t image time series and x/y/z geophysical data, and 4D x/y/z/t climate and ocean data. Hence, important ingredients for versatile coverage retrieval are in place - however, this potential has not been fully unleashed by service architectures up to now. The EU FP7-INFRA project EarthServer, launched in September 2011, aims at enabling standards-based on-demand analytics over the Web for Earth science data, based on an integration of W3C XQuery for alphanumeric data and OGC WCPS for raster data. 
Ultimately, EarthServer will support all OGC coverage types. The platform used by EarthServer is the rasdaman raster database system. To exploit heterogeneous multi-parallel platforms, automatic request distribution and orchestration is being established. Client toolkits are under development which will allow bespoke interactive clients to be composed quickly, ranging from mobile devices over Web clients to high-end immersive virtual reality. The EarthServer platform has been deployed in six large-scale data centres with the aim of setting up Lighthouse Applications addressing all Earth sciences, including satellite and airborne Earth observation as well as use cases from atmosphere, ocean, snow, and ice monitoring, and geology on Earth and Mars. These services, each of which will ultimately host at least 100 TB, will form a peer cloud with distributed query processing for arbitrarily mixing database and in-situ access. With its ability to directly manipulate, analyze and remix massive data, the goal of EarthServer is to lift the data providers' semantic level from data stewardship to service stewardship.
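The kind of per-pixel band arithmetic that a declarative WCPS query expresses (for instance an NDVI-style ratio over a two-band coverage) can be sketched in Python; the band names and the coverage layout here are hypothetical stand-ins, not the WCPS data model itself:

```python
def eval_band_expr(coverage, expr):
    """Evaluate a per-pixel band expression over a coverage stored as
    {band_name: 2-D list of values}. 'expr' is a Python function of a
    per-pixel band dict, standing in for a declarative WCPS expression
    such as (nir - red) / (nir + red)."""
    bands = list(coverage.values())
    rows, cols = len(bands[0]), len(bands[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            pixel = {name: grid[i][j] for name, grid in coverage.items()}
            out[i][j] = expr(pixel)
    return out

# Hypothetical two-band coverage; NDVI-style ratio computed per pixel.
cov = {"red": [[10.0, 20.0]], "nir": [[30.0, 20.0]]}
ndvi = eval_band_expr(cov, lambda p: (p["nir"] - p["red"]) / (p["nir"] + p["red"]))
```

A WCPS server performs this kind of evaluation on the server side, so only the (usually much smaller) result travels to the client.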
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. 
By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
Terminology extraction from medical texts in Polish
2014-01-01
Background Hospital documents contain free text describing the most important facts relating to patients and their illnesses. These documents are written in a specific language containing medical terminology related to hospital treatment. Their automatic processing can help in verifying the consistency of hospital documentation and obtaining statistical data. To perform this task we need information on the phrases we are looking for. At the moment, clinical Polish resources are sparse. The existing terminologies, such as Polish Medical Subject Headings (MeSH), do not provide sufficient coverage for clinical tasks. It would be helpful therefore if it were possible to automatically prepare, on the basis of a data sample, an initial set of terms which, after manual verification, could be used for the purpose of information extraction. Results Using a combination of linguistic and statistical methods for processing over 1200 children's hospital discharge records, we obtained a list of single- and multiword terms used in hospital discharge documents written in Polish. The phrases are ordered according to their presumed importance in domain texts, measured by the frequency of use of a phrase and the variety of its contexts. The evaluation showed that the automatically identified phrases cover about 84% of terms in domain texts. At the top of the ranked list, only 4% out of 400 terms were incorrect, while out of the final 200, 20% of expressions were either not domain related or syntactically incorrect. We also observed that 70% of the obtained terms are not included in the Polish MeSH. Conclusions Automatic terminology extraction can give results which are of a quality high enough to be taken as a starting point for building domain-related terminological dictionaries or ontologies. This approach can be useful for preparing terminological resources for very specific subdomains for which no relevant terminologies already exist. 
The evaluation performed showed that none of the tested ranking procedures were able to filter out all improperly constructed noun phrases from the top of the list. Careful choice of noun phrases is crucial to the usefulness of the created terminological resource in applications such as lexicon construction or acquisition of semantic relations from texts. PMID:24976943
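The ranking principle described above, ordering candidate phrases by frequency of use and variety of contexts, can be sketched as follows. The exact weighting used in the study is not given in the abstract; frequency multiplied by the number of distinct contexts is a simple stand-in:

```python
def rank_terms(phrase_occurrences):
    """Rank candidate terms by frequency weighted by context variety.
    'phrase_occurrences' maps each phrase to a list of context words,
    one entry per occurrence. Score = frequency * distinct-context count
    (an illustrative stand-in for the paper's measure)."""
    scores = {}
    for phrase, contexts in phrase_occurrences.items():
        freq = len(contexts)
        variety = len(set(contexts))
        scores[phrase] = freq * variety
    return sorted(scores, key=scores.get, reverse=True)

# A phrase seen often and in varied contexts outranks one seen in a
# single repeated context.
ranked = rank_terms({"heart failure": ["acute", "chronic", "with"],
                     "the patient": ["of", "of"]})
```

A filtering step for improperly constructed noun phrases, as the evaluation above notes, would still be needed on top of any such score.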
Terminology extraction from medical texts in Polish.
Marciniak, Małgorzata; Mykowiecka, Agnieszka
2014-01-01
Hospital documents contain free text describing the most important facts relating to patients and their illnesses. These documents are written in a specific language containing medical terminology related to hospital treatment. Their automatic processing can help in verifying the consistency of hospital documentation and obtaining statistical data. To perform this task we need information on the phrases we are looking for. At the moment, clinical Polish resources are sparse. The existing terminologies, such as Polish Medical Subject Headings (MeSH), do not provide sufficient coverage for clinical tasks. It would be helpful therefore if it were possible to automatically prepare, on the basis of a data sample, an initial set of terms which, after manual verification, could be used for the purpose of information extraction. Using a combination of linguistic and statistical methods for processing over 1200 children's hospital discharge records, we obtained a list of single- and multiword terms used in hospital discharge documents written in Polish. The phrases are ordered according to their presumed importance in domain texts, measured by the frequency of use of a phrase and the variety of its contexts. The evaluation showed that the automatically identified phrases cover about 84% of terms in domain texts. At the top of the ranked list, only 4% out of 400 terms were incorrect, while out of the final 200, 20% of expressions were either not domain related or syntactically incorrect. We also observed that 70% of the obtained terms are not included in the Polish MeSH. Automatic terminology extraction can give results which are of a quality high enough to be taken as a starting point for building domain-related terminological dictionaries or ontologies. This approach can be useful for preparing terminological resources for very specific subdomains for which no relevant terminologies already exist. 
The evaluation performed showed that none of the tested ranking procedures were able to filter out all improperly constructed noun phrases from the top of the list. Careful choice of noun phrases is crucial to the usefulness of the created terminological resource in applications such as lexicon construction or acquisition of semantic relations from texts.
NASA Astrophysics Data System (ADS)
Mouffe, M.; Getirana, A.; Ricci, S. M.; Lion, C.; Biancamaria, S.; Boone, A.; Mognard, N. M.; Rogel, P.
2011-12-01
The Surface Water and Ocean Topography (SWOT) mission is a swath mapping radar interferometer that will provide global measurements of water surface elevation (WSE). The number of revisits depends upon latitude and varies from two (low latitudes) to ten (high latitudes) per 22-day orbit repeat period. The high resolution and the global coverage of the SWOT data open the way for new hydrology studies. Here, the aim is to investigate the use of virtually generated SWOT data to improve discharge simulation using data assimilation techniques. In the framework of the SWOT virtual mission (VM), this study presents the first results of the automatic calibration of a global flow routing (GFR) scheme using SWOT VM measurements for the Amazon basin. The Hydrological Modeling and Analysis Platform (HyMAP) is used along with the MOCOM-UA multi-criteria global optimization algorithm. HyMAP has a 0.25-degree spatial resolution and runs at the daily time step to simulate discharge, water levels and floodplains. The surface runoff and baseflow drainage derived from the Interactions Sol-Biosphère-Atmosphère (ISBA) model are used as inputs for HyMAP. Previous work showed that the use of ENVISAT data enables the reduction of the uncertainty on some of the hydrological model parameters, such as river width and depth, Manning roughness coefficient and groundwater time delay. In the framework of the SWOT preparation work, the automatic calibration procedure was applied using SWOT VM measurements. For this Observing System Experiment (OSE), the synthetic data were obtained by applying an instrument simulator (representing realistic SWOT errors) to one hydrological year of HyMAP-simulated WSE generated with a "true" set of parameters. Only pixels representing rivers larger than 100 meters within the Amazon basin are considered to produce SWOT VM measurements. 
The automatic calibration procedure leads to the estimation of optimal parameters minimizing objective functions that formulate the difference between SWOT observations and modeled WSE using a perturbed set of parameters. Different formulations of the objective function were used, especially to account for SWOT observation errors, as well as various sets of calibration parameters.
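The calibration loop described above, minimizing an objective that measures the mismatch between observed and modeled WSE over candidate parameter sets, can be sketched with a toy two-parameter model and a plain random search. Both the model and the single-objective search are simplified stand-ins (the study uses HyMAP and the multi-criteria MOCOM-UA algorithm):

```python
import random

def simulate_wse(params, times):
    """Toy stand-in for the hydrological model: WSE as a simple linear
    two-parameter response (hypothetical, for illustration only)."""
    a, b = params
    return [a * t + b for t in times]

def objective(params, times, observed):
    """Sum of squared differences between 'observations' and the model
    run with a candidate parameter set."""
    sim = simulate_wse(params, times)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

def random_search(times, observed, n_iter=2000, seed=0):
    """Minimal single-objective stand-in for the global optimizer:
    sample candidate parameter sets and keep the best."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        cand = (rng.uniform(0.0, 2.0), rng.uniform(0.0, 2.0))
        cost = objective(cand, times, observed)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

times = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
observed = simulate_wse((1.0, 0.5), times)  # synthetic "truth" run
best, best_cost = random_search(times, observed)
```

In an Observing System Experiment as above, the "observations" are themselves model output perturbed with instrument errors, so the optimizer can be judged by how closely it recovers the known true parameters.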
Chen, Li; Wang, Wei; Du, Xiaozhen; Rao, Xiuqin; van Velthoven, Michelle Helena; Yang, Ruikan; Zhang, Lin; Koepsell, Jeanne Catherine; Li, Ye; Wu, Qiong; Zhang, Yanfeng
2014-03-20
Although good progress has been achieved in expanding immunization of children in China, disparities exist across different provinces. Information gaps both from the service supply and demand sides hinder timely vaccination of children in rural areas. The rapid development of mobile health technology (mHealth) provides unprecedented opportunities for improving health services and reaching underserved populations. However, there is a lack of literature that rigorously evaluates the impact of mHealth interventions on immunization coverage as well as the usability and feasibility of smart phone applications (apps). This study aims to assess the effectiveness of a smart phone-based app (Expanded Program on Immunization app, or EPI app) on improving the coverage of children's immunization. This cluster randomized trial will take place in Xuanhan County, Sichuan Province, China. Functionalities of the app include the following: to make appointments automatically, record and update children's immunization information, generate a list of children who missed their vaccination appointments, and send health education information to village doctors. After pairing, 36 villages will be randomly allocated to the intervention arm (n=18) and control arm (n=18). The village doctors in the intervention arm will use the app while the village doctors in the control arm will record and manage immunization in the usual way in their catchment areas. A household survey will be used at baseline and at endline (8 months of implementation). The primary outcome is full-dose coverage and the secondary outcome is immunization coverage of the five vaccines that are included in the national Expanded Program on Immunization program as well as Hib vaccine, Rotavirus vaccine and Pneumococcal conjugate vaccine. Multidimensional evaluation of the app will also be conducted to assess usability and feasibility. 
This study is the first to evaluate the effectiveness of a smart phone app for child immunization in rural China. This study will contribute to the knowledge about the usability and feasibility of a smart phone app for managing immunization in rural China and to similar populations in different settings. Chinese Clinical Trials Registry (ChiCTR): ChiCTR-TRC-13003960.
Yock, Adam D; Kim, Gwe-Ya
2017-09-01
To present the k-means clustering algorithm as a tool to address treatment planning considerations characteristic of stereotactic radiosurgery using a single isocenter for multiple targets. For 30 patients treated with stereotactic radiosurgery for multiple brain metastases, the geometric centroids and radii of each metastasis were determined from the treatment planning system. In-house software used this information, together with weighted and unweighted versions of the k-means clustering algorithm, to group the targets to be treated with a single isocenter and to position each isocenter. The algorithm results were evaluated using the within-cluster sum of squares as well as a minimum target coverage metric that considered the effect of target size. Both versions of the algorithm were applied to an example patient to demonstrate the prospective determination of the appropriate number and location of isocenters. Both weighted and unweighted versions of the k-means algorithm were applied successfully to determine the number and position of isocenters. Comparing the two, both the within-cluster sum of squares metric and the minimum target coverage metric resulting from the unweighted version were less than those from the weighted version. The average magnitudes of the differences were small (-0.2 cm² and 0.1% for the within-cluster sum of squares and minimum target coverage, respectively) but statistically significant (Wilcoxon signed-rank test, P < 0.01). The differences between the versions of the k-means clustering algorithm represented an advantage of the unweighted version for the within-cluster sum of squares metric, and an advantage of the weighted version for the minimum target coverage metric. 
While additional treatment planning considerations have a large influence on the final treatment plan quality, both versions of the k-means algorithm provide automatic, consistent, quantitative, and objective solutions to the tasks associated with SRS treatment planning using a single isocenter for multiple targets. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
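The clustering step above, grouping target centroids around shared isocenters, is standard Lloyd's k-means; a pure-Python sketch with an optional weighted centroid update follows. The choice of weights (e.g. by target size) and the initialization are assumptions here, not the paper's exact scheme:

```python
def kmeans(points, k, weights=None, n_iter=50):
    """Lloyd's algorithm on 3-D target centroids. If 'weights' is given,
    cluster centers are weighted means (the weighted variant); otherwise
    plain means. Initial centers: the first k points (an assumption)."""
    if weights is None:
        weights = [1.0] * len(points)
    centers = [list(p) for p in points[:k]]
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for p, w in zip(points, weights):
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append((p, w))
        for idx, members in enumerate(clusters):
            if members:
                tot = sum(w for _, w in members)
                centers[idx] = [sum(w * p[dim] for p, w in members) / tot
                                for dim in range(3)]
    assignments = []
    for p in points:
        d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
        assignments.append(d.index(min(d)))
    return centers, assignments

# Two well-separated pairs of metastases -> two isocenters.
pts = [(0.0, 0.0, 0.0), (10.0, 10.0, 10.0),
       (0.5, 0.0, 0.0), (10.0, 10.0, 11.0)]
centers, assign = kmeans(pts, 2)
```

Repeating the run for increasing k and inspecting the within-cluster sum of squares gives the prospective determination of the number of isocenters described above.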
Erber, Wilhelm; Schmitt, Heinz-Josef
2018-05-01
Adequate vaccination is effective in preventing tick-borne encephalitis (TBE). A population survey conducted in 2015 in the Czech Republic, Estonia, Finland, Germany, Hungary, Latvia, Lithuania, Poland, Slovakia, Slovenia, and Sweden obtained information on TBE vaccination. Respondents answered 10 questions for themselves and household members. Data were weighted according to age and fine-tuned for geographical spread. Across the 10 countries (excluding Poland), TBE awareness was 83%; of all respondents, 68% were aware of TBE vaccines and 25% had received ≥1 injection. Vaccination rates were lowest in Finland and Slovakia (∼10%), highest in Austria (85%, results from a separate 2015 survey), and varied widely in Germany. Across the 11 countries (excluding Austria), compliance with the vaccination schedule among TBE-vaccinated respondents was 61%; 27% and 15% of respondents received first and second booster injections; the strongest motivators for vaccination were fear of TBE (38%) and residence/spending time in high-risk areas (31-35%); the main reasons for not receiving vaccination were beliefs that vaccination was unnecessary (33%) and that there was no risk of contracting TBE (23%). TBE vaccine uptake and compliance could be improved with effective public health information to increase TBE awareness and trust in vaccination and by updating recommendations to include all subjects visiting TBE-risk areas. Copyright © 2018 The Authors. Published by Elsevier GmbH. All rights reserved.
Generic comparison of protein inference engines.
Claassen, Manfred; Reiter, Lukas; Hengartner, Michael O; Buhmann, Joachim M; Aebersold, Ruedi
2012-04-01
Protein identifications, instead of peptide-spectrum matches, constitute the biologically relevant result of shotgun proteomics studies. How to appropriately infer and report protein identifications has triggered a still ongoing debate. This debate has so far suffered from the lack of appropriate performance measures that allow us to objectively assess protein inference approaches. This study describes an intuitive, generic and yet formal performance measure and demonstrates how it enables experimentalists to select an optimal protein inference strategy for a given collection of fragment ion spectra. We applied the performance measure to systematically explore the benefit of excluding possibly unreliable protein identifications, such as single-hit wonders. Therefore, we defined a family of protein inference engines by extending a simple inference engine by thousands of pruning variants, each excluding a different specified set of possibly unreliable identifications. We benchmarked these protein inference engines on several data sets representing different proteomes and mass spectrometry platforms. Optimally performing inference engines retained all high confidence spectral evidence, without posterior exclusion of any type of protein identifications. Despite the diversity of studied data sets consistently supporting this rule, other data sets might behave differently. In order to ensure maximal reliable proteome coverage for data sets arising in other studies we advocate abstaining from rigid protein inference rules, such as exclusion of single-hit wonders, and instead consider several protein inference approaches and assess these with respect to the presented performance measure in the specific application context.
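The family of pruning variants described above, each excluding a different class of possibly unreliable identifications, can be illustrated with the single-hit-wonder rule. The data layout is a hypothetical simplification (one flat peptide-to-protein map, no scores):

```python
from collections import defaultdict

def infer_proteins(peptide_hits, min_peptides=1):
    """Simple protein inference: report a protein if it is matched by at
    least 'min_peptides' distinct peptides. min_peptides=2 excludes
    'single-hit wonders'; min_peptides=1 retains all spectral evidence,
    the strategy the benchmark above found optimal. 'peptide_hits' maps
    peptide sequences to the proteins they map to."""
    support = defaultdict(set)
    for peptide, proteins in peptide_hits.items():
        for protein in proteins:
            support[protein].add(peptide)
    return {p for p, peps in support.items() if len(peps) >= min_peptides}

hits = {"PEP1": ["A"], "PEP2": ["A", "B"], "PEP3": ["C"]}
```

Sweeping `min_peptides` (and analogous exclusion criteria) over the same spectra, then scoring each variant with a common performance measure, is the spirit of the benchmark.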
A bibliography of references to avian cholera
Wilson, Sonoma S.
1979-01-01
Mrs. Wilson has made a genuine effort to include in this bibliography every significant reference to avian cholera since Louis Pasteur's articles appeared in 1880, although she recognizes the likelihood that a few have been overlooked. New listings have been added throughout 1978, but comprehensive coverage of the literature cannot be claimed beyond June of that year. Textbook accounts, because they are generally summaries of work published elsewhere, are excluded. Papers dealing primarily with the biology of Pasteurella multocida, as opposed to the disease it induces in birds, are also excluded, unless they report information of diagnostic usefulness. Short abstracts are not included unless the journals in which they are published are more widely available than those in which the complete articles appear, or they are English summaries of foreign language articles. In compiling this bibliography, Mrs. Wilson has made extensive use of Biological Abstracts, the Pesticide Documentation Bulletin, and printouts generated by Bibliographic Retrieval Services, Inc. The "Literature Cited" sections of textbooks and journal articles pertinent to the subject were sources of many additional references. Regardless of the origin of the citation, its accuracy was confirmed by comparison with the original publication, except in those few instances (marked with an asterisk) when the journal was not on the shelves of the libraries accessible to us. The author will be grateful to users of the bibliography who point out errors or omissions.
Wayne I. Jensen, Microbiologist in Charge
2010-01-01
Background Previously, two prediction rules identifying children at risk of hearing loss and academic or behavioral limitations after bacterial meningitis were developed. Streptococcus pneumoniae as causative pathogen was an important risk factor in both. Since 2006, Dutch children receive seven-valent conjugate vaccination against S. pneumoniae. The presumed effect of vaccination was simulated by excluding all children infected by S. pneumoniae with the serotypes included in the vaccine from both previously collected cohorts (1990-1995). Methods Children infected by one of the vaccine serotypes were excluded from both original cohorts (hearing loss: 70 of 628 children; academic or behavioral limitations: 26 of 182 children). All identified risk factors were included in multivariate logistic regression models. The discriminative ability of both new models was calculated. Results The same risk factors as in the original models were significant. The discriminative ability of the original hearing loss model was 0.84 and of the new model 0.87. For the academic or behavioral limitations model it was 0.83 and 0.84, respectively. Conclusion It can be assumed that the prediction rules will also be applicable to a vaccinated population. However, vaccination does not provide 100% coverage and evidence is available that serotype replacement will occur. The impact of vaccination on serotype replacement needs to be investigated, and the prediction rules must be validated externally. PMID:20815866
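The discriminative ability reported above is the area under the ROC curve; it can be computed directly from predicted risks and outcomes by the rank (Mann-Whitney) method, sketched here on hypothetical toy data:

```python
def auc(labels, scores):
    """Discriminative ability (area under the ROC curve) via the
    Mann-Whitney statistic: the fraction of (positive, negative) pairs
    in which the positive case receives the higher score; ties count as
    half. labels: 1 = event, 0 = no event."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 means the model discriminates no better than chance; values around 0.84-0.87, as in the models above, indicate good discrimination.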
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barragán, A. M., E-mail: ana.barragan@uclouvain.be; Differding, S.; Lee, J. A.
Purpose: To prove the ability of protons to reproduce a dose gradient that matches a dose painting by numbers (DPBN) prescription in the presence of setup and range errors, by using contours and structure-based optimization in a commercial treatment planning system. Methods: For two patients with head and neck cancer, a voxel-by-voxel prescription to the target volume (GTV_PET) was calculated from ¹⁸FDG-PET images and approximated with several discrete prescription subcontours. Treatments were planned with proton pencil beam scanning. In order to determine the optimal plan parameters to approach the DPBN prescription, the effects of the scanning pattern, number of fields, number of subcontours, and use of a range shifter were separately tested on each patient. Different constant scanning grids (i.e., spot spacing = Δx = Δy = 3.5, 4, and 5 mm) and uniform energy layer separations [4 and 5 mm WED (water equivalent distance)] were analyzed versus a dynamic and automatic selection of the spot grid. The number of subcontours was increased from 3 to 11 while the number of beams was set to 3, 5, or 7. Conventional PTV-based and robust clinical target volume (CTV)-based optimization strategies were considered and their robustness against range and setup errors assessed. Because of the nonuniform prescription, ensuring robustness for coverage of GTV_PET inevitably leads to overdosing, which was compared for both optimization schemes. Results: The optimal number of subcontours ranged from 5 to 7 for both patients. All considered scanning grids achieved accurate dose painting (1% average difference between the prescribed and planned doses). PTV-based plans led to nonrobust target coverage, while robust-optimized plans improved it considerably (the difference between the worst-case CTV dose and the clinical constraint was up to 3 Gy for PTV-based plans and did not exceed 1 Gy for robust CTV-based plans). 
Also, only 15% of the points in the GTV_PET (worst case) were above 5% of the DPBN prescription for robust-optimized plans, while they were more than 50% for PTV plans. Low dose to organs at risk (OARs) could be achieved for both PTV and robust-optimized plans. Conclusions: DPBN in proton therapy is feasible with the use of a sufficient number of subcontours, automatically generated scanning patterns, and no more than three beams. Robust optimization ensured the required target coverage and minimal overdosing, while the PTV approach led to nonrobust plans with excessive overdose. Low dose to OARs can be achieved even in the presence of a high-dose escalation as in DPBN.
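The approximation step above, replacing a continuous voxel-wise prescription with a handful of discrete subcontour dose levels, can be sketched as a simple quantization. Equal-width dose bins are an assumption here; the study only reports that 5-7 subcontours sufficed, not the binning scheme:

```python
def make_subcontours(prescriptions, n_levels):
    """Approximate a voxel-wise (DPBN) prescription with 'n_levels'
    discrete subcontour dose levels using equal-width bins between the
    minimum and maximum prescribed dose. Returns, per voxel, the pair
    (subcontour index, bin-center dose)."""
    lo, hi = min(prescriptions), max(prescriptions)
    width = (hi - lo) / n_levels
    out = []
    for d in prescriptions:
        idx = min(int((d - lo) / width), n_levels - 1)  # clamp the max dose
        out.append((idx, lo + (idx + 0.5) * width))
    return out
```

Each resulting index defines one subcontour, which the treatment planning system can then optimize against as an ordinary structure with a uniform dose goal.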
Alex, J; Kolisch, G; Krause, K
2002-01-01
The objective of the project presented here is to use the results of a CFD simulation to automatically, systematically and reliably generate an appropriate model structure for simulation of the biological processes using CSTR activated sludge compartments. Models and dynamic simulation have become important tools for research, but also increasingly for the design and optimisation of wastewater treatment plants. Besides the biological models, several cases are reported of the application of computational fluid dynamics (CFD) to wastewater treatment plants. One aim of the presented method to derive model structures from CFD results is to exclude the influence of empirical structure selection on the results of dynamic simulation studies of WWTPs. The second application of the approach developed is the analysis of badly performing treatment plants where the suspicion arises that poor flow behaviour, such as short-cut flows, is part of the problem. The method suggested requires as the first step the calculation of the fluid dynamics of the biological treatment step at different loading situations by use of 3-dimensional CFD simulation. This information is then used to automatically generate a suitable model structure for conventional dynamic simulation of the treatment plant, using a number of CSTR modules with a pattern of exchange flows between the tanks. The method is explained in detail and the application to the WWTP Wuppertal Buchenhofen is presented.
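Once a CSTR network with exchange flows has been derived, simulating it reduces to a mass balance per tank. A minimal explicit-Euler sketch of tracer transport through such a network follows (the tank volumes and flow matrix are hypothetical; in the method above they would come from the CFD results, and the real simulator also carries the biological states):

```python
def simulate_cstr_network(volumes, flows, inflow, c_in, dt, steps):
    """Tracer concentration in a network of CSTRs. flows[i][j] is the
    flow rate from tank i to tank j; tank 0 receives the plant inflow
    and the last tank discharges at the same rate. Explicit Euler,
    for illustration only."""
    n = len(volumes)
    conc = [0.0] * n
    for _ in range(steps):
        rate = [0.0] * n
        rate[0] += inflow * c_in / volumes[0]        # plant inflow
        rate[-1] -= inflow * conc[-1] / volumes[-1]  # plant discharge
        for i in range(n):
            for j in range(n):
                q = flows[i][j]
                if q > 0.0:
                    rate[i] -= q * conc[i] / volumes[i]
                    rate[j] += q * conc[i] / volumes[j]
        conc = [c + dt * r for c, r in zip(conc, rate)]
    return conc

# Two tanks in series: after a long step-feed of tracer, both tanks
# approach the inlet concentration.
conc = simulate_cstr_network(volumes=[1.0, 1.0],
                             flows=[[0.0, 1.0], [0.0, 0.0]],
                             inflow=1.0, c_in=1.0, dt=0.01, steps=5000)
```

Short-cut flows show up in such a model as large direct exchange flows that bypass intermediate tanks, which is exactly what the CFD-derived structure is meant to expose.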
Tooth segmentation system with intelligent editing for cephalometric analysis
NASA Astrophysics Data System (ADS)
Chen, Shoupu
2015-03-01
Cephalometric analysis is the study of the dental and skeletal relationship in the head, and it is used as an assessment and planning tool for improved orthodontic treatment of a patient. Conventional cephalometric analysis identifies bony and soft-tissue landmarks in 2D cephalometric radiographs, in order to diagnose facial features and abnormalities prior to treatment, or to evaluate the progress of treatment. Recent studies in orthodontics indicate that there are persistent inaccuracies and inconsistencies in the results provided using conventional 2D cephalometric analysis. Obviously, plane geometry is inappropriate for analyzing anatomical volumes and their growth; only a 3D analysis is able to analyze the three-dimensional, anatomical maxillofacial complex, which requires computing inertia systems for individual or groups of digitally segmented teeth from an image volume of a patient's head. For the study of 3D cephalometric analysis, the current paper proposes a system for semi-automatically segmenting teeth from a cone beam computed tomography (CBCT) volume with two distinct features: an intelligent user-input interface for automatic background seed generation, and a graphics processing unit (GPU) acceleration mechanism for three-dimensional GrowCut volume segmentation. Using the proposed tooth segmentation system, 15 novice users who segmented a randomly sampled tooth set achieved a satisfactory average DICE score of 0.92. The average GrowCut processing time is around one second per tooth, excluding user interaction time.
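The DICE score used above to grade segmentation quality is a standard overlap measure; with segmentations represented as sets of voxel coordinates it is a one-liner:

```python
def dice_score(mask_a, mask_b):
    """DICE overlap between two segmentations given as sets of voxel
    coordinates: 2|A ∩ B| / (|A| + |B|). 1.0 = identical, 0.0 = disjoint."""
    inter = len(mask_a & mask_b)
    return 2.0 * inter / (len(mask_a) + len(mask_b))
```

A score of 0.92, as reported above, means the user segmentation and the reference share the great majority of their voxels.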
Arterial tree tracking from anatomical landmarks in magnetic resonance angiography scans
NASA Astrophysics Data System (ADS)
O'Neil, Alison; Beveridge, Erin; Houston, Graeme; McCormick, Lynne; Poole, Ian
2014-03-01
This paper reports on arterial tree tracking in fourteen Contrast Enhanced MRA volumetric scans, given the positions of a predefined set of vascular landmarks, by using the A* algorithm to find the optimal path for each vessel based on voxel intensity and a learnt vascular probability atlas. The algorithm is intended for use in conjunction with an automatic landmark detection step, to enable fully automatic arterial tree tracking. The scan is filtered to give two further images using the top-hat transform with 4 mm and 8 mm cubic structuring elements. Vessels are then tracked independently on the scan in which the vessel of interest is best enhanced, as determined from knowledge of typical vessel diameter and surrounding structures. A vascular probability atlas modelling expected vessel location and orientation is constructed by non-rigidly registering the training scans to the test scan using a 3D thin plate spline to match landmark correspondences, and employing kernel density estimation with the ground-truth centerline points to form a probability density distribution. Threshold estimation by histogram analysis is used to segment background from vessel intensities. The A* algorithm is run using a linear cost function constructed from the threshold and the vascular atlas prior. Tracking results are presented for all major arteries excluding those in the upper limbs. An improvement was observed when tracking was informed by contextual information, with particular benefit for peripheral vessels.
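The A* search at the core of the tracker can be sketched on a 2-D grid of per-voxel step costs (the paper works in 3-D with a cost combining intensity and the vascular atlas prior; the Manhattan heuristic here assumes every step costs at least 1, which keeps it admissible):

```python
import heapq

def a_star(cost, start, goal):
    """A* shortest path on a 2-D grid of per-cell step costs with
    4-connectivity and a Manhattan-distance heuristic (admissible when
    all step costs are >= 1). Returns the path as a list of (row, col)."""
    rows, cols = len(cost), len(cost[0])

    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0.0, start, (start,))]
    seen = set()
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return list(path)
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                g2 = g + cost[nr][nc]
                heapq.heappush(frontier, (g2 + h((nr, nc)), g2, (nr, nc),
                                          path + ((nr, nc),)))
    return None

# High costs (9) stand in for non-vessel voxels: the path detours
# around them, like a tracked vessel following low-cost bright voxels.
grid = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
path = a_star(grid, (0, 0), (0, 2))
```

In the paper's setting the start and goal come from the detected vascular landmarks, one A* run per vessel segment.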
Automatic Hotspot and Sun Glint Detection in UAV Multispectral Images
Ortega-Terol, Damian; Ballesteros, Rocio
2017-01-01
Recent advances in sensors, photogrammetry and computer vision have led to high levels of automation in 3D reconstruction processes for generating dense models and multispectral orthoimages from Unmanned Aerial Vehicle (UAV) images. However, these cartographic products are sometimes blurred and degraded due to sun reflection effects which reduce the image contrast and colour fidelity in photogrammetry and the quality of radiometric values in remote sensing applications. This paper proposes an automatic approach for detecting sun reflection problems (hotspot and sun glint) in multispectral images acquired with an Unmanned Aerial Vehicle (UAV), based on a photogrammetric strategy included in a flight planning and control software developed by the authors. In particular, two main consequences are derived from the approach developed: (i) different areas of the images can be excluded since they contain sun reflection problems; (ii) the cartographic products obtained (e.g., digital terrain model, orthoimages) and the agronomical parameters computed (e.g., normalized difference vegetation index, NDVI) are improved since radiometric defects in pixels are not considered. Finally, an accuracy assessment was performed in order to analyse the error in the detection process, yielding errors of around 10 pixels for a ground sample distance (GSD) of 5 cm, which is perfectly valid for agricultural applications. This error confirms that the precision in the detection of sun reflections can be guaranteed using this approach and the current low-cost UAV technology. PMID:29036930
LYRA, a webserver for lymphocyte receptor structural modeling.
Klausen, Michael Schantz; Anderson, Mads Valdemar; Jespersen, Martin Closter; Nielsen, Morten; Marcatili, Paolo
2015-07-01
The accurate structural modeling of B- and T-cell receptors is fundamental to gaining detailed insight into the mechanisms underlying immunity and to developing new drugs and therapies. The LYRA (LYmphocyte Receptor Automated modeling) web server (http://www.cbs.dtu.dk/services/LYRA/) implements a complete and automated method for building B- and T-cell receptor structural models starting from their amino acid sequence alone. The webserver is freely available and easy to use for non-specialists. Upon submission, LYRA automatically generates alignments using ad hoc profiles, predicts the structural class of each hypervariable loop, selects the best templates in an automatic fashion, and provides within minutes a complete 3D model that can be downloaded or inspected online. Experienced users can manually select or exclude template structures according to case-specific information. LYRA is based on the canonical structure method, which in the last 30 years has been successfully used to generate antibody models of high accuracy, and in our benchmarks this approach proves to achieve similarly good results on TCR modeling, with a benchmarked average RMSD accuracy of 1.29 and 1.48 Å for B- and T-cell receptors, respectively. To the best of our knowledge, LYRA is the first automated server for the prediction of TCR structure. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Karnan, M; Thangavel, K
2007-07-01
The presence of microcalcifications in breast tissue is one of the most important signs considered by radiologists for an early diagnosis of breast cancer, which is one of the most common forms of cancer among women. In this paper, a Genetic Algorithm (GA) is proposed for automatically locating the breast border and nipple position in order to discover suspicious regions on digital mammograms, based on asymmetries between the left and right breast images. The basic idea of the asymmetry approach is that the left and right images are aligned and subtracted to extract the suspicious regions. The proposed system consists of two steps. First, the mammogram images are enhanced using a median filter and normalized, the pectoral muscle region is excluded, and the breast border is extracted from the binary image for both left and right images. The GA is then applied to refine the detected border, and a figure of merit is calculated to evaluate whether the detected border is accurate. The nipple position is also identified using the GA, and several comparison methods are adopted for detecting the suspected area. Second, using the border points and nipple position as references, the mammogram images are aligned and subtracted to extract the suspicious regions. The algorithms were tested on 114 abnormal digitized mammograms from the Mammographic Image Analysis Society database.
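The left-right subtraction at the heart of the asymmetry approach can be sketched in a few lines; the interface (lists of rows with values in [0, 1]) and the threshold are assumptions for illustration:

```python
def asymmetry_mask(left, right, threshold=0.2):
    """Mirror the right breast image horizontally and subtract it from the
    left to highlight asymmetric (suspicious) regions. Assumes both images
    are already aligned on border and nipple landmarks."""
    mask = []
    for lrow, rrow in zip(left, right):
        mirrored = rrow[::-1]  # horizontal flip of the right image row
        mask.append([abs(lv - rv) > threshold for lv, rv in zip(lrow, mirrored)])
    return mask  # binary mask of suspicious pixels
```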
Dental Care And Medicare Beneficiaries: Access Gaps, Cost Burdens, And Policy Options.
Willink, Amber; Schoen, Cathy; Davis, Karen
2016-12-01
Despite the wealth of evidence that oral health is related to physical health, Medicare explicitly excludes dental care from coverage, leaving beneficiaries at risk for tooth decay and periodontal disease and exposed to high out-of-pocket spending. To profile these risks, we examined access to dental care across income groups and types of insurance coverage in 2012. High-income beneficiaries were almost three times as likely to have received dental care in the previous twelve months, compared to low-income beneficiaries, 74 percent of whom received no dental care. We also describe two illustrative policies that would expand access, in part by providing income-related subsidies. One would offer a voluntary, premium-financed benefit similar to those offered by Part D prescription drug plans, with an estimated premium of $29 per month. The other would cover basic dental care in core Medicare Part B benefits, financed in part by premiums ($7 or $15 per month, depending on whether premiums covered 25 percent or 50 percent of the cost) and in part by general revenues. The fact that beneficiaries are forgoing dental care and are exposed to significant costs if they seek care underscores the need for action. The policies offer pathways for improving health and financial independence for older adults. Project HOPE—The People-to-People Health Foundation, Inc.
Sequence analysis of the genome of carnation (Dianthus caryophyllus L.).
Yagi, Masafumi; Kosugi, Shunichi; Hirakawa, Hideki; Ohmiya, Akemi; Tanase, Koji; Harada, Taro; Kishimoto, Kyutaro; Nakayama, Masayoshi; Ichimura, Kazuo; Onozaki, Takashi; Yamaguchi, Hiroyasu; Sasaki, Nobuhiro; Miyahara, Taira; Nishizaki, Yuzo; Ozeki, Yoshihiro; Nakamura, Noriko; Suzuki, Takamasa; Tanaka, Yoshikazu; Sato, Shusei; Shirasawa, Kenta; Isobe, Sachiko; Miyamura, Yoshinori; Watanabe, Akiko; Nakayama, Shinobu; Kishida, Yoshie; Kohara, Mitsuyo; Tabata, Satoshi
2014-06-01
The whole-genome sequence of carnation (Dianthus caryophyllus L.) cv. 'Francesco' was determined using a combination of different new-generation multiplex sequencing platforms. The total length of the non-redundant sequences was 568,887,315 bp, consisting of 45,088 scaffolds, which covered 91% of the 622 Mb carnation genome estimated by k-mer analysis. The N50 values of contigs and scaffolds were 16,644 bp and 60,737 bp, respectively, and the longest scaffold was 1,287,144 bp. The average GC content of the contig sequences was 36%. A total of 1050, 13, 92 and 143 genes for tRNAs, rRNAs, snoRNAs and miRNAs, respectively, were identified in the assembled genomic sequences. For protein-encoding genes, 43,266 complete and partial gene structures, excluding those in transposable elements, were deduced. Gene coverage was ∼98%, as deduced from the coverage of the core eukaryotic genes. Intensive characterization of the assigned carnation genes and comparison with those of other plant species revealed characteristic features of the carnation genome. The results of this study will serve as a valuable resource for fundamental and applied research of carnation, especially for breeding new carnation varieties. Further information on the genomic sequences is available at http://carnation.kazusa.or.jp. © The Author 2013. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
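The N50 values quoted above follow the standard assembly-statistics definition; a minimal sketch of how N50 is computed from a list of contig or scaffold lengths:

```python
def n50(lengths):
    """N50: the length L such that contigs/scaffolds of length >= L
    together cover at least half of the total assembly length."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length
```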
de Blasio, Birgitte Freiesleben; Neilson, Aileen Rae; Klemp, Marianne; Skjeldestad, Finn Egil
2012-12-01
In Norway, pap smear screening targets women aged 25-69 years on a triennial basis. The introduction of human papillomavirus (HPV) mass immunization in 2009 raises questions regarding cost-saving future changes to the current screening strategy. We calibrated a dynamic HPV transmission model to Norwegian data and assessed the impact of changing screening 20 or 30 years after vaccine introduction, assuming 60 or 90% vaccination coverage. Screening compliance among vaccinated women was assumed at 80 or 50%. Strategies considered: (i) 5-yearly screening of women of 25-69 years, (ii) 3-yearly screening of women of 30-69 years and (iii) 3-yearly screening of women of 25-59 years. The greatest health gains were accomplished by ensuring a high vaccine uptake. In 2060, cervical cancer incidence was reduced by an estimated 36-57% compared with that of no vaccination. Stopping screening at the age of 60 years, excluding opportunistic screening, increased cervical cancer incidence by 3% (2060) compared with maintaining the current screening strategy, resulting in 1.0-2.4% extra cancers (2010-2060). The 5-yearly screening strategy elevated cervical cancer incidence by 30%, resulting in 4.7-11.3% additional cancers. High vaccine uptake in the years to come is of primary concern. Screening of young women <30 years remains important, even under conditions of high vaccine coverage.
Automated Construction of Coverage Catalogues of Aster Satellite Image for Urban Areas of the World
NASA Astrophysics Data System (ADS)
Miyazaki, H.; Iwao, K.; Shibasaki, R.
2012-07-01
We developed an algorithm to determine a combination of satellite images according to observation extent and image quality. The algorithm tests whether each image is necessary for completing coverage of the search extent, excluding unnecessary images of low quality and preserving necessary images of good quality. The search conditions for the satellite images can be extended, so that the catalogue can be constructed for the specific periods required for time-series analysis. We applied the method to a database of metadata of ASTER satellite images archived in the GEO Grid of the National Institute of Advanced Industrial Science and Technology (AIST), Japan. As an index of populated places with geographical coordinates, we used a database of 3372 populated places with populations of more than 0.1 million, retrieved from GRUMP Settlement Points, a global gazetteer of cities that associates geographical names of populated places with geographical coordinates and population data. From the coordinates of the populated places, 3372 extents were generated with radii of 30 km, half the swath width of ASTER satellite images. By merging extents that overlapped each other, they were assembled into 2214 extents. As a result, we acquired combinations of good quality for 1244 extents, combinations of low quality for 96 extents, and incomplete combinations for 611 extents. Further improvements would be expected by introducing pixel-based cloud assessment and pixel value correction over seasonal variations.
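The merging of overlapping extents can be sketched with a simple union-find over circular extents; the overlap test and interface are assumptions for illustration, not the authors' code:

```python
import math

def merge_extents(centers, radius):
    """Union-find merge of circular extents that overlap (center distance
    below 2 * radius), as when assembling per-city search extents into
    larger catalogue regions. Returns groups of extent indices."""
    parent = list(range(len(centers)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            if math.dist(centers[i], centers[j]) < 2 * radius:
                parent[find(i)] = find(j)  # union the two groups

    groups = {}
    for i in range(len(centers)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```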
Larkin, Robert M.; Stefano, Giovanni; Ruckle, Michael E.; ...
2016-02-09
Eukaryotic cells require mechanisms to establish the proportion of cellular volume devoted to particular organelles. These mechanisms are poorly understood. From a screen for plastid-to-nucleus signaling mutants in Arabidopsis thaliana, we cloned a mutant allele of a gene that encodes a protein of unknown function that is homologous to two other Arabidopsis genes of unknown function and to FRIENDLY. In contrast to FRIENDLY, these three homologs of FRIENDLY are found only in photosynthetic organisms. Based on these data, we proposed that FRIENDLY expanded into a small gene family to help regulate the energy metabolism of cells that contain both mitochondria and chloroplasts. Indeed, we found that knocking out these genes caused a number of chloroplast phenotypes, including a reduction in the proportion of cellular volume devoted to chloroplasts to 50% of wild type. Thus, we refer to these genes as REDUCED CHLOROPLAST COVERAGE (REC). The size of the chloroplast compartment was reduced most in rec1 mutants. The REC1 protein accumulated in the cytosol and the nucleus. REC1 was excluded from the nucleus when plants were treated with amitrole, which inhibits cell expansion and chloroplast function. Finally, we conclude that REC1 is an extraplastidic protein that helps to establish the size of the chloroplast compartment, and that signals derived from cell expansion or chloroplasts may regulate REC1.
firestar--advances in the prediction of functionally important residues.
Lopez, Gonzalo; Maietta, Paolo; Rodriguez, Jose Manuel; Valencia, Alfonso; Tress, Michael L
2011-07-01
firestar is a server for predicting catalytic and ligand-binding residues in protein sequences. Here, we present the important developments since the first release of firestar. Previous versions of the server required human interpretation of the results; the server is now fully automatized. firestar has been implemented as a web service and can now be run in high-throughput mode. Prediction coverage has been greatly improved with the extension of the FireDB database and the addition of alignments generated by HHsearch. Ligands in FireDB are now classified for biological relevance. Many of the changes have been motivated by the critical assessment of techniques for protein structure prediction (CASP) ligand-binding prediction experiment, which provided us with a framework to test the performance of firestar. URL: http://firedb.bioinfo.cnio.es/Php/FireStar.php. PMID:21672959
Research on regional intrusion prevention and control system based on target tracking
NASA Astrophysics Data System (ADS)
Liu, Yanfei; Wang, Jieling; Jiang, Ke; He, Yanhui; Wu, Zhilin
2017-08-01
In view of the fact that China's border is very long and the existing border prevention and control measures are limited, we designed a regional intrusion prevention and control system based on target tracking. The system consists of a solar panel, radar, electro-optical equipment, an unmanned aerial vehicle and an intelligent tracking platform. The solar panel provides independent power for the entire system. The radar detects targets in real time and achieves high-precision positioning of suspicious targets; then, through linkage with the electro-optical equipment, it achieves full-time automatic precise tracking of targets. When a target appears within the detection range, the drone is launched to continue the tracking. The system realizes full-time, full-coverage, whole-process integration and active real-time control of the border area.
Weather and atmosphere observation with the ATOM all-sky camera
NASA Astrophysics Data System (ADS)
Jankowsky, Felix; Wagner, Stefan
2015-03-01
The Automatic Telescope for Optical Monitoring (ATOM) for H.E.S.S. is a 75 cm optical telescope which operates fully automatically. As there is no observer present during observation, an auxiliary all-sky camera serves as the weather monitoring system. This device takes an image of the whole sky every three minutes. The gathered data then undergo live analysis by performing astrometric comparison with a theoretical night sky model, interpreting the absence of stars as cloud coverage. The sky monitor also serves as a tool for a meteorological analysis of the observation site of the upcoming Cherenkov Telescope Array. This overview covers the design and benefits of the all-sky camera and additionally gives an introduction to current efforts to integrate the device into the atmosphere analysis programme of H.E.S.S.
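The live analysis step, which interprets the absence of catalogue stars as cloud coverage, might be sketched as follows; a naive nearest-match test stands in here for the full astrometric comparison:

```python
import math

def cloud_coverage(catalog_stars, detected_stars, match_radius=2.0):
    """Estimate cloud coverage as the fraction of expected catalogue star
    positions (x, y) with no detected star within match_radius pixels."""
    missing = sum(
        1 for cs in catalog_stars
        if not any(math.dist(cs, ds) <= match_radius for ds in detected_stars)
    )
    return missing / len(catalog_stars)
```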
Automatic Parsing of Parental Verbal Input
Sagae, Kenji; MacWhinney, Brian; Lavie, Alon
2006-01-01
To evaluate theoretical proposals regarding the course of child language acquisition, researchers often need to rely on the processing of large numbers of syntactically parsed utterances, both from children and their parents. Because it is so difficult to do this by hand, there are currently no parsed corpora of child language input data. To automate this process, we developed a system that combined the MOR tagger, a rule-based parser, and statistical disambiguation techniques. The resultant system obtained nearly 80% correct parses for the sentences spoken to children. To achieve this level, we had to construct a particular processing sequence that minimizes problems caused by the coverage/ambiguity trade-off in parser design. These procedures are particularly appropriate for use with the CHILDES database, an international corpus of transcripts. The data and programs are now freely available over the Internet. PMID:15190707
Analysis of navigation and guidance requirements for commercial VTOL operations
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Zvara, J.; Hollister, W. M.
1975-01-01
The paper presents some results of a program undertaken to define navigation and guidance requirements for commercial VTOL operations in the takeoff, cruise, terminal and landing phases of flight in weather conditions up to and including Category III. Quantitative navigation requirements are given for the parameters range, coverage, operation near obstacles, horizontal accuracy, multiple landing aircraft, multiple pad requirements, inertial/radio-inertial requirements, reliability/redundancy, update rate, and data link requirements in all flight phases. A multi-configuration straw-man navigation and guidance system for commercial VTOL operations is presented. Operation of the system is keyed to a fully automatic approach for navigation, guidance and control, with pilot as monitor-manager. The system is a hybrid navigator using a relatively low-cost inertial sensor with DME updates and MLS in the approach/departure phases.
VizieR Online Data Catalog: Catalog of strong MgII absorbers (Lawther+, 2012)
NASA Astrophysics Data System (ADS)
Lawther, D.; Paarup, T.; Schmidt, M.; Vestergaard, M.; Hjorth, J.; Malesani, D.
2012-08-01
Here we present a catalog of strong intervening Mg II absorbers (selected by rest equivalent width Wr) in the SDSS Data Release 7 quasar catalog (2010AJ....139.2360S, Cat. VII/260). The intervening absorbers were found by a semi-automatic algorithm written in IDL; for details of the algorithm see section 2 of our paper. A subset of the absorbers have been visually inspected (see the MAN_OK flag in the catalog). The number of sightlines searched, tabulated by absorber redshift, i.e. g(z), is available as an ASCII table (for S/N>8 and S/N>15). All analysis in our paper is based on the S/N>8 coverage, and considers only sight-lines towards non-BAL quasars. Any questions regarding the catalog should be sent to Daniel Lawther (unclellama(at)gmail.com). (3 data files).
Spider: Probing the Early Universe with a Large-Scale CMB Polarization Survey
NASA Astrophysics Data System (ADS)
Jones, William
The standard dark-matter and dark-energy dominated cosmological model (LCDM) has proven to be remarkably successful in describing the current state and past evolution of the Universe. However, there remain significant uncertainties regarding the physical mechanisms that established the initial conditions upon which the LCDM predictions rely. Theories of cosmic genesis - the extremely high energy mechanisms that established these conditions - should be expected to provide a natural description of the nearly flat geometry of the Universe, the existence of super-horizon density correlations, and the adiabatic, Gaussian and nearly scale-invariant nature of the observed primordial density perturbations. The primary objective of Spider is to subject models of the early Universe to observational test, probing fundamental physics at energy scales far beyond the reach of terrestrial particle accelerators. The main scientific result will be to characterize, or place stringent upper limits on, the level of the odd-parity polarization of the CMB. In the context of the inflationary paradigm, Spider will confirm or exclude the predictions of the simplest single-field inflationary models near the Lyth bound, characterized by tensor-to-scalar ratios r ≳ 0.03. While viable alternatives to the inflationary paradigm are an active and important area of investigation, including string cosmologies and cyclic models, early Universe models described by inflationary periods are now widely accepted as the underlying cause behind much of what we observe in cosmology today. Nevertheless, we know very little about the mechanism that would drive inflation or the energy scale at which it occurred, and the paradigm faces significant questions about the viability of the framework as a scientific theory. Fortunately, inflationary paradigms and alternative theories offer distinct predictions regarding the statistical properties of the Cosmic Microwave Background radiation.
Spider will use measurements of the polarization of the CMB to search for the signature of primordial gravitational waves that are predicted within the currently favored theories of inflation. A definitive detection of this signal would provide the first direct insight into the underlying physics of inflation as well as a measurement of its energy scale. A stringent limit on the amplitude of this signal would exclude the currently favored class of inflationary models, bolstering the case for alternative theories. Spider is a suborbital Long-Duration Balloon payload housing six cryogenic small-aperture (half-degree resolution) millimeter-wave polarimeters. The frequency bands of the individual polarimeters are chosen to optimize overall sensitivity to the inflationary CMB polarization signal in the presence of Galactic foregrounds. By making extremely deep, high fidelity measurements of the entire portion of the southern sky that is relatively free of Galactic emission, the Spider data complement those of Planck (in sensitivity and control of systematics), PIPER (in frequency coverage) and EBEX (in sky coverage and angular scale). The data from Spider's inaugural flight in 2015 have resulted in high signal-to-noise maps of the southern Galactic hemisphere covering 10% of the full sky at each of 94 and 150 GHz. The payload is now being fabricated and fitted with a suite of 285 GHz cameras to extend our frequency coverage, improving our ability to disentangle the Galactic and cosmological signals. If its signature is present in the CMB, Spider's frequency coverage and fidelity to a broad range of angular scales enable the experiment to take a step beyond detection, toward the characterization of the gravitational wave induced signature in the CMB. Additionally, Spider serves as a training ground for young scientists, including 16 graduate students (9 female, 7 male).
Registration and Fusion of Multiple Source Remotely Sensed Image Data
NASA Technical Reports Server (NTRS)
LeMoigne, Jacqueline
2004-01-01
Earth and Space Science often involve the comparison, fusion, and integration of multiple types of remotely sensed data at various temporal, radiometric, and spatial resolutions. Results of this integration may be utilized for global change analysis, global coverage of an area at multiple resolutions, map updating or validation of new instruments, as well as integration of data provided by multiple instruments carried on multiple platforms, e.g. in spacecraft constellations or fleets of planetary rovers. Our focus is on developing methods to perform fast, accurate and automatic image registration and fusion. General methods for automatic image registration are being reviewed and evaluated. Various choices for feature extraction, feature matching and similarity measurements are being compared, including wavelet-based algorithms, mutual information and statistically robust techniques. Our work also involves studies related to image fusion and investigates dimension reduction and co-kriging for application-dependent fusion. All methods are being tested using several multi-sensor datasets, acquired at EOS Core Sites, and including multiple sensors such as IKONOS, Landsat-7/ETM+, EO1/ALI and Hyperion, MODIS, and SeaWiFS instruments. Issues related to the coregistration of data from the same platform (i.e., AIRS and MODIS from Aqua) or from several platforms of the A-train (i.e., MLS, HIRDLS, OMI from Aura with AIRS and MODIS from Terra and Aqua) will also be considered.
Assessment of earthquake effects - contribution from online communication
NASA Astrophysics Data System (ADS)
D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline
2014-05-01
The rapid increase of social media and online newspapers in recent years has given the opportunity to conduct a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24th April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller magnitude quakes throughout the day, most of which were felt by locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island, and the different felt experiences possibly relating to geological settings and to diverse structural and age-classified buildings. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real time' intensity map that can be used by the Civil Protection Department.
Merlet, Benjamin; Paulhe, Nils; Vinson, Florence; Frainay, Clément; Chazalviel, Maxime; Poupin, Nathalie; Gloaguen, Yoann; Giacomoni, Franck; Jourdan, Fabien
2016-01-01
This article describes a generic programmatic method for mapping chemical compound libraries on organism-specific metabolic networks from various databases (KEGG, BioCyc) and flat file formats (SBML and Matlab files). We show how this pipeline was successfully applied to decipher the coverage of chemical libraries set up by two metabolomics facilities, MetaboHub (French National infrastructure for metabolomics and fluxomics) and Glasgow Polyomics (GP), on the metabolic networks available in the MetExplore web server. The present generic protocol is designed to formalize and reduce the volume of information transfer between the library and the network database. Matching of metabolites between libraries and metabolic networks is based on InChIs or InChIKeys and therefore requires that these identifiers are specified in both libraries and networks. In addition to providing coverage statistics, this pipeline also allows the visualization of mapping results in the context of metabolic networks. In order to achieve this goal, we tackled issues of programmatic interaction between two servers, improvement of metabolite annotation in metabolic networks and automatic loading of a mapping in the genome-scale metabolic network analysis tool MetExplore. It is important to note that this mapping can also be performed on a single organism or a selection of organisms of interest, and is thus not limited to large facilities.
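The InChIKey-based matching described above amounts to a dictionary lookup; the data shapes below are assumptions for illustration, not the MetExplore API:

```python
def map_library_to_network(library, network):
    """Match library compounds to network metabolites by InChIKey.
    `library` maps compound IDs to InChIKeys; `network` maps metabolite
    IDs to InChIKeys. Returns the mapping and a coverage statistic."""
    by_key = {}
    for met_id, key in network.items():
        by_key.setdefault(key, []).append(met_id)
    mapping = {cid: by_key.get(key, []) for cid, key in library.items()}
    covered = sum(1 for hits in mapping.values() if hits)
    return mapping, covered / len(library)
```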
Application of Machine Learning in Urban Greenery Land Cover Extraction
NASA Astrophysics Data System (ADS)
Qiao, X.; Li, L. L.; Li, D.; Gan, Y. L.; Hou, A. Y.
2018-04-01
Urban greenery is a critical part of the modern city, and greenery coverage information is essential for land resource management, environmental monitoring and urban planning. It is challenging to extract urban greenery information from remote sensing images, as trees and grassland are mixed with city built-ups. In this paper, we propose a new automatic pixel-based greenery extraction method using multispectral remote sensing images. The method includes three main steps. First, a small part of the images is manually interpreted to provide prior knowledge. Secondly, a five-layer neural network is trained and optimised with the manual extraction results, which are divided to serve as training, verification and testing samples. Lastly, the well-trained neural network is applied to the unlabelled data to perform the greenery extraction. GF-2 and GJ-1 high-resolution multispectral remote sensing images were used to extract greenery coverage information in the built-up areas of city X. The method shows favourable performance over the 619 square kilometre study area. Also, compared with the traditional NDVI method, the proposed method gives a more accurate delineation of the greenery region. Due to its low computational load and high accuracy, it has great potential for large-area automatic greenery extraction, saving considerable manpower and resources.
Less is More: Membrane Protein Digestion Beyond Urea–Trypsin Solution for Next-level Proteomics*
Zhang, Xi
2015-01-01
The goal of next-level bottom-up membrane proteomics is protein function investigation, via high-coverage high-throughput peptide-centric quantitation of expression, modifications and dynamic structures at systems scale. Yet efficient digestion of mammalian membrane proteins presents a daunting barrier, and prevalent day-long urea–trypsin in-solution digestion proved insufficient to reach this goal. Many efforts contributed incremental advances over past years, but involved protein denaturation that disconnected measurement from functional states. Beyond denaturation, the recent discovery of structure/proteomics omni-compatible detergent n-dodecyl-β-d-maltopyranoside, combined with pepsin and PNGase F columns, enabled breakthroughs in membrane protein digestion: a 2010 DDM-low-TCEP (DLT) method for H/D-exchange (HDX) using human G protein-coupled receptor, and a 2015 flow/detergent-facilitated protease and de-PTM digestions (FDD) for integrative deep sequencing and quantitation using full-length human ion channel complex. Distinguishing protein solubilization from denaturation, protease digestion reliability from theoretical specificity, and reduction from alkylation, these methods shifted day(s)-long paradigms into minutes, and afforded fully automatable (HDX)-protein-peptide-(tandem mass tag)-HPLC pipelines to instantly measure functional proteins at deep coverage, high peptide reproducibility, low artifacts and minimal leakage. Promoting—not destroying—structures and activities harnessed membrane proteins for the next-level streamlined functional proteomics. This review analyzes recent advances in membrane protein digestion methods and highlights critical discoveries for future proteomics. PMID:26081834
NASA Astrophysics Data System (ADS)
Frances, F.; Orozco, I.
2010-12-01
This work presents the assessment of the TETIS distributed hydrological model in mountain basins of the American and Carson rivers in the Sierra Nevada (USA) at hourly time discretization, as part of the DMIP2 Project. In TETIS, each cell of the spatial grid conceptualizes the water cycle using six interconnected tanks. The relationships between tanks depend on the process represented, but in most situations simple linear reservoirs and flow-threshold schemes are used, with excellent results (Vélez et al., 1999; Francés et al., 2002). In particular, within the snow tank, snowmelt is based in this work on the simple degree-day method with spatially constant parameters. The TETIS model includes an automatic calibration module based on the SCE-UA algorithm (Duan et al., 1992; Duan et al., 1994), and the model's effective parameters are organized following a split structure, as presented by Francés and Benito (1995) and Francés et al. (2007). In this way, calibration in TETIS involves up to 9 correction factors (CFs), which globally correct the different parameter maps rather than each parameter's cell value, thus drastically reducing the number of variables to be calibrated. This strategy allows fast modification of the different hydrological processes while preserving the spatial structure of each parameter map. With the snowmelt submodel, automatic model calibration was carried out in three steps, separating the calibration of rainfall-runoff and snowmelt parameters. In the first step, automatic calibration of the CFs during the period 05/20/1990 to 07/31/1990 in the American River (without snow influence) gave a Nash-Sutcliffe Efficiency (NSE) index of 0.92. The calibration of the three degree-day parameters was done using all the SNOTEL stations in the American and Carson rivers.
Finally, using the previous calibrations as initial values, the complete calibration in the Carson River for the period 10/01/1992 to 07/31/1993 gave an NSE index of 0.86. The temporal and spatial validation over five periods must be considered excellent for discharges in both rivers (NSEs higher than 0.76) and good for snow distribution (daily spatial coverage errors ranging from -10 to 27%). In conclusion, this work demonstrates: 1.- The viability of automatic calibration of distributed models, with the corresponding saving of personal time and maximum exploitation of the available information. 2.- The good performance of the degree-day snowmelt formulation, in spite of its simplicity, even at hourly time discretization.
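The degree-day snowmelt formulation used in the TETIS snow tank is simple enough to sketch. The code below is a generic illustration of the method (melt proportional to temperature excess above a base temperature, bounded by the snow store); the degree-day factor and base temperature are hypothetical values, not the calibrated TETIS parameters.

```python
def degree_day_melt(temps_c, ddf_mm_per_deg_hour=0.1, base_temp_c=0.0):
    """Hourly degree-day snowmelt: melt = DDF * max(T - Tbase, 0)."""
    return [ddf_mm_per_deg_hour * max(t - base_temp_c, 0.0) for t in temps_c]

def route_snowpack(swe_mm, temps_c, **kwargs):
    """Deplete a snow-water-equivalent (SWE) store with degree-day melt."""
    melted = []
    for m in degree_day_melt(temps_c, **kwargs):
        m = min(m, swe_mm)  # cannot melt more snow than is stored
        swe_mm -= m
        melted.append(m)
    return melted, swe_mm

# Four hourly temperatures; no melt below the 0 degC base temperature.
melt, remaining = route_snowpack(5.0, [-2.0, 1.0, 3.0, 5.0])
print(melt, remaining)
```

In TETIS the three degree-day parameters (factor, base temperature, and a threshold) are calibrated against SNOTEL observations rather than fixed as here.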
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kranen, Simon van; Hamming-Vrieze, Olga; Wolf, Annelisa
Purpose: We set out to investigate loss of target coverage from anatomy changes in head and neck cancer patients as a function of applied safety margins and to verify a cone beam computed tomography (CBCT)–based adaptive strategy with an average patient anatomy to overcome possible target underdosage. Methods and Materials: For 19 oropharyngeal cancer patients, volumetric modulated arc therapy treatment plans (2 arcs; simultaneous integrated boost, 70 and 54.25 Gy; 35 fractions) were automatically optimized with uniform clinical target volume (CTV)–to–planning target volume margins of 5, 3, and 0 mm. We applied b-spline CBCT–to–computed tomography (CT) deformable registration to allow recalculation of the dose on modified CT scans (planning CT deformed to daily CBCT following online positioning) and dose accumulation in the planning CT scan. Patients with deviations in primary or elective CTV coverage >2 Gy were identified as candidates for adaptive replanning. For these patients, a single adaptive intervention was simulated with an average anatomy from the first 10 fractions. Results: Margin reduction from 5 mm to 3 mm to 0 mm generally led to an organ-at-risk (OAR) mean dose (Dmean) sparing of approximately 1 Gy/mm. CTV shrinkage was mainly seen in the elective volumes (up to 10%), likely related to weight loss. Despite online repositioning, substantial systematic errors were present (>3 mm) in lymph node CTV, the parotid glands, and the larynx. Nevertheless, the average increase in OAR dose was small: maximum of 1.2 Gy (parotid glands, Dmean) for all applied margins. Loss of CTV coverage >2 Gy was found in 1, 3, and 7 of 73 CTVs, respectively. Adaptive intervention in 0-mm plans substantially improved coverage: in 5 of 7 CTVs (in 6 patients) to <2 Gy of initially planned. Conclusions: Volumetric modulated arc therapy head and neck cancer treatment plans with 5-mm margins are robust for anatomy changes and show a modest increase in OAR dose.
Margin reduction improves OAR sparing with approximately 1 Gy/mm at the expense of target coverage in a subgroup of patients. Patients at risk of CTV underdosage >2 Gy in 0-mm plans may be identified early in treatment using dose accumulation. A single intervention with an average anatomy derived from CBCT effectively mitigates discrepancies.
IMAGES OF BLACK AMERICANS: Then, "Them," and Now, "Obama!"
Fiske, Susan T; Bergsieker, Hilary B; Russell, Ann Marie; Williams, Lyle
2009-01-01
Images of Black Americans are becoming remarkably diverse, enabling Barack Obama to defy simple-minded stereotypes and succeed. Understood through the Stereotype Content Model's demonstrably fundamental trait dimensions of perceived warmth and competence, images of Black Americans show three relevant patterns. Stereotyping by omission allows non-Blacks to accentuate the positive, excluding any lingering negativity but implying it by its absence; specifically, describing Black Americans as gregarious and passionate suggests warmth but ignores competence and implies its lack. Obama's credentials prevented him from being cast as incompetent, though the experience debate continued. His legendary calm and passionate charisma saved him on the warmth dimension. Social class subtypes for Black Americans differentiate dramatically between low-income Blacks and Black professionals, among both non-Black and Black samples. Obama clearly fit the moderately warm, highly competent Black-professional subtype. Finally, the campaign's events (and nonevents) allowed voter habituation to overcome non-Blacks' automatic emotional vigilance to Black Americans.
Adlassnig, Klaus-Peter; Rappelsberger, Andrea
2008-01-01
Software-based medical knowledge packages (MKPs) are packages of highly structured medical knowledge that can be integrated into various health-care information systems or the World Wide Web. They have been established to provide different forms of clinical decision support, such as textual interpretation of combinations of laboratory test results, generation of diagnostic hypotheses as well as confirmed and excluded diagnoses to support differential diagnosis in internal medicine, and early identification and automatic monitoring of hospital-acquired infections. Technically, an MKP may consist of a number of interconnected Arden syntax Medical Logic Modules. Several MKPs have been integrated thus far into hospital, laboratory, and departmental information systems. This has resulted in useful and widely accepted software-based clinical decision support for the benefit of the patient, the physician, and the organization funding the health-care system.
An inductive method for automatic generation of referring physician prefetch rules for PACS.
Okura, Yasuhiko; Matsumura, Yasushi; Harauchi, Hajime; Sukenobu, Yoshiharu; Kou, Hiroko; Kohyama, Syunsuke; Yasuda, Norihiro; Yamamoto, Yuichiro; Inamura, Kiyonari
2002-12-01
To prefetch images in a hospital-wide picture archiving and communication system (PACS), a rule must be devised to permit accurate selection of examinations in which a patient's images are stored. We developed an inductive method to compose prefetch rules from operational data obtained in a hospital, using a decision tree algorithm. Our methods were evaluated on data acquired in Osaka University Hospital over one month. The data collected consisted of 58,617 consultation reservations, 643,797 examination histories of patients, and 323,993 records of image requests in PACS. For each consultation reservation, four parameters indicating whether the patient's images were requested were derived from the database. As a result, the sensitivity for selecting consultations in which images were requested was approximately 0.8, and the specificity for correctly excluding consultations where images were not requested was approximately 0.7.
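The inductive rule-building step can be illustrated with a decision tree fitted to synthetic consultation records. The abstract does not name its four parameters, so the features below are hypothetical stand-ins, and the label rule is invented for the illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features per consultation reservation: days since the patient's
# last imaging exam, number of prior image requests, an imaging-heavy
# department flag, and whether images were requested at the previous visit.
rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(0, 365, n),   # days since last imaging exam
    rng.integers(0, 20, n),    # prior image requests
    rng.integers(0, 2, n),     # imaging-heavy department flag
    rng.integers(0, 2, n),     # images requested at previous visit
])
# Synthetic ground truth: images tend to be requested soon after a prior exam.
y = ((X[:, 0] < 60) & (X[:, 3] == 1)).astype(int)

# Induce a compact prefetch rule, readable as nested if/else thresholds.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
prefetch = tree.predict(X)                  # 1 = prefetch the prior studies
sensitivity = prefetch[y == 1].mean()       # needed studies that get prefetched
specificity = 1 - prefetch[y == 0].mean()   # unneeded studies correctly excluded
print(round(sensitivity, 2), round(specificity, 2))
```

A shallow tree keeps the induced rule interpretable, which matters when the rule must be audited before deployment in a PACS.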
A bibliography of references to avian botulism
Allen, Jack E.; Wilson, Sonoma S.
1977-01-01
This bibliography, first compiled in 1970 in response to many requests for information on avian botulism, has been updated to include the literature published through 1975.In general, only articles dealing primarily with the avian disease are included, as opposed to those concerned with various aspects of the biology of Clostridium botulinum, either type C or type E. A few exceptions, such as Bengton’s report of the first isolation and description of the type C organism, are included for their historical interest. Progress reports and other administrative documents not available for distribution or request are excluded, as are textbook accounts, which are generally summaries of work published elsewhere.Although Mr. Allen and Mrs. Wilson have attempted to list every important reference, they make no claim to complete coverage of the published literature. The authors will be grateful to users of the bibliography who call attention to errors or omissions.
Expenditures on family dental care by active duty soldiers.
Chisick, M C
1996-01-01
Expenditures on family dental care by U.S. active duty soldiers were explored in this 1992 worldwide survey. Of 9,560 respondents (62% response rate), 7,187 claimed dependents and 5,569 provided reliable data. Mean annual expenditures were calculated, and a multinomial regression was fitted to the distribution of expenditures. Results show average family dental care expenditures were as follows: total sample, $135; childless couples, $59; couples with children, $154; and single parents, $120. Between 72 and 83% of families spent $0 on dental care. Excluding non-spenders, overall expenditures averaged as follows: total sample, $531; childless couples, $354; couples with children, $560; and single parents, $470. Regression results show that expenditures on family dental care by soldiers are influenced by different factors depending on family composition. Policy measures to encourage optimal dental care by families of active duty soldiers should focus on increasing insurance coverage and use.
Civic stratification and the exclusion of undocumented immigrants from cross-border health care*
Torres, Jacqueline M.; Waldinger, Roger
2016-01-01
This paper proposes a theoretical framework and an empirical example of the relationship between the civic stratification of immigrants in the United States, and their access to healthcare. We use the 2007 Pew/RWJF Hispanic Healthcare Survey, a nationally representative survey of U.S. Latinos (n=2783 foreign-born respondents) and find that immigrants who are not citizens or legal permanent residents are significantly more likely to be excluded from care in both the U.S. and across borders. Legal status differences in cross-border care utilization persisted after controlling for health status, insurance coverage, and other potential demographic and socio-economic predictors of care. Exclusion from care on both sides of the border was associated with reduced rates of receiving timely preventive services. Civic stratification, and political determinants broadly speaking, should be considered alongside social determinants of population health and healthcare. PMID:26582512
Managed care and sexual dysfunction. Based on a presentation by William Parham, MD.
1999-01-01
The availability of managed care benefits for the treatment of sexual dysfunction is inextricably linked with cost. An atypically low increase of 4.4% in aggregate healthcare expenditures in 1995-1996 stands in sharp contrast to outlays of more than 11% between 1966 and 1993. Between 1993 and 1996, that increase hovered at about 5%, the result largely of the growth of managed care and low levels of general inflation. However, despite relative containment of overall healthcare expenditures, those related to pharmaceuticals have risen more than 9.2% annually, an increase that reflects the managed care industry's failure to restrain drug costs. In deciding whether it will cover a particular treatment, the managed care industry applies three sets of criteria relating to efficacy, medical necessity, and appropriateness. Managed care companies are expected to counter runaway pharmacy costs for sildenafil by excluding it from coverage, imposing significant limitations, or requiring higher copayments.
Optimum allocation for a dual-frame telephone survey.
Wolter, Kirk M; Tao, Xian; Montgomery, Robert; Smith, Philip J
2015-12-01
Careful design of a dual-frame random digit dial (RDD) telephone survey requires selecting from among many options that have varying impacts on cost, precision, and coverage in order to obtain the best possible implementation of the study goals. One such consideration is whether to screen cell-phone households in order to interview cell-phone-only (CPO) households and exclude dual-user households, or to take all interviews obtained via the cell-phone sample. We present a framework in which to consider the tradeoffs between these two options and a method to select the optimal design. We derive and discuss the optimum allocation of sample size between the two sampling frames and explore the choice of the optimum p, the mixing parameter for the dual-user domain. We illustrate our methods using the National Immunization Survey, sponsored by the Centers for Disease Control and Prevention.
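The textbook cost-constrained optimum allocation underlying such designs can be sketched as follows. The unit costs and standard deviations below are hypothetical, and the paper's actual derivation (including the dual-user mixing parameter p) is more involved than this two-frame simplification.

```python
import math

def optimum_allocation(budget, costs, stds):
    """Cost-constrained optimum allocation over sampling frames.

    Minimizes sum(S_h**2 / n_h) subject to sum(c_h * n_h) = budget,
    giving n_h proportional to S_h / sqrt(c_h), scaled to spend the budget.
    """
    k = budget / sum(s * math.sqrt(c) for s, c in zip(stds, costs))
    return [k * s / math.sqrt(c) for s, c in zip(stds, costs)]

# Hypothetical numbers: a landline interview costs $40, a cell interview $60,
# and the cell frame is assumed more variable.
n_ll, n_cell = optimum_allocation(100_000, [40, 60], [1.0, 1.4])
print(round(n_ll), round(n_cell))
```

The allocation always exhausts the budget exactly; higher per-unit variance pulls sample toward a frame, while higher per-unit cost pushes it away.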
Empiric Antibiotic Therapy of Nosocomial Bacterial Infections.
Reddy, Pramod
2016-01-01
Broad-spectrum antibiotics are commonly used by physicians to treat various infections. The source of infection and causative organisms are not always apparent during the initial evaluation of the patient, and antibiotics are often given empirically to patients with suspected sepsis. Reluctance to use cephalosporins and carbapenems in penicillin-allergic septic patients may significantly narrow the spectrum of antimicrobial coverage. Empiric antibiotic therapy should sufficiently cover all the suspected pathogens, guided by the bacteriologic susceptibilities of the medical center. It is important to understand the major pharmacokinetic properties of antibacterial agents for proper use and to minimize the development of resistance. In many septic patients, negative cultures do not exclude active infection, and positive cultures may not represent the actual infection. This article reviews the important differences in the spectrum of antibiotics commonly used for nosocomial bacterial infections, with a particular emphasis on culture-negative sepsis and colonization.
Notable licensing deals in the biopharma industry in the second quarter of 2017.
D'Souza, P
2017-08-01
During the second quarter of 2017, Cortellis Competitive Intelligence added 967 new licensing deals (excluding mergers and acquisitions) as part of its ongoing coverage of pharmaceutical licensing activity. This represented an 8% decrease from the previous quarter (1,050) and a 3% decrease from the same quarter of 2016 (993). The quarter also showed a significant decline in the number of deals worth more than USD 0.5 billion compared with the previous quarter (7 vs. 17). This article highlights a number of the most valuable and notable deals forged during the quarter, as well as a selection of deals from some of the most prolific deal makers in the life sciences. An update on milestone, option and terminated deals of significance is also presented, along with an early outlook on the next quarter's pharmaceutical licensing activity.
Peeters, R; Galesloot, P J B
2002-03-01
The objective of this study was to estimate the daily fat yield and fat percentage from one sampled milking per cow per test day in an automatic milking system herd, when the milking times and milk yields of all individual milkings are recorded by the automatic milking system. Multiple regression models were used to estimate the 24-h fat percentage when only one milking is sampled for components and milk yields and milking times are known for all milkings in the 24-h period before the sampled milking. In total, 10,697 cow test day records, from 595 herd tests at 91 Dutch herds milked with an automatic milking system, were used. The best model to predict 24-h fat percentage included fat percentage, protein percentage, milk yield and milking interval of the sampled milking, milk yield, and milking interval of the preceding milking, and the interaction between milking interval and the ratio of fat and protein percentage of the sampled milking. This model gave a standard deviation of the prediction error (SE) for 24-h fat percentage of 0.321 and a correlation between the predicted and actual 24-h fat percentage of 0.910. For the 24-h fat yield, we found SE = 90 g and correlation = 0.967. This precision is slightly better than that of present a.m.-p.m. testing schemes. Extra attention must be paid to correctly matching the sample jars and the milkings. Furthermore, milkings with an interval of less than 4 h must be excluded from sampling as well as milkings that are interrupted or that follow an interrupted milking. Under these restrictions (correct matching, interval of at least 4 h, and no interrupted milking), one sampled milking suffices to get a satisfactory estimate for the test-day fat yield.
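The prediction step can be illustrated with an ordinary least-squares fit on synthetic test-day records. The distributions, coefficients, and noise level below are assumptions for demonstration, not the Dutch automatic-milking-system data; the predictor set mirrors the abstract's best model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for AMS test-day records. Predictors: fat% and protein%
# of the sampled milking, its milk yield and interval, the preceding milking's
# yield and interval, and the interval x (fat/protein) interaction.
rng = np.random.default_rng(2)
n = 5000
fat = rng.normal(4.4, 0.6, n)          # fat %, sampled milking
protein = rng.normal(3.5, 0.3, n)      # protein %, sampled milking
yield_s = rng.normal(11.0, 3.0, n)     # kg, sampled milking
interval_s = rng.uniform(4, 14, n)     # h, sampled milking (>= 4 h rule)
yield_p = rng.normal(11.0, 3.0, n)     # kg, preceding milking
interval_p = rng.uniform(4, 14, n)     # h, preceding milking
interaction = interval_s * (fat / protein)
X = np.column_stack([fat, protein, yield_s, interval_s,
                     yield_p, interval_p, interaction])

# Assumed truth: 24-h fat % is mostly the sampled fat % plus a small
# interval effect and measurement noise.
fat24 = fat + 0.02 * (interval_s - 9) + rng.normal(0, 0.3, n)

model = LinearRegression().fit(X, fat24)
pred = model.predict(X)
se = np.std(fat24 - pred)               # SD of the prediction error
r = np.corrcoef(pred, fat24)[0, 1]      # predicted vs. actual correlation
print(round(se, 3), round(r, 3))
```

The paper reports SE = 0.321 and r = 0.910 for 24-h fat percentage on real data; the synthetic fit above merely shows the shape of the calculation.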
NASA Astrophysics Data System (ADS)
Kamiya, Naoki; Ieda, Kosuke; Zhou, Xiangrong; Yamada, Megumi; Kato, Hiroki; Muramatsu, Chisako; Hara, Takeshi; Miyoshi, Toshiharu; Inuzuka, Takashi; Matsuo, Masayuki; Fujita, Hiroshi
2017-03-01
Amyotrophic lateral sclerosis (ALS) causes functional disorders such as difficulty in breathing and swallowing through the atrophy of voluntary muscles. ALS in its early stages is difficult to diagnose because of the difficulty in differentiating it from other muscular diseases. In addition, image inspection methods for aggressive diagnosis of ALS have not yet been established. The purpose of this study is to develop an automatic analysis system for the whole skeletal muscle to support the early differential diagnosis of ALS using whole-body CT images. In this study, regions of muscular atrophy, including those in ALS patients, are automatically identified after recognizing and segmenting the whole skeletal muscle in preliminary steps. First, the skeleton is identified from its gray-value information. Second, the initial area of the body cavity is recognized by deformation of the thoracic cavity based on the anatomically segmented skeleton. Third, the abdominal cavity boundary is recognized using ABM for precise recognition of the body cavity. The body cavity is precisely recognized by a non-rigid registration method based on reference points on the abdominal cavity boundary. Fourth, the whole skeletal muscle is recognized by excluding the skeleton, the body cavity, and the subcutaneous fat. Additionally, regions of muscular atrophy, including those in ALS patients, are automatically identified by comparison of muscle mass. The experiments were carried out on ten cases with abnormality in the skeletal muscle. Global recognition and segmentation of the whole skeletal muscle were achieved in eight cases. Moreover, regions of muscular atrophy, including those in ALS patients, were well identified in the lower limbs. As a result, this study establishes the basic technology for detecting muscle atrophy, including that caused by ALS.
In the future, it will be necessary to consider methods of differentiating other kinds of muscular atrophy, to pursue the clinical application of this method for early ALS detection, and to examine a larger number of cases across disease stages and types.
Ginsberg, Gary Michael; Edejer, Tessa Tan-Torres; Lauer, Jeremy A; Sepulveda, Cecilia
2009-10-09
The paper calculates regional generalized cost-effectiveness estimates of screening, prevention, treatment and combined interventions for cervical cancer. Using standardised WHO-CHOICE methodology, a cervical cancer model was employed to provide estimates of screening, vaccination and treatment effectiveness. Intervention effectiveness was determined via a population state-transition model (PopMod) that simulates the evolution of a sub-regional population accounting for births, deaths and disease epidemiology. Economic costs of procedures and treatment were estimated, including programme overhead and training costs. In regions characterized by high income, low mortality and high existing treatment coverage, the addition of any screening programme to the current high treatment levels is very cost-effective. However, based on projections of the future price per dose (representing the economic costs of the vaccination, excluding monopolistic rents and vaccine development cost), vaccination is the most cost-effective intervention. In regions characterized by low income, low mortality and existing treatment coverage around 50%, expanding treatment with or without combining it with screening appears to be cost-effective or very cost-effective. Abandoning treatment in favour of screening in a no-treatment scenario would not be cost-effective. Vaccination is usually the most cost-effective intervention. Penta- or tri-annual PAP smears appear to be cost-effective, though when combined with HPV-DNA testing they are not cost-effective. In regions characterized by low income, high mortality and low treatment levels, expanding treatment with or without adding screening would be very cost-effective. A one-off vaccination plus expanded treatment was usually very cost-effective. One-off PAP or VIA screening at age 40 is more cost-effective than other interventions, though less effective overall.
From a cost-effectiveness perspective, consideration should be given to implementing vaccination (depending on cost per dose and longevity of efficacy) and screening programmes on a worldwide basis to reduce the burden of disease from cervical cancer. Treatment should also be increased where coverage is low.
SU-F-T-443: Quantification of Dosimetric Effects of Dental Metallic Implant On VMAT Plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, C; Jiang, W; Feng, Y
Purpose: To evaluate the dosimetric impact of metallic implants on volumetric-modulated arc therapy (VMAT) plans for head and neck (H&N) cancer patients with dental metallic implants, as a function of target size, implant size, and the distance between target and implant. Methods: CT images of H&N cancer patients with dental metallic implants were used. Target volumes with different sizes and locations were contoured. Metal artifact regions, excluding surrounding critical organs, were outlined and assigned CT numbers close to water (0 HU). VMAT plans with half-arc, one-full-arc, and two-full-arc configurations were constructed, and the same plans were applied to structure sets with and without CT number assignment of the metal artifact regions and compared. D95% was used to assess PTV dose coverage, and SNC Patient software was used to analyze dose distribution differences slice by slice. Results: Across target sizes, the variation of PTV dose coverage (ΔD95%) with and without CT number replacement decreased with larger target volume for half-arc, one-arc, and two-arc VMAT plans alike, although there were no clinically significant differences. Additionally, there were no significant variations in the maximum percent difference (max. %diff) of the dose distribution. With regard to target location, ΔD95% and max. %diff dropped with increasing distance between target and metallic implant. Furthermore, half-arc plans showed a greater impact than one-arc plans, and two-arc plans had the smallest influence on PTV dose coverage and dose distribution. Conclusion: Target size has less bearing on the dosimetric impact than target location relative to the metallic implant. Plans with more arcs alleviate the dosimetric effect of metal artifacts because fewer of the beams contributing to the target dose pass through regions with metallic artifacts.
Incorrect CT numbers cause inaccurate dose distributions; therefore, appropriately overwriting metallic artifact regions with reasonable CT numbers is recommended. More patient data are being collected and are under further analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokic, Vesna, E-mail: vesna.prokic@uniklinik-freiburg.de; Wiedenmann, Nicole; Fels, Franziska
2013-01-01
Purpose: To develop a new treatment planning strategy in patients with multiple brain metastases. The goal was to perform whole brain irradiation (WBI) with hippocampal sparing and dose escalation on multiple brain metastases. Two treatment concepts were investigated: simultaneously integrated boost (SIB) and WBI followed by stereotactic fractionated radiation therapy sequential concept (SC). Methods and Materials: Treatment plans for both concepts were calculated for 10 patients with 2-8 brain metastases using volumetric modulated arc therapy. In the SIB concept, the prescribed dose was 30 Gy in 12 fractions to the whole brain and 51 Gy in 12 fractions to individual brain metastases. In the SC concept, the prescription was 30 Gy in 12 fractions to the whole brain followed by 18 Gy in 2 fractions to brain metastases. All plans were optimized for dose coverage of whole brain and lesions, simultaneously minimizing dose to the hippocampus. The treatment plans were evaluated on target coverage, homogeneity, and minimal dose to the hippocampus and organs at risk. Results: The SIB concept enabled more successful sparing of the hippocampus; the mean dose to the hippocampus was 7.55 ± 0.62 Gy and 6.29 ± 0.62 Gy, respectively, when 5-mm and 10-mm avoidance regions around the hippocampus were used, normalized to 2-Gy fractions. In the SC concept, the mean dose to hippocampus was 9.8 ± 1.75 Gy. The mean dose to the whole brain (excluding metastases) was 33.2 ± 0.7 Gy and 32.7 ± 0.96 Gy, respectively, in the SIB concept, for 5-mm and 10-mm hippocampus avoidance regions, and 37.23 ± 1.42 Gy in SC. Conclusions: Both concepts, SIB and SC, were able to achieve adequate whole brain coverage and radiosurgery-equivalent dose distributions to individual brain metastases. The SIB technique achieved better sparing of the hippocampus, especially when a 10-mm hippocampal avoidance region was used.
AMFESYS: Modelling and diagnosis functions for operations support
NASA Technical Reports Server (NTRS)
Wheadon, J.
1993-01-01
Packetized telemetry, combined with low station coverage for close-Earth satellites, may introduce new problems in presenting to the operator a clear picture of what the spacecraft is doing. A recent ESOC study has gone some way to show, by means of a practical demonstration, how the use of subsystem models combined with artificial intelligence techniques within a real-time spacecraft control system (SCS) can help to overcome these problems. A spin-off from using these techniques can be an improvement in the reliability of the telemetry (TM) limit-checking function, as well as the telecommand verification function, of the SCS. The problem and how it was addressed are described, including an overview of the 'AMF Expert System' prototype, and further work needed to prove the concept is proposed. The Automatic Mirror Furnace (AMF) is part of the payload of the European Retrievable Carrier (EURECA) spacecraft, which was launched in July 1992.
Enclosure Transform for Interest Point Detection From Speckle Imagery.
Yongjian Yu; Jue Wang
2017-03-01
We present a fast enclosure transform (ET) to localize complex objects of interest in speckle imagery. The approach exploits the spatial confinement of regional features in a sparse image feature representation. Unrelated, broken ridge features surrounding an object are organized collaboratively, giving rise to the enclosureness of the object. Three enclosure likelihood measures are constructed: the enclosure force, potential energy, and encloser count. In the transform domain, local maxima manifest the locations of objects of interest, for which only the intrinsic dimension is known a priori. The discrete ET algorithm is computationally efficient, on the order of O(MN) for N measuring distances across an image of M ridge pixels. It requires few, easily set parameters. We demonstrate and assess the performance of ET on the automatic detection of prostate locations in supra-pubic ultrasound images. ET yields superior results in terms of positive detection rate, accuracy, and coverage.
Computational Visual Stress Level Analysis of Calcareous Algae Exposed to Sedimentation
Nilssen, Ingunn; Eide, Ingvar; de Oliveira Figueiredo, Marcia Abreu; de Souza Tâmega, Frederico Tapajós; Nattkemper, Tim W.
2016-01-01
This paper presents a machine learning based approach for analyses of photos collected from laboratory experiments conducted to assess the potential impact of water-based drill cuttings on deep-water rhodolith-forming calcareous algae. This pilot study uses imaging technology to quantify and monitor the stress levels of the calcareous algae Mesophyllum engelhartii (Foslie) Adey caused by various degrees of light exposure, flow intensity and amount of sediment. A machine learning based algorithm was applied to assess the temporal variation of the calcareous algae size (∼ mass) and color automatically. Measured size and color were correlated to the photosynthetic efficiency (maximum quantum yield of charge separation in photosystem II, ΦPSIImax) and degree of sediment coverage using multivariate regression. The multivariate regression showed correlations between time and calcareous algae sizes, as well as correlations between fluorescence and calcareous algae colors. PMID:27285611
Deployment and Evaluation of the Helicopter In-Flight Tracking System (HITS)
NASA Technical Reports Server (NTRS)
Daskalakis, Anastasios; Martone, Patrick
2004-01-01
The Gulf of Mexico airspace has two major operating regions: low-altitude offshore (below 7,000 ft) and high-altitude oceanic (above 18,000 ft). Both regions suffer significant inefficiencies due to the lack of continuous surveillance during Instrument Flight Rules operations. Provision of surveillance in the offshore region is hindered by its low-altitude nature, which makes coverage by conventional radars economically infeasible. Significant portions of the oceanic sectors are inaccessible to shore-based sensors, as they are beyond line-of-sight. Two emerging surveillance technologies that are relatively low cost and can be deployed on offshore platforms were assessed: Wide Area Multilateration and Automatic Dependent Surveillance-Broadcast. Performance criteria were formulated using existing FAA specifications. Three configurations, representative of systems serving full-size and reduced-size domestic terminal areas and an en-route/oceanic region, were developed and deployed. These configurations were evaluated during nine flight test periods using fixed- and rotary-wing aircraft.
Safe use of cellular telephones in hospitals: fundamental principles and case studies.
Cohen, Ted; Ellis, Willard S; Morrissey, Joseph J; Bakuzonis, Craig; David, Yadin; Paperman, W David
2005-01-01
Many industries and individuals have embraced cellular telephones. They provide mobile, synchronous communication, which could hypothetically increase the efficiency and safety of inpatient healthcare. However, reports of early analog cellular telephones interfering with critical life-support machines led many hospitals to strictly prohibit cellular telephones. A literature search revealed that individual hospitals now allow cellular telephone use under various policies designed to prevent electromagnetic interference with medical devices. The fundamental principles underlying electromagnetic interference are immunity, frequency, modulation technology, distance, and power. Electromagnetic interference risk mitigation methods based on these principles have been successfully implemented. In one case study, a minimum distance between cellular telephones and medical devices is maintained, with restrictions in critical areas. In another case study, cellular telephone coverage is augmented to automatically control the power of the cellular telephone. While no uniform safety standard yet exists, cellular telephones can be safely used in hospitals when their use is managed carefully.
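The roles of distance and power can be illustrated with the standard far-field estimate E = sqrt(30·P·G)/d. A minimal sketch, assuming a 2 W handset with unity antenna gain and a 3 V/m device immunity level (a commonly cited IEC 60601-1-2 figure); both numbers are illustrative, not taken from the article:

```python
import math

def field_strength(power_w, gain, distance_m):
    """Far-field electric field (V/m) from a transmitter:
    E = sqrt(30 * P * G) / d."""
    return math.sqrt(30.0 * power_w * gain) / distance_m

def min_separation(power_w, gain, immunity_v_per_m):
    """Distance (m) at which the field decays to the device's
    immunity level -- the basis of minimum-distance policies."""
    return math.sqrt(30.0 * power_w * gain) / immunity_v_per_m
```

Under these assumed figures, a 2 W phone would need roughly 2.6 m of separation from a 3 V/m device, which is consistent with the meter-scale minimum distances hospitals have adopted; lowering the transmit power (as in the coverage-augmentation case study) shrinks that distance proportionally to sqrt(P).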
NASA Astrophysics Data System (ADS)
Nielsen, E.; Schmidt, W.
2014-03-01
In January 1977 a new type of radar aurora experiment named STARE (Scandinavian Twin Aurora Radar Experiment) commenced operation in northern Scandinavia. The purpose of the experiment was two-fold: to make observations of the nature of radar auroras, and to contribute to the study of solar-terrestrial relationships (or space weather). The experiment was designed for automatic continuous operation, and for nearly two and a half decades it provided estimates of electron flows with good spatial coverage and resolution and good time resolution. It was a successful experiment that yielded a wealth of observations and results, pertaining to, and based on, the observed time variations of the electron flows and to the spatial flow pattern observed at any given time. This radar system inspired the creation of a similar system, SABRE (Sweden And Britain Radar Experiment), which increased the field of view towards the southwest of STARE. This system commenced operation in 1982.
A New Strategy to Land Precisely on the Northern Plains of Mars
NASA Technical Reports Server (NTRS)
Cheng, Yang; Huertas, Andres
2010-01-01
During the Phoenix mission landing site selection process, the Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Science Experiment (HiRISE) images revealed widely spread and dense rock fields in the northern plains. Automatic rock mapping and subsequent statistical analyses showed 30-90% CFA (cumulative fractional area) covered by rocks larger than 1 meter in dense rock fields around craters. Less dense rock fields had 5-30% rock coverage in terrain away from craters. Detectable meter-scale boulders were found nearly everywhere. These rocks present a risk to spacecraft safety during landing. However, they are the most salient topographic features in this region and can be good landmarks for spacecraft localization during landing. In this paper we present a novel strategy that uses the abundance of rocks in the northern plains for spacecraft localization. The paper discusses this approach in three sections: a rock-based landmark terrain relative navigation (TRN) algorithm; the TRN algorithm feasibility; and conclusions.
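The CFA statistic can be sketched under the simplifying assumption that mapped rocks are circular patches on a known terrain area (the actual Phoenix analyses fit model size-frequency distributions to HiRISE rock detections; the function name and values here are illustrative):

```python
import math

def cfa(diameters_m, area_m2, d_min=0.0):
    """Cumulative fractional area: fraction of a mapped terrain patch
    covered by rocks of diameter >= d_min, each rock treated as a
    circle of the given diameter."""
    covered = sum(math.pi * (d / 2.0) ** 2
                  for d in diameters_m if d >= d_min)
    return covered / area_m2
```

For example, two rocks of 2 m and 1 m diameter on a 100 m² patch give a total CFA of about 3.9%, of which the larger-than-1.5 m fraction is about 3.1%.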
Automatic Earth observation data service based on reusable geo-processing workflow
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min
2008-12-01
A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service generates the concrete BPEL, and the BPEL execution engine runs it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
DARPA TIMIT acoustic-phonetic continuous speech corpus CD-ROM. NIST speech disc 1-1.1
NASA Astrophysics Data System (ADS)
Garofolo, J. S.; Lamel, L. F.; Fisher, W. M.; Fiscus, J. G.; Pallett, D. S.
1993-02-01
The Texas Instruments/Massachusetts Institute of Technology (TIMIT) corpus of read speech has been designed to provide speech data for the acquisition of acoustic-phonetic knowledge and for the development and evaluation of automatic speech recognition systems. TIMIT contains speech from 630 speakers representing 8 major dialect divisions of American English, each speaking 10 phonetically-rich sentences. The TIMIT corpus includes time-aligned orthographic, phonetic, and word transcriptions, as well as speech waveform data for each spoken sentence. The release of TIMIT contains several improvements over the Prototype CD-ROM released in December, 1988: (1) full 630-speaker corpus, (2) checked and corrected transcriptions, (3) word-alignment transcriptions, (4) NIST SPHERE-headered waveform files and header manipulation software, (5) phonemic dictionary, (6) new test and training subsets balanced for dialectal and phonetic coverage, and (7) more extensive documentation.
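The time-aligned phonetic transcriptions mentioned above are distributed as .phn files whose lines give a start sample, an end sample, and a phone label, with audio sampled at 16 kHz. A minimal parser sketch (the function name is an assumption; the format itself follows the TIMIT documentation):

```python
def parse_phn(text, sample_rate=16000):
    """Parse TIMIT .phn lines ("<start_sample> <end_sample> <phone>")
    into (start_seconds, end_seconds, phone) tuples."""
    segments = []
    for line in text.strip().splitlines():
        start, end, phone = line.split()
        segments.append((int(start) / sample_rate,
                         int(end) / sample_rate,
                         phone))
    return segments
```

The word (.wrd) and orthographic (.txt) transcriptions use the same sample-offset convention, so the same parser applies to all three alignment layers.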
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and to streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, to support explicit communication of progress and uncertainty through annotation, and to support implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, increase awareness of the prior investigative process and insights, and improve final investigative outcomes.
Policy interventions to address child health disparities: moving beyond health insurance.
Currie, Janet
2009-11-01
A full accounting of the excess burden of poor health in childhood must include any continuing loss of productivity over the life course. Including these costs results in a much higher estimate of the burden than focusing only on medical costs and other shorter-run costs to parents (such as lost work time). Policies designed to reduce this burden must go beyond increasing eligibility for health insurance, because disparities exist not only in access to health insurance but also in take-up of insurance, access to care, and the incidence of health conditions. We need to create a comprehensive safety net for young children that includes automatic eligibility for basic health coverage under Medicaid unless parents opt out by enrolling children in a private program; health and nutrition services for pregnant women and infants; quality preschool; and home visiting for infants and children at risk. Such a program is feasible and would be relatively inexpensive.
Automating the Generation of Heterogeneous Aviation Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh J.; Pohl, Josef M.
2012-01-01
A safety case is a structured argument, supported by a body of evidence, which provides a convincing and valid justification that a system is acceptably safe for a given application in a given operating environment. This report describes the development of a fragment of a preliminary safety case for the Swift Unmanned Aircraft System. The construction of the safety case fragment consists of two parts: a manually constructed system-level case, and an automatically constructed lower-level case, generated from formal proof of safety-relevant correctness properties. We provide a detailed discussion of the safety considerations for the target system, emphasizing the heterogeneity of sources of safety-relevant information, and use a hazard analysis to derive safety requirements, including formal requirements. We evaluate the safety case using three classes of metrics for measuring degrees of coverage, automation, and understandability. We then present our preliminary conclusions and make suggestions for future work.
Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified
NASA Technical Reports Server (NTRS)
Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.
2005-01-01
Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by the onboard planning systems flown on missions such as Orbiter and Deep Space 1. The planning system takes high-level goals and expands them onboard into a detailed sequence of actions that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
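The exhaustive state enumeration that gives model checking its measurable coverage can be illustrated with a toy sketch. This is plain Python rather than SPIN/Promela, and the plan model (steps that burn or conserve fuel) and its safety invariant are invented purely for illustration:

```python
from collections import deque

def explore(initial, successors, invariant, max_depth=10):
    """Exhaustive breadth-first enumeration of reachable states.
    Returns the first state that violates the invariant, or None
    if every reachable state (up to max_depth) satisfies it."""
    seen = {initial}
    queue = deque([(initial, 0)])
    while queue:
        state, depth = queue.popleft()
        if not invariant(state):
            return state
        if depth >= max_depth:
            continue
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Toy plan model: state = (step, fuel); each of 3 steps either burns
# 2 units of fuel or coasts.  Safety property: fuel never negative.
def next_states(state):
    step, fuel = state
    return [(step + 1, fuel - 2), (step + 1, fuel)] if step < 3 else []

fuel_ok = lambda s: s[1] >= 0
violation = explore((0, 3), next_states, fuel_ok)  # some plan overspends
safe = explore((0, 6), next_states, fuel_ok)       # None: all plans safe
```

Unlike empirical testing, which samples plans, this search visits every reachable state, so a returned None is a guarantee over the whole (bounded) model; SPIN performs the same kind of exhaustive exploration over Promela models, with far more sophisticated state compression.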
Third-generation intelligent IR focal plane arrays
NASA Astrophysics Data System (ADS)
Caulfield, H. John; Jack, Michael D.; Pettijohn, Kevin L.; Schlesselmann, John D.; Norworth, Joe
1998-03-01
SBRC is at the forefront of industry in developing IR focal plane arrays, including multi-spectral technology and '3rd generation' functions that mimic the human eye. 3rd generation devices conduct advanced processing on or near the FPA that serves to reduce bandwidth while performing needed functions such as automatic target recognition, uniformity correction, and dynamic range enhancement. These devices represent a solution for processing the exorbitantly high bandwidth coming off large-area FPAs without sacrificing system sensitivity. SBRC's two-color approach leverages the company's HgCdTe technology to provide simultaneous multiband coverage, from short through long wave IR, with near-theoretical performance. IR systems that are sensitive to different spectral bands achieve enhanced capabilities for target identification and advanced discrimination. This paper will provide a summary of the issues, the technology, and the benefits of SBRC's third generation smart and two-color FPAs.
Air-to-air radar flight testing
NASA Astrophysics Data System (ADS)
Scott, Randall E.
1988-06-01
This volume in the AGARD Flight Test Techniques Series describes flight test techniques, flight test instrumentation, ground simulation, data reduction and analysis methods used to determine the performance characteristics of a modern air-to-air (a/a) radar system. Following a general coverage of specification requirements, test plans, support requirements, development and operational testing, and management information systems, the report goes into more detailed flight test techniques covering a/a radar capabilities of: detection, manual acquisition, automatic acquisition, tracking a single target, and detection and tracking of multiple targets. There follows a section on additional flight test considerations such as electromagnetic compatibility, electronic countermeasures, displays and controls, degraded and backup modes, radome effects, environmental considerations, and use of testbeds. Other sections cover ground simulation, flight test instrumentation, and data reduction and analysis. The final sections deal with reporting and a discussion of considerations for the future and how they may affect radar flight testing.
Wide-area continuous offender monitoring
NASA Astrophysics Data System (ADS)
Hoshen, Joseph; Drake, George; Spencer, Debra D.
1997-02-01
The corrections system in the U.S. is supervising over five million offenders. This number is rising fast and so are the direct and indirect costs to society. To improve supervision and reduce the cost of parole and probation, first-generation home arrest systems were introduced in 1987. While these systems proved to be helpful to the corrections system, their scope is rather limited because they only cover an offender at a single location and provide only partial time coverage. To correct the limitations of first-generation systems, second-generation wide area continuous electronic offender monitoring systems, designed to monitor the offender at all times and locations, are now on the drawing board. These systems use radio frequency location technology to track the position of offenders. The challenge for this technology is the development of reliable personal locator devices that are small, lightweight, with long operational battery life, and indoors/outdoors accuracy of 100 meters or less. At the center of a second-generation system is a database that specifies the offender's home, workplace, commute, and time the offender should be found in each. The database could also define areas from which the offender is excluded. To test compliance, the system would compare the observed coordinates of the offender with the stored location for a given time interval. Database logfiles will also enable law enforcement to determine if a monitored offender was present at a crime scene and thus include or exclude the offender as a potential suspect.
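The compliance test described above, comparing an observed fix against the stored schedule location, might be sketched as follows. The coordinates, tolerance, and function names are illustrative assumptions; a real system would also handle zone shapes, exclusion areas, and time windows:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def is_compliant(observed, scheduled, tolerance_m=200.0):
    """Check an observed (lat, lon) fix against the scheduled location,
    with a tolerance covering the ~100 m locator accuracy plus zone size."""
    return haversine_m(*observed, *scheduled) <= tolerance_m
```

The tolerance must be at least the locator's 100 m accuracy figure, or every fix near a zone boundary would raise a false alarm; the same distance test, run against logged fixes, supports the crime-scene inclusion/exclusion queries mentioned above.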
2012-01-01
Background Coronary artery calcifications (CAC) are markers of coronary atherosclerosis but do not correlate well with stenosis severity. This study intended to evaluate clinical situations in which a combined approach of coronary calcium scoring (CS) and nuclear stress testing (SPECT-MPI) is useful for the detection of relevant CAD. Methods Patients with a clinical indication for invasive coronary angiography (ICA) were included in our study during 08/2005-09/2008. First, all patients underwent the CS procedure as part of the study protocol, performed using either a multidetector computed tomography (CT) scanner or a dual-source CT imager. CAC were automatically defined by dedicated software, and the Agatston score was semi-automatically calculated. A stress-rest SPECT-MPI study was performed afterwards, and scintigraphic images were evaluated quantitatively. All patients then underwent ICA. Significant CAD was defined as luminal stenosis ≥75% in quantitative coronary analysis (QCA) in ≥1 epicardial vessel. To compare data lacking Gaussian distribution, an unpaired Wilcoxon test (Mann–Whitney) was used; otherwise, a Student's t-test for unpaired samples was applied. Calculations were considered significant at a p-value of <0.05. Results We consecutively included 351 symptomatic patients (mean age: 61.2±12.3 years; range: 18–94 years; male: n=240) with a mean Agatston score of 258.5±512.2 (range: 0–4214). ICA verified exclusion of significant CAD in 66/67 (98.5%) patients without CAC. CAC was detected in the remaining 284 patients. In 132/284 patients (46.5%) with CS>0, significant CAD was confirmed by ICA, and it was excluded in 152/284 (53.5%) patients. Sensitivity for CAD detection by CS alone was calculated as 99.2%, specificity was 30.3%, and the negative predictive value was 98.5%. An additional SPECT in patients with CS>0 increased specificity to 80.9% while reducing sensitivity to 87.9%. Diagnostic accuracy was 84.2%.
Conclusions In patients with CS=0, significant CAD can be excluded with a high negative predictive value by CS alone. An additional SPECT-MPI in patients with CS>0 leads to a high diagnostic accuracy for the detection of CAD while reducing the number of patients needing an invasive diagnostic procedure. PMID:23206557
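The reported test characteristics for CS alone can be reproduced from the published counts, assuming the 2x2 table implied by the abstract: TP=132 (CS>0 with CAD), FP=152 (CS>0 without CAD), TN=66 (CS=0 without CAD), FN=1 (the single CS=0 patient in whom ICA found CAD):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard test characteristics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "npv": tn / (tn + fn),           # negative predictive value
        "ppv": tp / (tp + fp),           # positive predictive value
    }

# Counts inferred from the abstract for calcium scoring (CS>0 = positive).
m = diagnostic_metrics(tp=132, fp=152, tn=66, fn=1)
```

These counts recover the abstract's 99.2% sensitivity, 30.3% specificity, and 98.5% NPV, and the PPV matches the stated 46.5% of CS>0 patients with confirmed CAD; the 84.2% accuracy figure refers to the combined CS+SPECT strategy, whose per-cell counts are not given.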