Object counting, the task of accurately predicting the number of objects in static images or videos, has recently attracted considerable interest. However, the unavoidable presence of background noise limits further gains in counting performance. To address this issue, we propose a group and graph attention network (GGANet) for dense object counting. GGANet is an encoder-decoder architecture incorporating a group channel attention (GCA) module and a learnable graph attention (LGA) module. The GCA module splits the feature map into several subfeatures, each of which is assigned an attention factor through a shared channel-attention operation. The LGA module treats the feature map as a graph in which the individual channels act as feature vertices and the responses between channels act as edges. Together, the GCA and LGA modules reduce interference from irrelevant pixels and suppress background noise. Experiments are conducted on four crowd-counting datasets, two vehicle-counting datasets, one remote-sensing counting dataset, and one few-shot object-counting dataset. Comparative results show that the proposed GGANet achieves superior counting performance.
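The abstract describes the GCA module as splitting the channel dimension into groups and assigning each channel an attention factor via a shared channel-attention operation. The paper's actual implementation is not given here, so the following is only a minimal numpy sketch of that idea under simple assumptions: the attention factor for each channel is derived from its global-average-pooled response passed through a sigmoid, computed independently per group.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def group_channel_attention(feat, num_groups):
    """Illustrative sketch of the grouped channel-attention idea.

    feat: feature map of shape (C, H, W).
    num_groups: number of channel groups; C must be divisible by it.

    Each group is rescaled by per-channel attention factors computed
    from global average pooling followed by a sigmoid. The real GGANet
    GCA module may differ; this only mirrors the abstract's description.
    """
    c, h, w = feat.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    group_c = c // num_groups
    out = np.empty_like(feat)
    for g in range(num_groups):
        sub = feat[g * group_c:(g + 1) * group_c]   # one subfeature group
        pooled = sub.mean(axis=(1, 2))              # global average pooling -> (group_c,)
        attn = sigmoid(pooled)                      # per-channel attention factor
        out[g * group_c:(g + 1) * group_c] = sub * attn[:, None, None]
    return out
```

As a quick check, a constant all-ones feature map of shape (8, 4, 4) with two groups is uniformly rescaled by sigmoid(1.0), since every channel's pooled response is 1.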
DOI: http://dx.doi.org/10.1109/TNNLS.2023.3336894