Learning continually from a stream of training data or tasks, with the ability to learn unseen classes through a zero-shot learning framework, is gaining attention in the literature and is referred to as continual zero-shot learning (CZSL). Existing CZSL methods require clear task-boundary information during training, which is not practically feasible. This paper proposes a task-free generalized CZSL (Tf-GCZSL) method with short-term/long-term memory to remove the requirement of task boundaries during training. A variational autoencoder (VAE) handles the fundamental ZSL task, while the short-term and long-term memories eliminate the need for task-boundary information in the CZSL framework. Further, the proposed Tf-GCZSL method combines experience replay with dark knowledge distillation and regularization to mitigate catastrophic forgetting in the continual learning framework. Finally, Tf-GCZSL uses a fully connected classifier trained on synthetic features generated in the latent space of the VAE. The performance of the proposed Tf-GCZSL is evaluated in the existing task-agnostic prediction setting and the proposed task-free setting for generalized CZSL over five ZSL benchmark datasets. The results clearly indicate that the proposed Tf-GCZSL improves prediction by at least 12%, 1%, 3%, 4%, and 3% over existing state-of-the-art and baseline methods on the CUB, aPY, AWA1, AWA2, and SUN datasets, respectively, in both settings (task-agnostic prediction and task-free learning). The source code is available at https://github.com/Chandan-IITI/Tf-GCZSL.
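To make the ingredients described in the abstract concrete, the following is a minimal, hypothetical sketch in PyTorch of one task-free training step: a conditional VAE over visual features and class attributes, a reservoir-sampled buffer standing in for the short-term/long-term memory (reservoir sampling needs no task-boundary signal), and a frozen earlier copy of the model whose reconstructions are matched as a simplified stand-in for the dark-knowledge-distillation term. All names (`AttributeVAE`, `ReservoirMemory`, `train_step`), dimensions, and loss weightings are illustrative assumptions, not the authors' implementation; see the linked repository for the actual method.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttributeVAE(nn.Module):
    """Toy conditional VAE: encodes visual features, decodes conditioned on class attributes."""
    def __init__(self, feat_dim=2048, attr_dim=312, latent_dim=64):
        super().__init__()
        self.enc = nn.Linear(feat_dim + attr_dim, 2 * latent_dim)  # outputs mu and log-variance
        self.dec = nn.Linear(latent_dim + attr_dim, feat_dim)

    def forward(self, x, a):
        mu, logvar = self.enc(torch.cat([x, a], dim=1)).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()       # reparameterisation trick
        return self.dec(torch.cat([z, a], dim=1)), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    recon = F.mse_loss(x_hat, x, reduction="mean")
    kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kld

class ReservoirMemory:
    """Task-free episodic memory: reservoir sampling keeps a uniform sample of the stream
    without ever needing a task identifier or task boundary."""
    def __init__(self, capacity=500):
        self.capacity, self.seen, self.data = capacity, 0, []

    def add(self, x, a):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, a))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, a)

    def sample(self, k):
        batch = random.sample(self.data, min(k, len(self.data)))
        xs, attrs = zip(*batch)
        return torch.stack(xs), torch.stack(attrs)

def train_step(model, teacher, memory, optimizer, x, a, alpha=0.5):
    """One task-free update: current batch + replayed memory + a distillation-style penalty
    against a frozen earlier copy of the model (a simplified proxy for dark knowledge)."""
    optimizer.zero_grad()
    x_hat, mu, logvar = model(x, a)
    loss = vae_loss(x, x_hat, mu, logvar)
    if memory.data:
        mx, ma = memory.sample(x.size(0))
        mx_hat, mmu, mlogvar = model(mx, ma)
        loss = loss + vae_loss(mx, mx_hat, mmu, mlogvar)            # experience replay
        with torch.no_grad():
            tx_hat, _, _ = teacher(mx, ma)                          # frozen earlier model
        loss = loss + alpha * F.mse_loss(mx_hat, tx_hat)            # distillation-style regularizer
    loss.backward()
    optimizer.step()
    for xi, ai in zip(x, a):                                        # memory updated per sample
        memory.add(xi.detach(), ai.detach())
    return loss.item()
```

In this sketch a classifier would then be trained on latent features sampled from the VAE for both seen and unseen classes, as the abstract describes; that step is omitted here for brevity.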
| Download full-text PDF | Source |
|---|---|
| http://dx.doi.org/10.1016/j.neunet.2022.08.034 | DOI Listing |