Quantization via Distillation and Contrastive Learning.

IEEE Trans Neural Netw Learn Syst

Published: December 2024

Quantization is a critical technique employed across various research fields for compressing deep neural networks (DNNs) to facilitate deployment in resource-limited environments. The process requires a delicate balance between model size and performance. In this work, we explore knowledge distillation (KD) as a promising approach for improving quantization performance by transferring knowledge from high-precision networks to their low-precision counterparts. We specifically investigate feature-level information loss during distillation and emphasize the importance of making the distillation aware of feature-level quantization effects. We propose a novel quantization method that combines feature-level distillation with contrastive learning to extract and preserve more valuable information during quantization. Furthermore, we use the hyperbolic tangent function to estimate gradients with respect to the rounding function, which smooths training. Extensive experimental results demonstrate that the quantized network achieves performance competitive with its full-precision counterpart, validating the method's efficacy and its potential for real-world applications.
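
The abstract does not spell out the authors' formulation, so the following is only a minimal PyTorch-style sketch of the two mechanisms it names: a tanh-based surrogate for the gradient of the rounding function, and feature-level distillation combined with an InfoNCE-style contrastive term. The names TanhRound, fake_quant, and distill_contrastive_loss, the sharpness constant K, and the loss weighting are all illustrative assumptions, not the paper's API.

import math
import torch
import torch.nn.functional as F

K = 4.0  # assumed sharpness; larger K pushes the surrogate toward hard rounding

class TanhRound(torch.autograd.Function):
    # Hard rounding in the forward pass; the backward pass uses the derivative
    # of the soft surrogate s(x) = floor(x) + 0.5 + tanh(K*(frac(x) - 0.5)) / (2*tanh(K/2)).
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.round(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        t = torch.tanh(K * (x - torch.floor(x) - 0.5))
        return grad_out * K * (1.0 - t * t) / (2.0 * math.tanh(K / 2.0))

def fake_quant(x, num_bits=4):
    # Uniform fake quantization over the tensor's observed range, with the
    # tanh-based estimator supplying gradients through the rounding step.
    qmax = 2 ** num_bits - 1
    lo, hi = x.min().detach(), x.max().detach()
    scale = (hi - lo).clamp(min=1e-8) / qmax
    q = TanhRound.apply((x - lo) / scale).clamp(0, qmax)
    return q * scale + lo

def distill_contrastive_loss(student_feat, teacher_feat, tau=0.1):
    # Feature-level distillation: MSE to the frozen full-precision teacher,
    # plus an InfoNCE term that pairs each low-precision feature with its own
    # teacher feature (positive) against the rest of the batch (negatives).
    kd = F.mse_loss(student_feat, teacher_feat.detach())
    s = F.normalize(student_feat.flatten(1), dim=1)
    t = F.normalize(teacher_feat.detach().flatten(1), dim=1)
    logits = (s @ t.T) / tau                    # (B, B) similarity matrix
    targets = torch.arange(s.size(0), device=s.device)
    return kd + F.cross_entropy(logits, targets)

A real quantization-aware training setup would also learn the quantization scale and handle the clamp's gradient; this sketch only shows how a tanh surrogate keeps the rounding step differentiable while the distillation and contrastive terms align quantized features with full-precision ones.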

Source: http://dx.doi.org/10.1109/TNNLS.2023.3300309

Publication Analysis

Top Keywords

distillation contrastive (8)
contrastive learning (8)
quantization (6)
quantization distillation (4)
learning quantization (4)
quantization critical (4)
critical technique (4)
technique employed (4)
employed fields (4)
fields compressing (4)

Similar Publications

Norethisterone acetate (NETA) is a synthetic progestogen commonly used in birth control pills, in menopausal hormone therapy, and to treat abnormal uterine bleeding and endometriosis. This study is the first attempt to examine the effects of NETA on immune cells and telocytes, and its results form an important knowledge base for understanding the mechanism of NETA's contraceptive action in the uterus.


Remediation of petroleum hydrocarbons in contaminated groundwater with the use of surfactants and biosurfactants.

Chemosphere

March 2025

Environmental Science and Engineering Program, Biological and Environmental Science and Engineering Division, King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Kingdom of Saudi Arabia.

Remediation of hydrocarbon-contaminated groundwater is challenging due to the large volume of contaminated water, restricted aquifer access, and the recalcitrance of hydrocarbons. This study evaluates chemical surfactants (A and B, composed of alcohols, esters, and petroleum distillates) and biosurfactants (C and BS, containing enzymes and microbially derived surfactants) for enhancing petroleum-hydrocarbon remediation. The surfactants and biosurfactants were evaluated under environmental conditions mimicking groundwater of the Arabian region.


This study investigates the chemical durability of uranium oxide microparticles (UO₂ and U₃O₈) as potential reference materials for nuclear safeguards. To optimize long-term preservation, the particles were exposed to three different storage media: dilute nitric acid (10 mol L⁻¹ HNO₃), deionized water, and ethanol. Dissolution rates in nitric acid (∼5 × 10 g…


Few-shot Class-incremental Pill Recognition (FSCIPR) aims to build an automatic pill-recognition system that needs only a small amount of training data and can continuously adapt to new classes, supporting applications in hospitals, portable apps, and assistance for visually impaired individuals. The task faces three core challenges: overfitting, fine-grained classification, and catastrophic forgetting. We propose the Well-Prepared Few-shot Class-incremental Learning (WP-FSCIL) framework, which addresses overfitting through a parameter-freezing strategy, strengthens the robustness and discriminative power of backbone features with a Center-Triplet (CT) loss and a supervised contrastive loss for fine-grained classification, and alleviates catastrophic forgetting with a multi-dimensional Knowledge Distillation (KD) strategy based on flexible Pseudo-feature Synthesis (PFS).
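
For the fine-grained piece this snippet names, a batch-wise supervised contrastive loss can be sketched as follows. This is a minimal, assumed SupCon-style formulation in Python, not the WP-FSCIL implementation; it omits the Center-Triplet loss and the pseudo-feature synthesis step, and the function name is illustrative.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, tau=0.07):
    # Every same-label sample in the batch is a positive for a given anchor;
    # anchors with no positives in the batch are skipped.
    z = F.normalize(features, dim=1)                 # (B, D) unit vectors
    sim = (z @ z.T) / tau
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float('-inf'))        # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    per_anchor = log_prob.masked_fill(~pos, 0.0).sum(1) / pos.sum(1).clamp(min=1)
    return -per_anchor[pos.any(1)].mean()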


Rehearsal-based continual learning methods typically review a small number of representative samples so that the network can learn new content while retaining old knowledge. However, existing works overlook two crucial factors: (1) while the network prioritizes new data at incremental stages, it generalizes more weakly when trained on limited samples from specific categories than when trained on large-scale samples across many categories simultaneously; and (2) knowledge distillation over a limited set of old samples can transfer some existing knowledge, but imposing overly strong constraints may hinder knowledge transfer and restrict the current-stage network's ability to capture fresh knowledge.
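
The trade-off this snippet describes is commonly expressed as a weighted sum of a new-task loss and a distillation term against the frozen previous-stage model. A minimal sketch, assuming Hinton-style temperature-scaled KD on rehearsed samples (the function name and the single weight alpha are illustrative, not this paper's method):

import torch.nn.functional as F

def rehearsal_step_loss(new_logits, old_logits, labels, T=2.0, alpha=0.5):
    # Cross-entropy drives learning of the current stage; the KD term keeps
    # the new model close to the frozen old model on replayed samples.
    # A large alpha is the kind of strong constraint that can block new knowledge.
    ce = F.cross_entropy(new_logits, labels)
    kd = F.kl_div(
        F.log_softmax(new_logits / T, dim=1),
        F.softmax(old_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return (1.0 - alpha) * ce + alpha * kd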

