Photonic reservoir computing has been demonstrated to solve a variety of complex problems. Although training a reservoir computing system is much simpler than training other neural network approaches, it still requires considerable resources, which becomes an issue when retraining is required. Transfer learning is a technique that allows information to be re-used between tasks, thereby reducing the cost of retraining. We propose transfer learning as a viable technique to compensate for the unavoidable parameter drift in experimental setups. Compensating for this drift usually requires retraining the system, which is very time- and energy-consuming. Based on numerical studies of a delay-based reservoir computing system with semiconductor lasers, we investigate the use of transfer learning to mitigate these parameter fluctuations. Additionally, we demonstrate that transfer learning applied to two slightly different tasks reduces the number of input samples required for training the second task, and thus the amount of retraining.
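To make the idea concrete, the sketch below illustrates (in a highly simplified way, not using the paper's laser-based simulation) how a reservoir-computing readout trained on one task can be reused as a warm start for a related task or for a drifted system, so that only a small correction has to be fitted from a few new samples. All names and sizes (`n_nodes`, `n_train_a`, `n_train_b`, the synthetic reservoir states) are illustrative assumptions.

```python
# Minimal transfer-learning sketch for a reservoir-computing readout.
# The "reservoir states" here are random stand-ins, NOT the delay-based
# semiconductor-laser dynamics studied in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_train_a, n_train_b = 50, 1000, 100  # assumed sizes

def readout_train(states, targets, reg=1e-4):
    """Ridge-regression readout: solve (S^T S + reg*I) w = S^T y."""
    S = np.asarray(states)
    return np.linalg.solve(S.T @ S + reg * np.eye(S.shape[1]), S.T @ targets)

# Task A: plenty of training data (stand-in for the original, drift-free setup).
states_a = rng.standard_normal((n_train_a, n_nodes))
targets_a = states_a @ rng.standard_normal(n_nodes) + 0.01 * rng.standard_normal(n_train_a)
w_a = readout_train(states_a, targets_a)

# Task B (or the same task after parameter drift): only a few new samples,
# with slightly perturbed reservoir responses.
states_b = states_a[:n_train_b] + 0.05 * rng.standard_normal((n_train_b, n_nodes))
targets_b = targets_a[:n_train_b]

# Transfer learning: reuse w_a and fit only a correction on the small data set,
# instead of retraining the readout from scratch.
residual_b = targets_b - states_b @ w_a
w_b = w_a + readout_train(states_b, residual_b)

print("task-B error, reused weights only :", np.mean((states_b @ w_a - targets_b) ** 2))
print("task-B error, transferred readout :", np.mean((states_b @ w_b - targets_b) ** 2))
```

The warm-started readout needs far fewer new samples than a full retraining pass, which is the mechanism the abstract refers to when it describes reducing the amount of retraining after drift or a task change.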


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11501920
DOI: http://dx.doi.org/10.1515/nanoph-2022-0399

