Publications by authors named "Christopher Watanabe"

Self-supervised pretext tasks have been introduced as an effective strategy for learning target tasks on small annotated data sets. However, while current research focuses on designing novel pretext tasks that learn meaningful and reusable representations for the target task, their robustness and generalizability remain relatively under-explored. This is especially important in medical imaging, where performance under different perturbations must be proactively investigated before clinical applications can be deployed reliably.