Publications by authors named "Yatao Bian"

As large models evolve, performance evaluation is essential for assessing their capabilities. However, current model evaluations mainly rely on specific tasks and datasets, lacking a unified framework for assessing the multidimensional intelligence of large models. In this perspective, we advocate for a comprehensive framework of cognitive science-inspired artificial general intelligence (AGI) tests, including crystallized, fluid, social, and embodied intelligence.

  • Self-supervised pretrained models are becoming popular in AI-driven drug discovery, but their effectiveness in extracting quality molecular representations is still under-researched.
  • The study introduces a method called RePRA, which evaluates and visualizes the relationship between molecular representations and their properties by adapting concepts from traditional structure-activity analysis.
  • Experiments show that while advanced pretrained models can improve upon conventional methods, they may still produce poor representations in some cases, highlighting the need for better techniques to refine these models.
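The synopsis gives no implementation details, but the core idea of relating representation distance to property difference can be pictured with a small, hypothetical sketch. The pairing rule, the thresholds, and the function name below are illustrative assumptions, not RePRA's actual scoring.

```python
import numpy as np

def flag_representation_cliffs(Z, y, dist_thresh=0.5, prop_thresh=1.0):
    """Flag molecule pairs that are close in representation space but far
    apart in property value -- an analogue of the 'activity cliffs' used in
    structure-activity analysis. Thresholds here are purely illustrative."""
    # Pairwise Euclidean distances between representations.
    rep_dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    # Pairwise absolute property differences.
    prop_gap = np.abs(y[:, None] - y[None, :])
    # A "cliff": nearly identical representations, very different properties.
    cliffs = (rep_dist < dist_thresh) & (prop_gap > prop_thresh)
    np.fill_diagonal(cliffs, False)
    return np.argwhere(np.triu(cliffs))  # unique (i, j) pairs

# Toy usage with random embeddings and properties.
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 16))   # pretrained molecular representations
y = rng.normal(size=100)         # one measured property per molecule
print(flag_representation_cliffs(Z, y))
```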

Motivation: The crux of molecular property prediction is to generate meaningful representations of the molecules. One promising route is to exploit the molecular graph structure through graph neural networks (GNNs). Both atoms and bonds significantly affect the chemical properties of a molecule, so an expressive model ought to exploit both node (atom) and edge (bond) information simultaneously.
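As a minimal sketch of what using node and edge information simultaneously can look like in one message-passing step (a generic illustration, not the architecture proposed in the paper; the weight matrices, dimensions, and directed-edge convention below are assumptions):

```python
import numpy as np

def edge_aware_message_passing(h, e, edges, W_self, W_nbr, W_edge):
    """One generic message-passing step that mixes node (atom) and edge
    (bond) features. h: (N, d) node features, e: (E, d_e) edge features,
    edges: list of directed (src, dst) index pairs."""
    agg = np.zeros_like(h @ W_nbr)               # aggregated messages per node
    for k, (u, v) in enumerate(edges):
        # The message to v combines the neighbour's features and the bond features.
        agg[v] += h[u] @ W_nbr + e[k] @ W_edge
    # Update: combine each node's own features with its aggregated messages.
    return np.maximum(h @ W_self + agg, 0.0)     # ReLU

# Toy molecule: 3 atoms, 2 bonds, 8-dim atom features, 4-dim bond features.
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 8))
e = rng.normal(size=(2, 4))
edges = [(0, 1), (1, 2)]
W_self, W_nbr, W_edge = (rng.normal(size=s) for s in [(8, 8), (8, 8), (4, 8)])
print(edge_aware_message_passing(h, e, edges, W_self, W_nbr, W_edge).shape)  # (3, 8)
```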


The emergence of the Graph Convolutional Network (GCN) has greatly advanced graph learning. However, two factors impede its further development: noise and redundancy in graph data, and the lack of interpretability of prediction results. One solution is to identify a predictive yet compressed subgraph that discards the noise and redundancy and exposes the interpretable part of the graph.
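A simplistic way to picture "a predictive yet compressed subgraph" is to score edges for importance and keep only the top-scoring ones. The sketch below uses synthetic scores and a plain top-k rule for illustration; it does not reproduce the paper's model or training objective, and in practice the scores would be learned jointly with the predictor.

```python
import numpy as np

def extract_subgraph(edges, edge_scores, keep_ratio=0.5):
    """Keep the highest-scoring edges as a compressed, interpretable subgraph.
    edge_scores are assumed to come from a learned importance model; here
    they are just given numbers."""
    k = max(1, int(len(edges) * keep_ratio))
    keep = np.argsort(edge_scores)[::-1][:k]              # indices of top-k edges
    sub_edges = [edges[i] for i in keep]
    sub_nodes = sorted({n for edge in sub_edges for n in edge})
    return sub_nodes, sub_edges

# Toy graph: 5 nodes, 6 edges, hypothetical learned importance scores.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4), (1, 3)]
scores = np.array([0.9, 0.1, 0.8, 0.2, 0.05, 0.7])
print(extract_subgraph(edges, scores, keep_ratio=0.5))
# -> ([0, 1, 2, 3], [(0, 1), (2, 3), (1, 3)])
```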


Recent years have witnessed advances in parallel algorithms for large-scale optimization problems. Notwithstanding this demonstrated success, existing algorithms that parallelize over features are usually limited by divergence issues under high parallelism or require data preprocessing to alleviate these problems. In this paper, we propose a Parallel Coordinate Descent algorithm using approximate Newton steps (PCDN) that is guaranteed to converge globally without data preprocessing.
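As a rough illustration of coordinate descent with per-coordinate Newton-style steps parallelized over features (a generic sketch for ridge regression, not the PCDN algorithm or its convergence guarantees; the block size, damping-by-blocking strategy, and problem setup below are assumptions):

```python
import numpy as np

def block_newton_coordinate_descent(X, y, lam=0.1, block=8, iters=200, seed=0):
    """Minimise 0.5*||Xw - y||^2 + 0.5*lam*||w||^2 by updating a random block
    of coordinates per iteration, each with its own 1-D Newton step.
    Updating many correlated coordinates simultaneously is exactly where naive
    feature-parallel methods can diverge; small blocks keep this sketch stable."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    r = X @ w - y                              # residual, kept up to date
    h = (X ** 2).sum(axis=0) + lam             # per-coordinate second derivatives
    for _ in range(iters):
        idx = rng.choice(d, size=min(block, d), replace=False)
        g = X[:, idx].T @ r + lam * w[idx]     # per-coordinate gradients
        delta = -g / h[idx]                    # independent Newton steps
        w[idx] += delta                        # applied "in parallel"
        r += X[:, idx] @ delta                 # refresh residual once per block
    return w

# Toy problem: the recovered weights should approach the planted ones.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 32))
w_true = rng.normal(size=32)
y = X @ w_true + 0.01 * rng.normal(size=200)
print(np.linalg.norm(block_newton_coordinate_descent(X, y) - w_true))
```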
