
OpenTensor: Reproducing Faster Matrix Multiplication Discovering Algorithms

Authors: Yiwen Sun, Wenye Li
TLDR:
The document presents OpenTensor, a reproduction of AlphaTensor, which uses Deep Reinforcement Learning (DRL) to discover efficient matrix multiplication algorithms. AlphaTensor, developed by DeepMind, offered a promising framework for solving scientific problems but has been difficult to reproduce because of its complex techniques and the lack of released source code. OpenTensor simplifies the algorithm pipeline, clarifies technical details, and improves the training process. It combines deep neural networks with Monte Carlo Tree Search (MCTS) to search for the best factors in a tensor decomposition. Evaluated on an RTX 4090 GPU, the algorithm converges better than the original AlphaTensor. The paper also discusses synthetic data generation, basis transformation, action canonicalization, and order shuffling as techniques that improve training. The results show that OpenTensor successfully discovers efficient matrix multiplication algorithms.

Abstract

OpenTensor is a reproduction of AlphaTensor, which used Deep Reinforcement Learning (DRL) to discover a new algorithm that outperforms state-of-the-art methods for matrix multiplication. While AlphaTensor provides a promising framework for solving scientific problems, it is hard to reproduce because of its many implementation tricks and the lack of released source code. In this paper, we clean up the algorithm pipeline, clarify the technical details, and make improvements to the training process. Computational results show that OpenTensor can successfully find efficient matrix multiplication algorithms.

Method

The authors combine deep neural networks with Monte Carlo Tree Search (MCTS) in OpenTensor to discover efficient matrix multiplication algorithms. They reformulate matrix multiplication as a tensor decomposition problem and then as a search problem that is solved recursively: the algorithm searches for one factor at a time, with a neural network outputting a factor-selection policy and a rank estimate, and MCTS selecting the final factor. Training is mainly supervised by synthetic data, and techniques such as basis transformation, action canonicalization, and order shuffling are employed to improve training and reduce potential redundancy in the tensor synthesis.
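The decomposition game described above can be sketched concretely: the state is a residual tensor, and each action subtracts one rank-one factor; a decomposition is found when the residual reaches zero, and the number of factors used equals the number of scalar multiplications in the resulting algorithm. The following minimal NumPy illustration (function names are ours, not from the paper) builds the 2x2 matrix multiplication tensor and verifies that Strassen's well-known seven factors reduce it to zero:

```python
import numpy as np

def matmul_tensor(n):
    # T[a, b, c] = 1 iff the product A[i,j] * B[j,k] contributes to C[i,k],
    # with flattened indices a = i*n+j, b = j*n+k, c = i*n+k.
    T = np.zeros((n * n, n * n, n * n), dtype=int)
    for i in range(n):
        for j in range(n):
            for k in range(n):
                T[i * n + j, j * n + k, i * n + k] = 1
    return T

def play_factor(T, u, v, w):
    # One "move" of the decomposition game: subtract the rank-one
    # tensor u (outer) v (outer) w from the current residual.
    return T - np.einsum('a,b,c->abc', u, v, w)

# Strassen's rank-7 decomposition of the 2x2 matmul tensor, written as
# (u, v, w) triples: coefficients on vec(A), vec(B), and vec(C).
strassen = [
    ([1, 0, 0, 1],  [1, 0, 0, 1],  [1, 0, 0, 1]),   # M1 = (A11+A22)(B11+B22)
    ([0, 0, 1, 1],  [1, 0, 0, 0],  [0, 0, 1, -1]),  # M2 = (A21+A22)B11
    ([1, 0, 0, 0],  [0, 1, 0, -1], [0, 1, 0, 1]),   # M3 = A11(B12-B22)
    ([0, 0, 0, 1],  [-1, 0, 1, 0], [1, 0, 1, 0]),   # M4 = A22(B21-B11)
    ([1, 1, 0, 0],  [0, 0, 0, 1],  [-1, 1, 0, 0]),  # M5 = (A11+A12)B22
    ([-1, 0, 1, 0], [1, 1, 0, 0],  [0, 0, 0, 1]),   # M6 = (A21-A11)(B11+B12)
    ([0, 1, 0, -1], [0, 0, 1, 1],  [1, 0, 0, 0]),   # M7 = (A12-A22)(B21+B22)
]

residual = matmul_tensor(2)
for u, v, w in strassen:
    residual = play_factor(residual, np.array(u), np.array(v), np.array(w))

# After 7 moves the residual is the zero tensor: 7 multiplications suffice.
print(np.count_nonzero(residual))  # → 0
```

OpenTensor's search replaces the hand-written factor list with candidates proposed by the policy network and selected by MCTS, but the environment step is this same rank-one subtraction.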

Main Finding

The authors find that OpenTensor successfully replicates the implementation of AlphaTensor while improving the generation of synthetic demonstrations. OpenTensor efficiently decomposes tensors and discovers efficient matrix multiplication algorithms, and the improvements to the training process help the model converge better and faster, as demonstrated in the computational results.
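The synthetic demonstrations mentioned above can be sketched as follows: sample random factor triples, sum their rank-one products, and use the resulting tensor as a training target whose decomposition is known by construction. This is a minimal sketch; the sampling distribution and the value set used here are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_tensor(size, rank, values=(-1, 0, 1)):
    # Sample `rank` random (u, v, w) factor triples and sum their rank-one
    # products. The result has a known decomposition of at most `rank`
    # factors, so the factors can supervise the factor-selection policy.
    factors = [tuple(rng.choice(values, size=size) for _ in range(3))
               for _ in range(rank)]
    T = np.zeros((size, size, size), dtype=int)
    for u, v, w in factors:
        T += np.einsum('a,b,c->abc', u, v, w)
    return T, factors

T, demo = synthetic_tensor(size=4, rank=7)

# A demonstration replays its factors as ground-truth actions; subtracting
# them in any order reduces T to zero, which is the symmetry that order
# shuffling exploits to get more training targets from one sample.
residual = T.copy()
for u, v, w in demo:
    residual -= np.einsum('a,b,c->abc', u, v, w)
print(np.count_nonzero(residual))  # → 0
```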

Conclusion

The document concludes that OpenTensor successfully replicates the implementation of AlphaTensor and improves on its synthetic demonstrations: it efficiently decomposes tensors and discovers efficient matrix multiplication algorithms, and the improved training process helps the model converge better and faster, as demonstrated in the computational results. The authors also observe that OpenTensor is sensitive to hyperparameters, and that the hyperparameters from the original paper are close to optimal. Overall, the document highlights the successful development and performance of OpenTensor in discovering faster matrix multiplication algorithms.

Keywords

OpenTensor, AlphaTensor, Matrix Multiplication, Deep Reinforcement Learning, DRL, Tensor Decomposition, Synthetic Demonstrations, Monte Carlo Tree Search, Hyperparameters, Computational Results, Algorithm Pipeline, Basis Transformation, Action Canonicalization, Order Shuffling, Reproduction, Scientific Problem-solving, Neural Networks, Training Process, Redundancy, Convergence.
