ONNX — Optimization of Sentence Transformer (PyTorch) Models

ONNX Optimization of Sentence Transformer (PyTorch) Models to Minimize Computational Time

With the advancement of Machine Learning, models are becoming more complex, demanding powerful hardware and incurring high computational time. There is therefore a need to optimize these models for computational time across different devices. ONNX offers a range of options for handling complex Machine Learning/Deep Learning models (PyTorch and TensorFlow). In this article, I experiment with PyTorch-to-ONNX conversion for Sentence Transformer models to minimize computational time on CPU machines.

ONNX and Sentence Transformers

I recently faced a problem with hardware requirements and computational time for a Sentence Transformer model. After researching, exploring, and experimenting, I decided to write a blog post about my findings in the hope that they help someone. I tested a simple BERT model and a more complex sentence transformer model (all-MiniLM-L6-v2).

You can find the complete guide here
