BERT as a service

There are multiple ways to leverage the open-source BERT model for NLP work, for example via Hugging Face Transformers or spaCy transformers. I recently came across a post about running BERT as a service, and it turned out to be quite easy to set up.

If your pipeline requires efficient extraction of BERT features, this simple setup may save you dev/test time.

Create a conda environment if needed:

conda create --name bert
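
If you create a new environment, activate it before installing the packages below:

conda activate bert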

Install TensorFlow (CPU or GPU build; bert-serving requires a 1.x version):

pip3 install tensorflow-cpu==1.15 # version must be before 2.0

OR

pip3 install tensorflow-gpu==1.15 # version must be before 2.0

Install the bert-serving server and client:

pip3 install -U bert-serving-server bert-serving-client

Download a pre-trained BERT checkpoint and start the server, pointing -model_dir at the unzipped model directory (-num_worker sets the number of parallel workers):

bert-serving-start -model_dir /path_to_the_model/ -num_worker=2

For example, I used the whole-word-masking uncased BERT-Large model:

bert-serving-start -model_dir ../BERT_models/wwm_uncased_L-24_H-1024_A-16/ -num_worker=4
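
The -model_dir should point to an unzipped pre-trained checkpoint from the Google BERT release. As a rough sketch of what it contains (file names shown for the whole-word-masking uncased BERT-Large download; yours may differ):

wwm_uncased_L-24_H-1024_A-16/
    bert_config.json
    vocab.txt
    bert_model.ckpt.data-00000-of-00001
    bert_model.ckpt.index
    bert_model.ckpt.meta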

In Python 3, start using it by setting up a client:

from bert_serving.client import BertClient

client = BertClient()

vector = client.encode(['lovely portrait'])
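
As a small usage sketch (assuming the 1024-dimensional BERT-Large model started above, with the server on localhost using its default ports), the returned vectors are plain NumPy arrays and can be compared directly, for example with cosine similarity:

from bert_serving.client import BertClient
import numpy as np

client = BertClient()  # connects to the server on localhost:5555/5556 by default

# encode a small batch of sentences; shape is (num_sentences, 1024) for BERT-Large
vectors = client.encode(['lovely portrait', 'beautiful painting'])

# cosine similarity between the two sentence vectors
sim = np.dot(vectors[0], vectors[1]) / (np.linalg.norm(vectors[0]) * np.linalg.norm(vectors[1]))
print(vectors.shape, sim)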
