
label-studio-clip-ml-backend

This project provides a simple ML backend for Label Studio that assists you in annotating a new dataset using CLIP, specifically OWL-ViT. Because CLIP is trained on image–text pairs, there is no need to train a model such as YOLO: you simply declare your classes as text prompts, which is very convenient. For example ["a photo of a cow", "a photo of a chicken"].

demo.mp4
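
For illustration, here is a hedged sketch of how OWL-ViT can be queried with such text prompts through the Hugging Face transformers library. The checkpoint name, image path, and score threshold below are assumptions for the example, not values taken from this repository.

import torch
from PIL import Image
from transformers import OwlViTProcessor, OwlViTForObjectDetection

processor = OwlViTProcessor.from_pretrained("google/owlvit-base-patch32")
model = OwlViTForObjectDetection.from_pretrained("google/owlvit-base-patch32")

image = Image.open("farm.jpg")  # placeholder image path
texts = [["a photo of a cow", "a photo of a chicken"]]

inputs = processor(text=texts, images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert model outputs to boxes in pixel coordinates (xmin, ymin, xmax, ymax)
target_sizes = torch.tensor([image.size[::-1]])
results = processor.post_process_object_detection(outputs, threshold=0.1, target_sizes=target_sizes)[0]

for box, score, label in zip(results["boxes"], results["scores"], results["labels"]):
    print(texts[0][label], round(score.item(), 3), [round(v, 1) for v in box.tolist()])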

Clone the repository with submodules

git clone git@github.com:pavtiger/label-studio-clip-ml-backend.git --recursive

or just clone as usual and pull the submodules with this command

git submodule update --init --recursive

Installation

It is suggested to use a Python venv for installing the libraries

Create and activate the venv

mkdir venv
python -m venv ./venv
source venv/bin/activate

Install requirements

pip install transformers  # CLIP
pip install -U -e label-studio-ml-backend  # install label studio backend
pip install redis rq  # additional libraries for the backend
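
Depending on your system, OWL-ViT from transformers also needs PyTorch and Pillow at runtime; if they are not already installed, they can be added the same way (the exact versions and CUDA builds depend on your machine)

pip install torch pillow  # runtime dependencies used by OWL-ViT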

Running backend

label-studio-ml init ml_backend --script ./main.py --force
label-studio-ml start ml_backend

The ML backend server becomes available at http://localhost:9090

You can also specify a port for the web server

label-studio-ml start ml_backend --port 8080 
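
For reference, the script passed to --script defines a class inheriting from LabelStudioMLBase whose predict method returns results in Label Studio's prediction format. Below is a minimal hedged sketch of such a class; the from_name/to_name values ("label", "image") and the hard-coded box are placeholders, and this is not the repository's actual main.py.

from label_studio_ml.model import LabelStudioMLBase

class CLIPBackend(LabelStudioMLBase):
    def predict(self, tasks, **kwargs):
        predictions = []
        for task in tasks:
            # here one would run OWL-ViT on the task's image and collect detected boxes
            predictions.append({
                "result": [{
                    "from_name": "label",   # RectangleLabels tag name in the labeling config
                    "to_name": "image",     # Image tag name in the labeling config
                    "type": "rectanglelabels",
                    "value": {
                        # coordinates are percentages of image width/height
                        "x": 10, "y": 20, "width": 30, "height": 40,
                        "rectanglelabels": ["a photo of a cow"],
                    },
                }],
                "score": 0.9,
            })
        return predictions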

Connecting to ML backend

Add an ML backend using the Label Studio UI

  • In the Label Studio UI, open the project that you want to use with your ML backend.
  • Click Settings > Machine Learning.
  • Click Add Model.
  • Type a Title for the model and provide the URL for the ML backend. For example, http://localhost:9090.
  • (Optional) Type a description.
  • (Optional) Select Use for interactive preannotation. See Get interactive pre-annotations for more.
  • Click Validate and Save.

The connection instructions above are taken from the Label Studio website.
