Deepak Sahu committed
Commit 9affe81 · unverified · 1 Parent(s): 81388e4

final final

Files changed (1): README.md (+4 -3)
README.md CHANGED
@@ -31,7 +31,8 @@ Try it out: https://huggingface.co/spaces/LunaticMaestro/book-recommender
 
 ## Table of Content
 
-- [Running Inference Locally](#libraries-execution)
+- [Running Inference Locally](#running-inference)
+- [Colab 🏎️ & minimal set up](#goolge-colab)
 - [10,000 feet Approach overview](#approach)
 - Pipeline walkthrough in detail
 
@@ -47,7 +48,7 @@ Try it out: https://huggingface.co/spaces/LunaticMaestro/book-recommender
 
 - [Evaluation Metric & Result](#evaluation-metric--result)
 
-## Running Inference Locally
+## Running Inference
 
 ### Memory Requirements
 
@@ -59,7 +60,7 @@ The code need <2Gb RAM to use both the following. Just CPU works fine for infere
 
 ### Libraries
 
-I used google colab (CPU Only) with following libraries extra.
+`requirements.txt` is set up such that HF can best not create conflict. I developed the code in google colab with following libraries that required manual installation.
 
 ```SH
 pip install sentence-transformers datasets gradio
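
For reference, a minimal sketch of how one might run this Space locally after installing the libraries above. The `app.py` entry point is an assumption (the usual Gradio Space convention) and is not confirmed by this commit.

```SH
# Hypothetical local run; assumes the Space follows the standard Gradio layout with app.py as its entry point
git clone https://huggingface.co/spaces/LunaticMaestro/book-recommender
cd book-recommender
pip install sentence-transformers datasets gradio
python app.py   # Gradio serves on http://127.0.0.1:7860 by default
```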