Hady Rashwan committed
Commit f962791 · 1 Parent(s): dcce17e
README.md CHANGED
@@ -9,7 +9,7 @@ app_file: app.py
 pinned: true
 license: mit
 inference_file: weather_clothing_suggestion.py
-base_model: mistralai/Mistral-7B-Instruct-v0.1
+base_model: mistralai/Mistral-7B-Instruct-v0.3
 pipeline_tag: text-generation
 inference: true
 ---
@@ -39,88 +39,3 @@ To run the Streamlit app locally:
 streamlit run app.py
 ```
 
-## Weather-based Clothing Suggestion Model
-
-This model provides clothing suggestions based on given weather conditions. It uses the Mistral-7B-Instruct model to generate appropriate clothing recommendations.
-
-### API Usage
-
-To use this model via the Hugging Face Inference API, send a POST request with weather data in the following format:
-
-```json
-{
-  "weather_data": {
-    "temperature": 20,
-    "weather": "Sunny",
-    "description": "clear sky",
-    "humidity": 60,
-    "wind_speed": 5
-  }
-}
-```
-
-The model will return a clothing suggestion based on the provided weather conditions.
-
-### Example
-
-Input:
-```json
-{
-  "weather_data": {
-    "temperature": 20,
-    "weather": "Sunny",
-    "description": "clear sky",
-    "humidity": 60,
-    "wind_speed": 5
-  }
-}
-```
-
-Output:
-```json
-{
-  "clothing_suggestion": "For a sunny day with a temperature of 20°C, clear skies, 60% humidity, and a light breeze of 5 m/s, I would suggest the following outfit:\n\nTop: A light, breathable short-sleeved t-shirt or a casual button-up shirt in a light color to reflect the sun.\n\nBottom: Comfortable khaki shorts or a light pair of jeans, depending on your preference and activities planned for the day.\n\nDon't forget to bring a light jacket or sweater in case the temperature drops later in the day, especially if you plan to be out in the evening."
-}
-```
-
-### Example Python Code
-
-Here's an example of how to call the model using Python and the `requests` library:
-
-```python
-import requests
-
-API_URL = "https://api-inference.huggingface.co/models/hadirashwan/wear_what_clothing_suggestion"
-headers = {"Authorization": f"Bearer YOUR_HUGGINGFACE_API_TOKEN"}
-
-def query(payload):
-    response = requests.post(API_URL, headers=headers, json=payload)
-    return response.json()
-
-weather_data = {
-    "temperature": 20,
-    "weather": "Sunny",
-    "description": "clear sky",
-    "humidity": 60,
-    "wind_speed": 5
-}
-
-output = query({"weather_data": weather_data})
-print(output)
-```
-
-## Repository Structure
-
-- `app.py`: The main Streamlit application file
-- `weather_clothing_suggestion.py`: The inference script for the clothing suggestion model
-- `requirements.txt`: List of Python dependencies
-- Other necessary files and assets
-
-## Setup and Deployment
-
-1. Ensure all files are in the root of your Hugging Face repository.
-2. Set up the necessary secrets and environment variables in your Hugging Face space settings.
-3. The Space will automatically deploy the Streamlit app based on the `app_file` specified in the YAML header.
-4. The model inference will use the script specified by `inference_file` in the YAML header.
-
-For any issues or suggestions, please open an issue in this repository.
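The removed README section documented a fixed JSON payload shape. For reference, a minimal sketch that builds and validates such a payload on the client side; `build_payload` and `REQUIRED_KEYS` are illustrative helpers, not part of the repo, with key names taken from the removed docs:

```python
import json

# Key names from the removed README payload example.
REQUIRED_KEYS = {"temperature", "weather", "description", "humidity", "wind_speed"}

def build_payload(weather_data: dict) -> str:
    # Fail fast if a key the prompt template expects is missing.
    missing = REQUIRED_KEYS - weather_data.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    return json.dumps({"weather_data": weather_data})

payload = build_payload({
    "temperature": 20,
    "weather": "Sunny",
    "description": "clear sky",
    "humidity": 60,
    "wind_speed": 5,
})
```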
app.py CHANGED
@@ -18,7 +18,6 @@ SUPABASE_KEY = os.getenv("SUPABASE_KEY")
 
 # Initialize the Hugging Face Inference Client
 
-llvm_model_url = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3/v1/chat/completions"
 
 
 # Initialize Supabase
@@ -27,6 +26,7 @@ supabase: Client = create_client(SUPABASE_URL, SUPABASE_KEY)
 model = SentenceTransformer('thenlper/gte-small')
 
 def call_llvm_model(prompt):
+    llvm_model_url = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3/v1/chat/completions"
     payload = {
         "model": "mistralai/Mistral-7B-Instruct-v0.3",
         "messages": [
@@ -101,8 +101,18 @@ def get_ai_weather_explanation(weather_data):
     return call_llvm_model(prompt)
 
 def get_relevant_quote(weather_condition):
-    # Encode the weather condition
-    weather_embedding = model.encode(weather_condition).tolist()
+
+    url = "https://api-inference.huggingface.co/models/mixedbread-ai/mxbai-embed-large-v1"
+
+    payload = {"inputs": weather_condition}
+    headers = {
+        "content-type": "application/json",
+        "Authorization": f"Bearer {HF_API_KEY}"
+    }
+
+    response = requests.post(url, json=payload, headers=headers)
+
+    weather_embedding = response.json()
 
     response = supabase.rpc("match_quote_embeddings", {
         'query_embedding': weather_embedding,
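The hunk above swaps local `model.encode(...)` for a call to the hosted mxbai-embed-large-v1 feature-extraction endpoint. A sketch that builds the same request without sending it, so the payload and headers can be inspected offline; `build_embedding_request` is a hypothetical helper, and the token is a dummy:

```python
# URL as used in the rewritten get_relevant_quote.
EMBED_URL = "https://api-inference.huggingface.co/models/mixedbread-ai/mxbai-embed-large-v1"

def build_embedding_request(weather_condition: str, api_key: str) -> dict:
    # Mirrors the kwargs passed to requests.post in app.py.
    return {
        "url": EMBED_URL,
        "json": {"inputs": weather_condition},
        "headers": {
            "content-type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    }

req = build_embedding_request("light rain", "hf_dummy_token")
```

Keeping request construction separate from I/O like this also makes the function easy to unit-test without hitting the API.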
create_table.sql CHANGED
@@ -5,5 +5,5 @@ CREATE TABLE public.quote_embeddings (
     chunk_id TEXT,
     quote_number TEXT,
     quote_text TEXT,
-    embedding VECTOR(384)
+    embedding VECTOR(1024)
 );
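The column bump from `VECTOR(384)` to `VECTOR(1024)` tracks the embedding-model swap: `thenlper/gte-small` emits 384-dimensional vectors, while `mixedbread-ai/mxbai-embed-large-v1` emits 1024. pgvector rejects inserts whose length differs from the declared dimension, so a client-side guard gives a clearer error. A minimal sketch (`check_embedding` is a hypothetical helper, not in the repo):

```python
EXPECTED_DIM = 1024  # must match VECTOR(1024) in create_table.sql

def check_embedding(embedding: list) -> list:
    # Fail fast with a readable message instead of a pgvector insert error.
    if len(embedding) != EXPECTED_DIM:
        raise ValueError(f"expected {EXPECTED_DIM} dims, got {len(embedding)}")
    return embedding

ok = check_embedding([0.0] * 1024)
```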
matching_documents.sql CHANGED
@@ -1,4 +1,4 @@
-create or replace function match_handbook_docs (
+create or replace function match_quote_embeddings (
   query_embedding vector(1024),
   match_threshold float,
   match_count int
@@ -11,11 +11,11 @@ returns table (
 language sql stable
 as $$
   select
-    handbook_docs.id,
-    handbook_docs.content,
-    1 - (handbook_docs.embedding <=> query_embedding) as similarity
-  from handbook_docs
-  where 1 - (handbook_docs.embedding <=> query_embedding) > match_threshold
-  order by (handbook_docs.embedding <=> query_embedding) asc
+    quote_embeddings.id,
+    quote_embeddings.quote_text,
+    1 - (quote_embeddings.embedding <=> query_embedding) as similarity
+  from quote_embeddings
+  where 1 - (quote_embeddings.embedding <=> query_embedding) > match_threshold
+  order by (quote_embeddings.embedding <=> query_embedding) asc
   limit match_count;
 $$;
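In the function above, pgvector's `<=>` operator returns cosine *distance*, so `1 - (embedding <=> query_embedding)` is plain cosine similarity. A minimal Python analogue of that score, for intuition (pure stdlib, not how the query actually executes):

```python
import math

def cosine_similarity(a, b):
    # Equivalent of 1 - (a <=> b) under pgvector's cosine-distance operator.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Parallel vectors score ~1.0, orthogonal vectors ~0.0.
sim_parallel = cosine_similarity([1.0, 2.0], [2.0, 4.0])
sim_orthogonal = cosine_similarity([1.0, 0.0], [0.0, 1.0])
```

Ordering by `embedding <=> query_embedding asc` therefore returns the most similar quotes first.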
setup_db.py CHANGED
@@ -2,7 +2,6 @@ import os
 from langchain.text_splitter import RecursiveCharacterTextSplitter
 from sentence_transformers import SentenceTransformer
 from supabase import create_client, Client
-from postgrest.exceptions import APIError
 
 # Load environment variables
 from dotenv import load_dotenv
@@ -14,7 +13,7 @@ supabase_key = os.getenv("SUPABASE_KEY")
 supabase: Client = create_client(supabase_url, supabase_key)
 
 # Initialize SentenceTransformer
-model = SentenceTransformer('thenlper/gte-small')
+model = SentenceTransformer('mixedbread-ai/mxbai-embed-large-v1')
 
 def process_text_file(file_path: str):
     # Read the file
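The ingest flow this script implies is chunk → embed → insert rows matching the `quote_embeddings` schema. A dependency-free sketch under stated assumptions: `chunk_text` is a naive fixed-size splitter standing in for `RecursiveCharacterTextSplitter`, and `embed_stub` stands in for `model.encode(...)`:

```python
def chunk_text(text: str, chunk_size: int = 100) -> list:
    # Naive stand-in for RecursiveCharacterTextSplitter: fixed-width slices.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

def embed_stub(chunk: str) -> list:
    # Stand-in for model.encode(chunk).tolist(); mxbai-embed-large-v1 is 1024-dim.
    return [0.0] * 1024

# Rows shaped like the quote_embeddings table in create_table.sql.
rows = [
    {"chunk_id": str(i), "quote_text": chunk, "embedding": embed_stub(chunk)}
    for i, chunk in enumerate(chunk_text("a" * 250))
]
```

The real script would then insert each row via the Supabase client.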
weather_clothing_suggestion.py CHANGED
@@ -1,9 +1,30 @@
-from transformers import AutoTokenizer, AutoModelForCausalLM
+import requests
+import os
+
+HF_API_KEY = os.getenv("HUGGINGFACE_API_KEY")
+
+def call_llvm_model(prompt):
+    llvm_model_url = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3/v1/chat/completions"
+    payload = {
+        "model": "mistralai/Mistral-7B-Instruct-v0.3",
+        "messages": [
+            {
+                "role": "user",
+                "content": prompt,
+            }
+        ],
+        "max_tokens": 500,
+        "stream": False
+    }
+    headers = {
+        "Authorization": f"Bearer {HF_API_KEY}",
+        "content-type": "application/json"
+    }
+
+    response = requests.post(llvm_model_url, json=payload, headers=headers)
+
+    response = response.json()
+    return response['choices'][0]['message']['content']
 
-# Load model and tokenizer
-model_name = "mistralai/Mistral-7B-Instruct-v0.1"
-tokenizer = AutoTokenizer.from_pretrained(model_name)
-model = AutoModelForCausalLM.from_pretrained(model_name)
 
 def generate_clothing_suggestion(weather_data):
     prompt = f"""
@@ -16,11 +37,7 @@ def generate_clothing_suggestion(weather_data):
     Suggest appropriate clothing to wear, including top and bottom.
     """
 
-    inputs = tokenizer(prompt, return_tensors="pt")
-    outputs = model.generate(**inputs, max_new_tokens=150, temperature=0.7, top_k=50, top_p=0.95)
-    suggestion = tokenizer.decode(outputs[0], skip_special_tokens=True)
-
-    return suggestion.split(prompt)[-1].strip()
+    return call_llvm_model(prompt)
 
 def inference(weather_data):
     try:
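The new `call_llvm_model` unpacks `response['choices'][0]['message']['content']`, the OpenAI-style chat-completions shape this endpoint returns. A sketch exercising that extraction on a canned response, so the parsing can be verified without a network call (`extract_content` and the fake payload are illustrative):

```python
def extract_content(response: dict) -> str:
    # Same field path call_llvm_model uses on the JSON body.
    return response["choices"][0]["message"]["content"]

# Canned response in the chat-completions shape.
fake_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Wear a light jacket."}}
    ]
}
suggestion = extract_content(fake_response)
```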