RaushanTurganbay (HF Staff) committed
Commit cc1dff4 · verified · 1 Parent(s): 4474658

Update pipeline example

Files changed (1)
  1. README.md +22 -1
README.md CHANGED
@@ -36,7 +36,28 @@ other versions on a task that interests you.
 
 ### How to use
 
-You can load and use the model like following:
+To run the model with the `pipeline`, see the example below:
+
+```python
+from transformers import pipeline
+
+pipe = pipeline("image-text-to-text", model="llava-hf/llava-next-72b-hf")
+messages = [
+    {
+        "role": "user",
+        "content": [
+            {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"},
+            {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
+        ],
+    },
+]
+
+out = pipe(text=messages, max_new_tokens=20)
+print(out)
+>>> [{'input_text': [{'role': 'user', 'content': [{'type': 'image', 'url': 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg'}, {'type': 'text', 'text': 'What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud'}]}], 'generated_text': 'Lava'}]
+```
+
+You can also load and use the model directly, as follows:
 ```python
 from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
 import torch
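
The hunk ends at the imports; the remainder of the direct-usage snippet is unchanged context and is not shown in this diff. For reference, a minimal sketch of that direct-loading path, assuming the standard `LlavaNextProcessor` / `LlavaNextForConditionalGeneration` API — the prompt, dtype, and generation settings below are illustrative assumptions, not content from this commit:

```python
import requests
import torch
from PIL import Image
from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration

model_id = "llava-hf/llava-next-72b-hf"
processor = LlavaNextProcessor.from_pretrained(model_id)
model = LlavaNextForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build a chat-formatted prompt containing one image placeholder.
conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
        ],
    },
]
prompt = processor.apply_chat_template(conversation, add_generation_prompt=True)

# Fetch the same demo image used in the pipeline example above.
url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Preprocess, generate, and decode.
inputs = processor(images=image, text=prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=20)
print(processor.decode(output[0], skip_special_tokens=True))
```

The `pipeline` route shown in the commit wraps these same steps (chat templating, image preprocessing, `generate`, and decoding) behind a single call.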