
Copied from https://huggingface.co/susnato/phi-2 at commit 9070ddb4fce238899ddbd2aca1faf6a0aeb6e444.

This model can be loaded with Hugging Face transformers at commit 4ab5fb8941a38d172b3883c152c34ae2a0b83a68.
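If you need exactly that revision of transformers, one option is to install it directly from the pinned commit on GitHub (a sketch; the commit hash is the one listed above):

pip install git+https://github.com/huggingface/transformers.git@4ab5fb8941a38d172b3883c152c34ae2a0b83a68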

Below is the original introduction, which may be outdated by now.


DISCLAIMER: I do not own the weights of this model; they are the property of Microsoft and were taken from the official repository: microsoft/phi-2. The sole purpose of this repository is to make the model loadable and usable through the Hugging Face transformers library and its API.

Usage

First, make sure you have the latest version of the transformers library installed.

pip install -U transformers

Then use the transformers library to load the model and tokenizer:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("susnato/phi-2")
tokenizer = AutoTokenizer.from_pretrained("susnato/phi-2")

# Tokenize a code-completion prompt
inputs = tokenizer('''def print_prime(n):
   """
   Print all primes between 1 and n
   """''', return_tensors="pt", return_attention_mask=False)

# Generate up to 200 tokens and decode the result
outputs = model.generate(**inputs, max_length=200)
text = tokenizer.batch_decode(outputs)[0]
print(text)
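If a GPU is available, the model can optionally be loaded in half precision to reduce memory use (a minimal sketch, not part of the original card; the model id is the same as above and the prompt is only illustrative):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: a CUDA GPU is available; fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the weights in float16 to roughly halve GPU memory use.
model = AutoModelForCausalLM.from_pretrained(
    "susnato/phi-2", torch_dtype=torch.float16
).to(device)
tokenizer = AutoTokenizer.from_pretrained("susnato/phi-2")

# Example prompt (hypothetical); move the tensors to the same device as the model.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt",
                   return_attention_mask=False).to(device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_length=100)
print(tokenizer.batch_decode(outputs)[0])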