---
library_name: setfit
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
metrics:
- accuracy
widget:
- text: Blockbuster Cuts Online Price, Challenges Netflix (Reuters) Reuters - Video chain Blockbuster Inc on Friday said it would lower the price of its online DVD rentals to undercut a similar move by Netflix Inc. that sparked a stock sell-off of both companies' shares.
- text: Goss Gets Senate Panel's OK for CIA Post (AP) AP - A Senate panel on Tuesday
approved the nomination of Rep. Porter Goss, R-Fla., to head the CIA, overcoming
Democrats' objections that Goss was too political for the job.
- text: 'Crazy Like a Firefox Today, the Mozilla Foundation #39;s Firefox browser
officially launched -- welcome, version 1.0. In a way, it #39;s much ado about
nothing, seeing how it wasn #39;t that long ago that we reported on how Mozilla
had set '
- text: North Korea eases tough stance against US in nuclear talks North Korea on
Friday eased its tough stance against the United States, saying it is willing
to resume stalled six-way talks on its nuclear weapons if Washington is ready
to consider its demands.
- text: Mauresmo confident of LA victory Amelie Mauresmo insists she can win the Tour
Championships this week and finish the year as world number one. The Frenchwoman
could overtake Lindsay Davenport with a win in Los Angeles.
pipeline_tag: text-classification
inference: true
base_model: sentence-transformers/paraphrase-mpnet-base-v2
model-index:
- name: SetFit with sentence-transformers/paraphrase-mpnet-base-v2
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.8636842105263158
name: Accuracy
---
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2
This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer, as sketched below.
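At inference time the two components are applied in sequence: the body embeds the text and the head classifies the embedding. The snippet below is a minimal sketch of that pipeline, not part of the original card; it assumes the body and head are exposed as `model_body` and `model_head` (as in recent `setfit` releases), and the example sentence is invented.

```python
from setfit import SetFitModel

# Minimal sketch of the two-stage SetFit pipeline. The `model_body` / `model_head`
# attribute names are an assumption about the setfit version in use.
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-ag_news")

texts = ["Chipmaker details its next-generation Itanium and Xeon processors."]

embeddings = model.model_body.encode(texts)    # step 1: contrastively fine-tuned sentence embeddings
labels = model.model_head.predict(embeddings)  # step 2: LogisticRegression classification
print(labels)                                  # integer label ids, e.g. [3]
```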
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 4 classes
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------|
| 2     | <ul><li>'It girl turned sole trader makes millions as Jimmy Choo deal is <b>...</b> Tamara Mellon, the former It girl and fashion PR, yesterday saw a 150,000 loan from her father turn into a 50m fortune when the Jimmy Choo shoes business was bought by the American '</li><li>'Airport stands to lose big if ATA departs With ATA Airlines having crash-landed into bankruptcy court last week, the future of the airlines Fort Wayne-to-Chicago air service isnt the only thing up in the air.'</li><li>'\\$149 million lottery winner: #39;No idea #39; what comes next Newly minted multimillionaire Juan Rodriguez, who won the \\$149 million Mega Millions lottery jackpot, said Sunday he hasn #39;t quit his job at a midtown Manhattan parking garage yet.'</li></ul> |
| 0     | <ul><li>"Musharraf ally wins prime minister vote Pakistan's parliament elected former finance minister Shaukat Aziz, a close ally of President Pervez Musharraf, as prime minister yesterday after the opposition boycotted the vote, saying it was undemocratic. Aziz, 55, worked 30 years as a Citibank executive before becoming finance minister when Musharraf took power in a bloodless military coup in 1999. Suave, articulate, and media-savvy, Aziz is credited ..."</li><li>'World ; Mexico Opposition Has Early Lead in State Voting Victories would put the PRI on track to win back the presidency after losing it in 2000 to President Vicente Fox #39;s center-right National Action Party (PAN) after 71 years of uninterrupted rule.'</li><li>'Schwarzenegger Lauds Bush on Terror Fight (AP) AP - California Gov. Arnold Schwarzenegger drew on his childhood in Soviet-occupied Austria to endorse President Bush\'s war on terror. "Terrorism is more insidious than communism," the bodybuilder-turned-politician said Tuesday in a speech to the Republican convention.'</li></ul> |
| 1     | <ul><li>'Security for Olympics Successful, Greek Defense Minister Says Greece #39;s defense minister says his country #39;s security preparations for the Athens Olympics have been so successful that it is now in a position to advise China, the host of the 2008 Summer Games, on how to avoid terrorist attacks. But the minister says ...'</li><li>'Gordo at Fenway: It #39;s slow-pitch softball with jacked-up fans First and foremost, the Red Sox can hit. They look like a slow-pitch softball team, with all that unruly hair, and they tee off on pitches like one.'</li><li>'Lara Proves the Undoing of Flintoff Brian Laras brilliance undermined Englands ICC Champions Trophy victory bid at the Oval this morning. West Indies captain Lara, who earlier won the toss, inflicted a major '</li></ul> |
| 3     | <ul><li>'Google and Mozilla Firefox Working Together Google have hosted a customized Internet Explorer search page for many years now specially designed to act as sidebar search. And with the launch of the first final release of the very popular Mozilla Firefox '</li><li>'Gaps in Intel Itanium, Xeon Road Maps During its bi-annual Developers Forum in San Francisco this week, the chipmaker focused on promoting its next-generation Itanium and Xeon processors.'</li><li>"Conway or the highway CNET News.com's Charles Cooper says the board is only telling half the story behind the surprise firing of PeopleSoft's CEO."</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.8637 |
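The card does not name the evaluation dataset, but the repository name suggests the `ag_news` test split. Under that assumption, the accuracy above could be checked along the following lines (a sketch, not the authors' evaluation script):

```python
from datasets import load_dataset
from setfit import SetFitModel

# Assumption: evaluation on the ag_news test split (the card lists the dataset as unknown).
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-ag_news")
test_ds = load_dataset("ag_news", split="test")

preds = model.predict(test_ds["text"])
accuracy = sum(int(p) == int(y) for p, y in zip(preds, test_ds["label"])) / len(test_ds)
print(f"accuracy: {accuracy:.4f}")
```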
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference.
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("vidhi0206/setfit-paraphrase-mpnet-ag_news")
# Run inference
preds = model("Mauresmo confident of LA victory Amelie Mauresmo insists she can win the Tour Championships this week and finish the year as world number one. The Frenchwoman could overtake Lindsay Davenport with a win in Los Angeles.")
```
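The prediction is an integer label id (0-3, see the label table above). Continuing from the snippet above, batch predictions and per-class probabilities can be obtained roughly as follows; `predict_proba` is available here because the head is a scikit-learn `LogisticRegression`, though exact return types can vary between `setfit` versions:

```python
# Batch prediction and per-class probabilities (sketch; reuses `model` from above).
texts = [
    "Goss Gets Senate Panel's OK for CIA Post",
    "Mauresmo confident of LA victory",
]
preds = model.predict(texts)        # e.g. [0, 1] - integer label ids
probs = model.predict_proba(texts)  # one probability per class for each text
```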
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:--------|:----|
| Word count   | 22  | 36.1562 | 67  |

| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 8 |
| 1 | 8 |
| 2 | 8 |
| 3 | 8 |
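For reference, a balanced 8-shot split like the one summarized above can be drawn with `setfit`'s `sample_dataset` helper. Using `ag_news` below is an assumption; the card does not name the training data:

```python
from datasets import load_dataset
from setfit import sample_dataset

# Illustration only: sample 8 training examples per label (4 labels x 8 = 32 texts).
full_train = load_dataset("ag_news", split="train")
train_dataset = sample_dataset(full_train, label_column="label", num_samples=8)
print(train_dataset)
```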
### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (1, 1)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 20
- body_learning_rate: (2e-05, 2e-05)
- head_learning_rate: 2e-05
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
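As a rough guide, these values map onto `setfit`'s `TrainingArguments` as sketched below. This is not the authors' training script: the base model is taken from the Model Details section, the training split is the hypothetical 8-shot `ag_news` sample from above, and any parameter not set explicitly is left at its default (which matches the remaining values listed, e.g. `CosineSimilarityLoss` and `margin: 0.25`).

```python
from datasets import load_dataset
from setfit import SetFitModel, Trainer, TrainingArguments, sample_dataset

# Sketch of a training run with the hyperparameters listed above.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
train_dataset = sample_dataset(
    load_dataset("ag_news", split="train"), label_column="label", num_samples=8
)

args = TrainingArguments(
    batch_size=(8, 8),              # (embedding phase, classifier phase)
    num_epochs=(1, 1),
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 2e-05),
    head_learning_rate=2e-05,
    warmup_proportion=0.1,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```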
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.0063 | 1 | 0.2415 | - |
| 0.3125 | 50 | 0.0396 | - |
| 0.625 | 100 | 0.0008 | - |
| 0.9375 | 150 | 0.0002 | - |
### Framework Versions
- Python: 3.8.10
- SetFit: 1.0.3
- Sentence Transformers: 2.3.1
- Transformers: 4.37.2
- PyTorch: 2.2.0+cu121
- Datasets: 2.17.0
- Tokenizers: 0.15.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```