Create Transformers and Pretrained models.py
pages/Transformers and Pretrained models.py
ADDED
@@ -0,0 +1,82 @@
import streamlit as st
from transformers import pipeline
import torch

# Page Config
st.set_page_config(page_title='Transformers in NLP', layout='wide')

# Cache model loads so Streamlit does not re-download and re-initialize
# the weights on every rerun of the script.
@st.cache_resource
def load_pipeline(task, model):
    return pipeline(task, model=model)

# Page Title
st.markdown('<h1 style="color:#4CAF50; text-align:center;">Transformers & Pretrained Models in NLP</h1>', unsafe_allow_html=True)

# Transformer Architecture
st.markdown('<h2 style="color:#FF5733">1. Transformer Architecture</h2>', unsafe_allow_html=True)

st.subheader('Definition:')
st.write("""
The **Transformer architecture** revolutionized NLP by using **self-attention** to process whole sequences in parallel rather than token by token.
- **Self-attention** lets each token weigh every other token in the sequence by relevance when building its representation (see the sketch below).
- The **encoder-decoder** structure is used in sequence-to-sequence tasks like translation.

Introduced in "**Attention Is All You Need**" (Vaswani et al., 2017).
""")

st.subheader('Key Components:')
st.write("""
- **Encoder**: Maps input tokens to contextual internal representations.
- **Decoder**: Attends to the encoder outputs to generate predictions token by token.
- **Multi-head Attention**: Runs several attention heads in parallel, each free to focus on different relationships in the sequence.
- **Positional Encoding**: Injects sequence order into the embeddings, since attention by itself is order-agnostic.
""")

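# Illustrative sketch: the core scaled dot-product attention computation in
# plain PyTorch, displayed with st.code. A minimal teaching example, not the
# transformers library's internal implementation.
st.subheader('Example: Scaled Dot-Product Attention')
st.code('''
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.size(-1)
    scores = Q @ K.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)  # each row: one token's focus over the sequence
    return weights @ V

# Toy input: batch of 1, sequence of 4 tokens, embedding size 8
x = torch.randn(1, 4, 8)
print(scaled_dot_product_attention(x, x, x).shape)  # torch.Size([1, 4, 8])
''', language='python')
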
# Pretrained Models
st.markdown('<h2 style="color:#3E7FCB">2. Pretrained Models</h2>', unsafe_allow_html=True)

st.subheader('Definition:')
st.write("""
Pretrained models learn general language patterns from vast corpora and transfer that knowledge to downstream tasks.
- **BERT**: Bidirectional context learning for a wide range of NLP tasks.
- **GPT**: Autoregressive (left-to-right) modeling, geared toward text generation.
- **RoBERTa**: A BERT variant with a robustly optimized pretraining recipe.
- **T5**: Casts every task as text-to-text.
- **XLNet**: Permutation-based language modeling that captures bidirectional context.

Loading any of them by name takes only a few lines, as shown below.
""")

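# Illustrative snippet shown to the reader; bert-base-uncased is just an
# example checkpoint, and any Hub model name works the same way.
st.subheader('Example: Loading a Pretrained Model')
st.code('''
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Transformers process text in parallel.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size)
''', language='python')
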
# Sentiment Analysis Example
st.subheader('Pretrained Model Example: Sentiment Analysis')
# bert-base-uncased has no sentiment head, so use a checkpoint fine-tuned for sentiment.
nlp = load_pipeline("sentiment-analysis", "distilbert-base-uncased-finetuned-sst-2-english")
text = st.text_area("Enter text to analyze", "Transformers are amazing!")
if st.button('Analyze Sentiment'):
    result = nlp(text)[0]  # the pipeline returns a list of {'label', 'score'} dicts
    st.write(f"**Result:** {result['label']} (confidence: {result['score']:.2f})")

# Fine-tuning Pretrained Models
st.markdown('<h2 style="color:#E67E22">3. Fine-tuning Pretrained Models</h2>', unsafe_allow_html=True)

st.subheader('Definition:')
st.write("""
Fine-tuning tailors a pretrained model to a specific NLP task by continuing training on labeled task data, for example:
- **Sentiment Analysis**: Classifies the sentiment of a text.
- **Named Entity Recognition (NER)**: Detects names, locations, organizations.
- **Question Answering**: Extracts answers from a given context.

A minimal fine-tuning skeleton is sketched below.
""")

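# Hedged sketch of fine-tuning with the Trainer API; output_dir "out" and
# train_dataset are placeholders to be replaced with a real tokenized,
# labeled dataset.
st.subheader('Example: Fine-tuning Skeleton')
st.code('''
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="out", num_train_epochs=3,
                         per_device_train_batch_size=16)

# train_dataset: a placeholder for any dataset of tokenized, labeled examples
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
''', language='python')
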
# NER Example
st.subheader('Named Entity Recognition (NER)')
nlp_ner = load_pipeline("ner", "dbmdz/bert-large-cased-finetuned-conll03-english")
text_ner = st.text_area("Enter text for NER", "Barack Obama was born in Hawaii.")
if st.button('Perform NER'):
    ner_results = nlp_ner(text_ner)
    st.write("**NER Results:**")
    for entity in ner_results:
        st.write(f"{entity['word']} - {entity['entity']} - Confidence: {entity['score']:.2f}")

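# Note: the raw "ner" pipeline emits one row per sub-word piece; passing
# aggregation_strategy="simple" when building the pipeline merges the pieces
# into whole entities (results then carry 'entity_group' instead of 'entity').
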
# Question Answering Example
st.subheader('Question Answering with BERT')
nlp_qa = load_pipeline("question-answering", "bert-large-uncased-whole-word-masking-finetuned-squad")
context = st.text_area("Enter context", "Transformers revolutionized NLP with parallel processing.")
question = st.text_input("Ask a question", "What did transformers revolutionize?")
if st.button('Get Answer'):
    answer = nlp_qa(question=question, context=context)
    st.write(f"**Answer:** {answer['answer']} (confidence: {answer['score']:.2f})")

st.markdown('<h3 style="color:#4CAF50; text-align:center;">Thanks for Exploring NLP!</h3>', unsafe_allow_html=True)