Model Card for Model ID
Works as a cybersecurity assistant.
Model Details
Model Description
This is the model card of a 🤗 transformers model that has been pushed to the Hub.
- Developed by: Zardian18
- Model type: GPT2
- Language(s) (NLP): English
- Finetuned from model: OpenAI GPT-2
Model Sources
- Repository: GitHub repo
Uses
Can be used to handle basic cybersecurity queries and answer them directly, rather than beating around the bush.
Bias, Risks, and Limitations
This model is fine-tuned from GPT-2, which is capable but not comparable to state-of-the-art LLMs. In addition, the training dataset is small, so predictions are not always accurate, and there are cases where the model does not respond directly to the question.
Recommendations
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
How to Get Started with the Model
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="Zardian/Cyber_assist2.0")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Zardian/Cyber_assist2.0")
model = AutoModelForCausalLM.from_pretrained("Zardian/Cyber_assist2.0")
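A minimal generation call with the pipeline might look like this (the prompt and sampling settings are illustrative, not from the card):

```python
from transformers import pipeline

# Load the model from the Hub (downloads weights on first use)
pipe = pipeline("text-generation", model="Zardian/Cyber_assist2.0")

# Example cybersecurity query; generation settings are illustrative
out = pipe(
    "How do I create a strong password?",
    max_new_tokens=50,
    do_sample=True,
)
print(out[0]["generated_text"])
```

The pipeline returns a list of dicts, each with a `generated_text` key containing the prompt plus the continuation.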
Training Details
Training Data
A cybersecurity queries-and-responses dataset consisting of 12,408 rows and 2 columns (query and response).
Training Hyperparameters
- Training regime:
  - Block size = 128
  - Epochs = 10
  - Batch size = 16
  - Save steps = 5000
  - Save total limit = 3
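The hyperparameters above would map roughly onto Hugging Face `TrainingArguments` as follows (a sketch, assuming the standard `Trainer` API was used; the output directory name is hypothetical):

```python
from transformers import TrainingArguments

# Sketch of the training configuration described in this card.
# "Block size" (128) is a tokenization/dataset setting, not a
# TrainingArguments field, so it is shown only as a constant here.
BLOCK_SIZE = 128  # maximum sequence length per training example

args = TrainingArguments(
    output_dir="cyber_assist",          # hypothetical name
    num_train_epochs=10,                # Epochs = 10
    per_device_train_batch_size=16,     # Batch size = 16
    save_steps=5000,                    # checkpoint every 5000 steps
    save_total_limit=3,                 # keep at most 3 checkpoints
)
```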
Speeds, Sizes, Times
Training time: 1 hr 11 min 58 sec
Evaluation
Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: Tesla T4 GPU
- Cloud Provider: Google Colab
- Compute Region: Asia
- Carbon Emitted: 0.08 kg of CO2eq
Technical Specifications
Objective
To build an assistant that can provide solutions to cybersecurity-related queries.