bal_arxiv_scientific_paps_berttopic_model
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
Usage
To use this model, please install BERTopic:
pip install -U bertopic
You can use the model as follows:
from bertopic import BERTopic
# Load the trained topic model from the Hugging Face Hub
topic_model = BERTopic.load("Rchamba/bal_arxiv_scientific_paps_berttopic_model")
# Overview of the discovered topics, their sizes, and representative keywords
topic_model.get_topic_info()
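Beyond inspecting the topic overview, the loaded model can assign topics to unseen documents with transform. The snippet below is a minimal sketch; the two example abstracts are placeholders, not documents from the training set.
new_docs = [
    "We study quantum finite automata and their classical simulation.",
    "A kernel-based method for classification on small datasets.",
]
# transform returns one topic id per document and, since the model was trained
# with calculate_probabilities=True, a full topic-probability distribution
topics, probs = topic_model.transform(new_docs)
print(topics)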
Topic overview
- Number of topics: 14
- Number of training documents: 360
The table below gives an overview of all topics.
Topic ID | Topic Keywords | Topic Frequency | Label
---|---|---|---
-1 | data - steganography - secret - probability - method | 13 | -1_data_steganography_secret_probability |
0 | sp - intelligence - processing - human - image | 35 | 0_sp_intelligence_processing_human |
1 | quantum - automata - finite - classical - measurement | 64 | 1_quantum_automata_finite_classical |
2 | problems - complexity - constraints - symmetry - csps | 37 | 2_problems_complexity_constraints_symmetry |
3 | logic - computability - cl - edu - www | 25 | 3_logic_computability_cl_edu |
4 | science - citation - journals - social - communication | 24 | 4_science_citation_journals_social |
5 | tetraquark - vector - bar - rm - qcd | 23 | 5_tetraquark_vector_bar_rm |
6 | combinatorial - problems - design - problem - clustering | 22 | 6_combinatorial_problems_design_problem |
7 | prediction - entropy - model - universal - cc | 22 | 7_prediction_entropy_model_universal |
8 | notes - informal - spaces - analysis - metric | 21 | 8_notes_informal_spaces_analysis |
9 | orbital - earth - postnewtonian - effects - artificial | 21 | 9_orbital_earth_postnewtonian_effects |
10 | keyphrases - word - algorithm - semantic - similarity | 20 | 10_keyphrases_word_algorithm_semantic |
11 | kernel - gmm - kernels - datasets - classification | 17 | 11_kernel_gmm_kernels_datasets |
12 | data - ultrametric - ultrametricity - analysis - application | 16 | 12_data_ultrametric_ultrametricity_analysis |
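The weighted keywords behind any row of this table can be retrieved with get_topic. For example, for topic 1 (quantum / automata):
# Top keywords and their c-TF-IDF weights for topic 1
for word, weight in topic_model.get_topic(1):
    print(f"{word}: {weight:.4f}")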
Training hyperparameters
- calculate_probabilities: True
- language: english
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: None
- seed_topic_list: None
- top_n_words: 10
- verbose: False
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
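For reference, a BERTopic model with the hyperparameters listed above would be configured roughly as follows. This is a sketch under assumptions: the 360 training abstracts are not included in this card, and the embedding, UMAP, and HDBSCAN components are left at their defaults since the card does not specify them.
from bertopic import BERTopic

docs = [...]  # placeholder for the 360 arXiv abstracts used for training (not provided here)

topic_model = BERTopic(
    calculate_probabilities=True,
    language="english",
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=None,
    seed_topic_list=None,
    top_n_words=10,
    verbose=False,
    zeroshot_min_similarity=0.7,
    zeroshot_topic_list=None,
)
topics, probs = topic_model.fit_transform(docs)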
Framework versions
- Numpy: 2.0.2
- HDBSCAN: 0.8.40
- UMAP: 0.5.7
- Pandas: 2.2.2
- Scikit-Learn: 1.6.1
- Sentence-transformers: 4.1.0
- Transformers: 4.52.4
- Numba: 0.60.0
- Plotly: 5.24.1
- Python: 3.11.13