BERT-Conv Text Classification Model

Model Overview

Building on BERT, the model adds a multi-scale 1D convolution module: three convolution kernels of different sizes run in parallel to capture local semantic features, complementing the Transformer's global attention mechanism. See modeling.ipynb for the implementation.
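The authoritative code is modeling.ipynb; the following is only a minimal PyTorch sketch of the architecture described above. The base checkpoint name, kernel sizes (3/4/5), filter count, pooling strategy, and dropout are illustrative assumptions, not values taken from the repository.

```python
import torch
import torch.nn as nn
from transformers import AutoModel


class BertConvClassifier(nn.Module):
    """BERT encoder followed by parallel 1D convolutions over token states.

    Kernel sizes, filter count, and the base checkpoint are guesses for
    illustration; the actual configuration lives in modeling.ipynb.
    """

    def __init__(self, model_name="bert-base-chinese", num_labels=2,
                 kernel_sizes=(3, 4, 5), num_filters=128, dropout=0.1):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # One Conv1d per kernel size, applied in parallel along the sequence axis.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k, padding=k // 2) for k in kernel_sizes
        )
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_labels)

    def forward(self, input_ids, attention_mask=None):
        # (batch, seq_len, hidden) -> (batch, hidden, seq_len) for Conv1d.
        hidden_states = self.bert(
            input_ids, attention_mask=attention_mask
        ).last_hidden_state.transpose(1, 2)
        # Each branch: convolution + ReLU + global max pooling over the sequence.
        pooled = [torch.relu(conv(hidden_states)).max(dim=-1).values
                  for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=-1))
        return self.classifier(features)
```

In this sketch the convolutional branches act as local n-gram feature extractors on top of BERT's contextual token embeddings, and their pooled outputs are concatenated before the classification head.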

Web demo

Model size: 113M parameters (Safetensors, F32)
