PTHQL_level2_North_Sea_Germanic

This is the Level 2 North Sea Germanic Phylogenetic Tree Hierarchical QLoRA (PTHQL) adapter from "Generating from AMRs into High and Low-Resource Languages using Phylogenetic Knowledge and Hierarchical QLoRA Training (HQL)", used for AMR-to-Text generation.

Use

This model is Level 2 of 4 hierarchical LoRAs. It is strongly advisable to load all 4 LoRAs in order: Level 0 (Indo-European), Level 1 (Germanic), Level 2 (North Sea Germanic), and finally the language-level adapter (English), as in the code below.

The following is minimal code to generate English text from an AMR graph:

import torch  # needed for torch.inference_mode() below
from transformers import MT5ForConditionalGeneration, AutoTokenizer
from peft import PeftModel

# Load the base model and tokenizer
model = MT5ForConditionalGeneration.from_pretrained('google/mt5-large')
tokenizer = AutoTokenizer.from_pretrained('google/mt5-large')

# Level 0: Indo-European
model = PeftModel.from_pretrained(model, 'WilliamSotoM/PTHQL_level0_Indo_European')
model = model.merge_and_unload()

# Level 1: Germanic
model = PeftModel.from_pretrained(model, 'WilliamSotoM/PTHQL_level1_Germanic')
model = model.merge_and_unload()

# Level 2: North Sea Germanic (this adapter)
model = PeftModel.from_pretrained(model, 'WilliamSotoM/PTHQL_level2_North_Sea_Germanic')
model = model.merge_and_unload()

# Language level: English
model = PeftModel.from_pretrained(model, 'WilliamSotoM/PTHQL_language_English')
model = model.merge_and_unload()

graph = '''
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (b2 / believe-01
             :ARG0 (g / girl)
             :ARG1 b))
'''
tokenized_input = tokenizer(graph, return_tensors='pt')

with torch.inference_mode():
    prediction = model.generate(**tokenized_input)
    generated_text = tokenizer.batch_decode(prediction, skip_special_tokens=True)[0]

print(f'Generated text: {generated_text}')

Expected output:

The boy wants the girl to believe him.
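
If a GPU is available, the merged model can be moved to it before generation. The following is a minimal sketch of generation with explicit decoding parameters; the num_beams and max_new_tokens values are illustrative choices, not settings from the paper:

import torch

device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = model.to(device)

tokenized_input = tokenizer(graph, return_tensors='pt').to(device)

with torch.inference_mode():
    prediction = model.generate(
        **tokenized_input,
        num_beams=4,          # illustrative beam size
        max_new_tokens=128,   # generation length cap; adjust to graph size
    )

print(tokenizer.batch_decode(prediction, skip_special_tokens=True)[0])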