I am getting an error while trying to run Code Llama on macOS (M2)
#11 by Yerrramsetty - opened
Apple Mac Max, 96 GB

Apple M2 Max:
  Chipset Model: Apple M2 Max
  Type: GPU
  Bus: Built-In
  Total Number of Cores: 38
  Vendor: Apple (0x106b)
  Metal Support: Metal 3

Displays:
  Colour LCD:
    Display Type: Built-in Liquid Retina XDR Display
    Resolution: 3456x2234 Retina
    Main Display: Yes
    Mirror: Off
    Online: Yes
    Automatically Adjust Brightness: Yes
    Connection Type: Internal
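The hardware overview above is only half the picture; the Python version, OS release, and CPU architecture usually matter just as much for Apple Silicon issues. A stdlib-only sketch that collects those details (the function name and field names here are my own, not from any tool):

```python
import platform
import sys


def environment_summary() -> dict:
    """Collect basic platform details that are useful in a bug report."""
    return {
        "python": sys.version.split()[0],
        "os": platform.system(),  # e.g. "Darwin" on macOS
        "os_version": platform.mac_ver()[0] or platform.release(),
        "machine": platform.machine(),  # "arm64" on Apple Silicon
    }


if __name__ == "__main__":
    for key, value in environment_summary().items():
        print(f"{key}: {value}")
```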
We can't help you if you don't share the error that you are getting (the full traceback!).
Make sure to also share the output of `transformers-cli env`.
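If you are unsure how to capture the full traceback as text, a minimal sketch (`run_model()` is a placeholder for whatever code actually fails on your machine):

```python
import traceback


def run_model():
    # Placeholder for the code that actually fails on your machine.
    raise RuntimeError("example failure")


try:
    run_model()
except Exception:
    # Print the complete traceback; paste this output into the discussion.
    print(traceback.format_exc())
```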