The LLM-VM supports several default models, intended to make experimentation with LLMs accessible to everyone, but if you have the memory required, larger-parameter models will perform far better!

Here is an example of using a large and a small neo model as your teacher and student, assuming you have enough RAM:

Loading Non-Default Models

```python
# import our client
from llm_vm.client import Client

# Select the GPT-Neo models
client = Client(
    big_model = 'neo',
    small_model = 'neo',
)

# Put in your prompt and go!
response = client.complete(prompt = 'What is Anarchy?', context = '')
# Anarchy is a political philosophy that advocates no government...
```

Now, keep in mind that gpt-neox-20b is almost 42 GB in size and requires at least that much RAM to use, in addition to gpt-neo-125m, which is another ~0.5 GB.
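If you want to check before loading, here is a minimal sketch of a pre-flight RAM check. The `has_enough_ram` helper is hypothetical (not part of the LLM-VM API), and the `os.sysconf` keys used are POSIX/Linux-specific:

```python
import os

def has_enough_ram(required_gb: float) -> bool:
    """Return True if total physical memory is at least required_gb.

    Hypothetical helper; relies on POSIX sysconf keys, so it works on
    Linux but not on all platforms.
    """
    page_size = os.sysconf("SC_PAGE_SIZE")   # bytes per memory page
    num_pages = os.sysconf("SC_PHYS_PAGES")  # total physical pages
    total_gb = page_size * num_pages / (1024 ** 3)
    return total_gb >= required_gb

# gpt-neox-20b (~42 GB) plus gpt-neo-125m (~0.5 GB)
if not has_enough_ram(42.5):
    print("Not enough RAM for this teacher/student pair")
```

Running a check like this before constructing the `Client` avoids a long model download followed by an out-of-memory failure.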

Visit our GitHub Repo

Interested in learning more? Come see the code!