NRP-Managed LLMs

Accessing NRP LiteLLM

We use the LiteLLM proxy to provide access to the LLMs we're running on Nautilus.

Start by logging in to the NRP LiteLLM UI. If you're coming from an .edu domain, you'll be placed into the NRP team, unless you're on a special SDSC list, in which case you'll be placed in the SDSC team. Otherwise, request a team assignment in Matrix.

Once you're a member of a team, you can create tokens and access the models.

To create a token, open the Virtual Keys tab and create a new token for accessing the API. There are also examples of how to use the API.

The base URL for all models is https://llm.nrp-nautilus.io.
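LiteLLM exposes the standard OpenAI-compatible HTTP endpoints under that base URL, so any OpenAI-style client works. Below is a minimal stdlib-only sketch of a chat completion request; the API key is a placeholder (create your own under the Virtual Keys tab), and the `/v1/chat/completions` path is the usual OpenAI-compatible route that LiteLLM serves.

```python
import json
import urllib.request

BASE_URL = "https://llm.nrp-nautilus.io"
API_KEY = "sk-..."  # placeholder -- use your own virtual key


def chat(model: str, prompt: str) -> str:
    """Send a single-turn chat completion request and return the reply text."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# With a valid key, for example: chat("llama3", "Hello!")
```

The model names passed to `chat` are the LiteLLM names from the table below (e.g. `llama3`), not the full upstream model identifiers.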

| LiteLLM name | Model | Features |
| --- | --- | --- |
| embed-mistral | intfloat/e5-mistral-7b-instruct | embeddings |
| gorilla | gorilla-llm/gorilla-openfunctions-v2 | function calling |
| llama3 | meta-llama/Llama-3.2-90B-Vision-Instruct | multimodal general purpose |
| llama3-sdsc | meta-llama/Llama-3.3-70B-Instruct | general purpose |
| llava-onevision | llava-hf/llava-onevision-qwen2-7b-ov-hf | vision |
| olmo | allenai/OLMo-2-1124-13B-Instruct | open source |
| phi3 | microsoft/Phi-3.5-vision-instruct | vision |
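The `embed-mistral` entry serves embeddings rather than chat. A minimal sketch of an embeddings request follows, again assuming the OpenAI-compatible `/v1/embeddings` route and a placeholder key:

```python
import json
import urllib.request

BASE_URL = "https://llm.nrp-nautilus.io"
API_KEY = "sk-..."  # placeholder -- use your own virtual key


def embed(texts: list) -> list:
    """Return one embedding vector (a list of floats) per input string."""
    payload = {"model": "embed-mistral", "input": texts}
    req = urllib.request.Request(
        f"{BASE_URL}/v1/embeddings",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return [item["embedding"] for item in json.load(resp)["data"]]


# With a valid key, for example: embed(["first sentence", "second sentence"])
```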

The following apps provide access to the models without requiring additional tokens.

NRP OpenWebUI

https://nrp-openwebui.nrp-nautilus.io - OpenWebUI chat

NRP LibreChat

https://librechat.nrp-nautilus.io - LibreChat

NRP AnythingLLM

https://nrp-llm-chat.nrp-nautilus.io - AnythingLLM (chat with documents)

Chat UIs using LiteLLM tokens

ChatBox

You can also create your own token and use it with the Chatbox AI app.