Quickstart

Access to Portkey is only permitted via NYU VPN

The machine sending requests to the LLM gateway must be connected to the NYU VPN. If it is not connected, your requests will time out and fail with connection errors.

If you are sending requests from a remote server, such as a Colab notebook, they will fail even if your laptop is on the VPN, because the machine sending the requests is the Colab server, not your laptop!
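If you are unsure whether the machine running your code can reach the gateway, a quick reachability check like the sketch below can help. It uses the requests library as an assumption (any HTTP client works); any HTTP response, even an error status, means the host is reachable, while a timeout or connection error usually means that machine is not on the VPN.

import requests

# Minimal reachability check (sketch). Any HTTP response, even 401 or 404,
# means the gateway host is reachable; a timeout or connection error usually
# means this machine is not on the NYU VPN.
try:
    response = requests.get("https://ai-gateway.apps.cloud.rt.nyu.edu/v1/", timeout=10)
    print(f"Gateway reachable (HTTP {response.status_code})")
except requests.exceptions.RequestException as exc:
    print(f"Could not reach the gateway: {exc}")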

Getting started with the LLM gateway

Gateway URL

Whenever you instantiate a Portkey client, the base_url must be set to base_url="https://ai-gateway.apps.cloud.rt.nyu.edu/v1/". If you omit this parameter, you will connect to the vendor's SaaS platform instead, and NYU-provisioned virtual keys will not work.

With your virtual key and your Portkey API key in hand, you should be able to run the following snippet:

from portkey_ai import Portkey

# Instantiate the client, pointing it at the NYU gateway rather than Portkey's SaaS platform
portkey = Portkey(
    base_url="https://ai-gateway.apps.cloud.rt.nyu.edu/v1/",
    api_key="",  # Replace with your Portkey API key
    virtual_key="",  # Replace with your virtual key
)

# Send a chat completion request through the gateway
completion = portkey.chat.completions.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Say this is a test"},
    ],
    model="",  # Replace with the LLM model you'd like to use
)

print(completion)
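print(completion) dumps the entire response object. If you only want the assistant's reply, you can typically read it from the first choice; this is a minimal sketch assuming the response follows the OpenAI-style chat-completions shape:

# Print only the assistant's reply (assumes an OpenAI-style response shape)
print(completion.choices[0].message.content)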

Once the script has run, you can head back to app.portkey.ai to view the logs for the call!