FAQ

Are my conversations or usage data used for AI training or similar purposes?

No, whether you use internal or external models, your conversations and data are not used to train any AI models.

When using internal models, are my messages and conversations stored on your servers at any stage?

No, user messages and AI responses are not stored at any stage on our servers. Once your message is sent and you receive the response, the conversation is only available in your browser.

What data does Chat AI keep when I access the service?

We do not keep any conversations or messages on our servers. We only record some usage statistics in order to monitor the load on our service and improve the user experience. This includes usernames, timestamps, and the models/services that were requested. Everything else the chatbot remembers (such as the history of your last conversation) is only stored locally in your browser.

Are the OpenAI GPT... (external) models the real ChatGPT/GPT-4 by OpenAI?

Yes, they are. We have signed a contract with Microsoft that gives us access to the models running in their Azure cloud. Since the service costs money, it is only available to users from Lower Saxony or the Max Planck Society. Thank you for your understanding.

When using external models, are my messages and conversations stored on Microsoft’s servers at any stage?

While we do not keep any conversations or messages on our servers, Microsoft retains the right to store messages/conversations for up to 30 days in order to prevent abuse. Since the request is sent directly from GWDG’s servers, no user information is included in the requests to Microsoft. For more information, see: https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy

Why does GPT-4 refer to itself as GPT-3 when I ask what model it is?

This is a known issue when using GPT-4 via its API; see https://community.openai.com/t/gpt-4-through-api-says-its-gpt-3/286881. Nevertheless, the model is in fact GPT-4, even if it states otherwise.

Do you offer an API? How can I request an API key?

Yes, we do. You can submit an application to use our API service from the KISSKI service webpage: https://kisski.gwdg.de/en/leistungen/2-02-llm-service/. Just click on “Book” and select “API access to our chat service” as the service type.

How can I use the API?

The API service is compatible with the OpenAI API standard (https://platform.openai.com/docs/api-reference/chat). We provide the following endpoints (a minimal usage sketch follows the list):

  • /chat/completions
  • /completions
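
Because the API follows the OpenAI standard, the official OpenAI Python client can be pointed at our service. The sketch below is illustrative only: the base URL, API key, and model name are placeholders, so replace them with the values you receive once your API application is approved.

    # Minimal sketch: calling the OpenAI-compatible /chat/completions endpoint
    # with the official "openai" Python client (version 1.x).
    # Placeholders: base_url, api_key, and model must be replaced with the
    # values provided for your API access.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://<chat-ai-api-endpoint>/v1",  # placeholder endpoint
        api_key="<your-api-key>",                      # placeholder key
    )

    response = client.chat.completions.create(
        model="<model-name>",  # placeholder: any model offered by the service
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"},
        ],
    )

    print(response.choices[0].message.content)

Any other tool that speaks the OpenAI API (other client libraries, or plain HTTP requests) should work the same way once it is pointed at our endpoint.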

My institution is interested in using Chat AI. Can we advertise it to our users? Would you be able to handle the additional load of XXX users?

For large institutions, please contact us directly at info@kisski.de so that we can discuss the expected load.

Can I use my own system prompts with the OpenAI models?

No, sorry. The system prompt used by the OpenAI models can’t be changed by end users. Please use our internal models if you need to set custom prompts.
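
When using the API with one of our internal models, a custom system prompt is simply the first message of the conversation. The sketch below is an assumption-laden example: it reuses the placeholder base URL and key from the example above, and the internal model name is also a placeholder.

    # Sketch: setting a custom system prompt for an internal model via the
    # OpenAI-compatible API. All angle-bracket values are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://<chat-ai-api-endpoint>/v1",  # placeholder endpoint
        api_key="<your-api-key>",                      # placeholder key
    )

    response = client.chat.completions.create(
        model="<internal-model-name>",  # placeholder: one of the internal (non-OpenAI) models
        messages=[
            # The system message below is the custom prompt.
            {"role": "system", "content": "Answer in one short sentence."},
            {"role": "user", "content": "What does GWDG stand for?"},
        ],
    )

    print(response.choices[0].message.content)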