OpenRouter vs OpenAI issue

This topic is: resolved

 

Thank you for contacting me. Please note that I live in the GMT+3 time zone, so responses might be delayed.

This topic has 1 reply, 2 voices, and was last updated 9 months, 4 weeks ago by Szabi – CodeRevolution.

  • Author
    Posts
    • #9778


      pagenet
      Participant
      Post count: 4

      I tried setting OpenRouter models for all settings in a single AI post. There were NO settings for OpenAI, but I still got OpenAI error messages in the log stating I’m over my free OpenAI limit.

      1. If all settings are for OpenRouter and none for OpenAI, why is the OpenAI API still being called?

      2. How can I use OpenRouter instead of OpenAI as my primary (or only) API?

    • #9781


      Szabi – CodeRevolution
      Keymaster
      Post count: 4577

      Hello,

      First of all, thank you for your purchase.

      Currently, OpenAI is the main API source of the plugin; OpenRouter is available as an additional API that can be used in the plugin.

      To stop using OpenAI entirely, you will need to switch all AI models from OpenAI’s ‘gpt-3.5-turbo’ (which is the default model for all model selection settings fields in the plugin) to a model from OpenRouter.

      To do this, please change every model selector settings field in the rule settings and also in the ‘Main Settings’ menu. Go through all tabs and, wherever you see the gpt-3.5-turbo model listed in a settings field, change it to an OpenRouter model.

      After saving the settings, OpenAI should no longer be used by the plugin.
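
      For background, OpenRouter exposes an OpenAI-compatible API, so at the request level the switch mostly amounts to a different endpoint, API key and model name. The following is only a minimal illustration (not the plugin’s actual code), assuming the OpenAI Python SDK and an example OpenRouter model id:

      # Minimal sketch: the same OpenAI-style client call, pointed at OpenRouter.
      # "mistralai/mistral-7b-instruct" is just an example OpenRouter model id.
      from openai import OpenAI

      client = OpenAI(
          base_url="https://openrouter.ai/api/v1",  # OpenRouter endpoint instead of api.openai.com
          api_key="YOUR_OPENROUTER_API_KEY",        # placeholder key
      )

      response = client.chat.completions.create(
          model="mistralai/mistral-7b-instruct",    # OpenRouter model instead of gpt-3.5-turbo
          messages=[{"role": "user", "content": "Write a short product description."}],
      )
      print(response.choices[0].message.content)

      The plugin manages these calls internally; on your side, changing the model selector fields as described above should be enough.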

      Regards, Szabi – CodeRevolution.


The topic ‘OpenRouter vs OpenAI issue’ is closed to new replies.