Request Parameters

This page provides a concise, structured overview of the request parameters accepted by Konko's API and how to use each one effectively.


Model Parameter

Selecting the right model is vital for content quality and relevance. Each model has unique capabilities. For details, see: List of Available Models.
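
A minimal sketch of passing the model identifier in a request, mirroring the SDK call used later on this page (the model name is just an example; substitute any entry from the list):

import konko

# Pick a model by its string identifier; any entry from the
# List of Available Models can be substituted here.
response = konko.ChatCompletion.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Hello!"}]
)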


Messages Parameter

The messages parameter structures conversation-based interactions. It's an array of message objects, each with a role (system, user, assistant) and content. System messages set the conversation tone, user messages are queries or commands, and assistant messages are responses or examples.
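
A minimal sketch of a messages array covering all three roles (the content strings are purely illustrative):

# Hypothetical conversation history illustrating the three roles.
messages = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "What does the top_p parameter do?"},
    {"role": "assistant", "content": "It limits sampling to the most probable tokens whose cumulative probability reaches top_p."},
    {"role": "user", "content": "And how does temperature interact with it?"}
]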


Advanced Parameters

  • Temperature: Controls sampling randomness (0 to 1). Higher values produce more varied output, lower values produce more focused, deterministic output.
  • top_p: Nucleus sampling. The model samples only from the smallest set of tokens whose cumulative probability reaches top_p; higher values allow more diversity, lower values restrict output to the most likely tokens.
  • Stop Sequences: One or more sequences at which the model stops generating its completion.
  • Presence/Frequency Penalties: presence_penalty discourages tokens that have already appeared, encouraging new topics; frequency_penalty penalizes tokens in proportion to how often they repeat.
  • Logit Bias: (For OpenAI models) Raises or lowers the probability of specific tokens.
  • n: Number of alternative completions to generate per request.
  • stream: Sends partial message deltas incrementally as they are generated, suited to real-time interfaces.
  • max_tokens: Upper limit on the number of tokens in the completion.
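
The sketch below overrides several of these parameters in one request, using the same SDK call as the example that follows; the specific values are illustrative, not recommendations:

import konko

# Illustrative overrides of the advanced parameters described above.
response = konko.ChatCompletion.create(
    model="mistralai/Mistral-7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Suggest three taglines for a coffee shop."}],
    temperature=0.7,        # slightly more focused than the default
    top_p=0.9,              # nucleus sampling cutoff
    max_tokens=200,         # cap the completion length
    stop=["\n\n"],          # end the completion at the first blank line
    presence_penalty=0.5,   # encourage new topics
    frequency_penalty=0.5,  # discourage repetition
    n=3                     # return three alternative completions
)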

Example SDK Usage

import konko

def get_konko_response(query_content: str):
    """
    Get a response from Konko's API based on a user query.

    Parameters:
    - query_content: Content of the user's query.

    Returns:
    - Response from Konko's API.
    """
    
    # Set the default values for the parameters
    default_parameters = {
        "model": "mistralai/Mistral-7B-Instruct-v0.1",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": query_content}
        ],
        "stream": False,
        "max_tokens": 400,
        "temperature": 1.0,
        "top_p": 1.0,
        "stop": None,  # You can add sequences like ["\n", "<end>"]
        "presence_penalty": 0,
        "frequency_penalty": 0,
        "logit_bias": None,  # This should be a dictionary mapping token IDs to bias values
        "n": 1
    }
    
    # Make the API call
    response = konko.ChatCompletion.create(**default_parameters)
    
    return response

# Example of use:
response = get_konko_response("Summarize the Foundation by Isaac Asimov")
print(response)

Output

ChatCompletion(
  id='014c221a-b3fa-478f-aebc-c07af842ae43',
  choices=[
    Choice(
      finish_reason=None,
      index=0,
      logprobs=None,
      message=ChatCompletionMessage(
        content=" The Foundation is a science fiction novel by Isaac Asimov that follows the story of a mathematician named Hari Seldon who predicts the fall of the Galactic Empire and the rise of a new empire called the Second Foundation. The novel explores the concept of psychohistory, a mathematical science that can predict the behavior of large groups of people, and the idea that individual actions can have unintended consequences. The story follows a group of characters who are tasked with saving the Second Foundation from destruction and uncovering the truth behind Hari Seldon's predictions.",
        role='assistant',
        function_call=None,
        tool_calls=None
      )
    )
  ],
  created=1704902334,
  model='mistralai/mistral-7b-instruct-v0.1',
  object='chat.completion',
  system_fingerprint=None,
  usage=None
)
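
Assuming the response has the shape shown above, the generated text lives on the first choice; a minimal sketch of pulling it out:

# Extract only the generated text from the first choice.
response = get_konko_response("Summarize the Foundation by Isaac Asimov")
print(response.choices[0].message.content)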


