Build an AI Model Comparison App with ToolJet & Portkey
This tutorial will guide you through the process of building an AI Model Comparison App using ToolJet and Portkey. You will learn how to integrate these platforms to allow users to compare responses from different AI models seamlessly. The application will enable users to input prompts and receive outputs from various models, showcasing the differences in their responses. The AI Model Comparison App will be able to:

  • Input System and User Prompts: Users can enter both system prompts and user prompts.
  • Model Selection: Users can select from multiple AI models (e.g., OpenAI and Groq) for comparison.
  • Response Display: The app will display the responses from selected models side-by-side.
  • Token Usage Tracking: It will show the number of tokens used for each query.
  • Execution Time Measurement: The application will measure and display the time taken for each model to respond.

Check out this tutorial to learn how to build an AI-Assisted ATS using Supabase and OpenAI.

Prerequisites

To follow along, you will need a ToolJet account and a Portkey account with API keys for the model providers you want to compare (such as OpenAI and Groq). If you prefer to watch a video tutorial, check out this link.

Application Overview

To begin, let’s take a look at the main interface we will build for our AI Model Comparison App:

  • The main interface is where users can input prompts and select models. Below that, it displays the responses from the selected models, along with token usage and execution time, letting us benchmark models by response time and token count.

Step 1: Adding Portkey as a Data Source

  • Log into your ToolJet account and go to Data Sources.
  • Find Portkey and click on + Add.
  • Enter the API Key and Default Virtual Key. You’ll get both of these from the Portkey Dashboard.
  • You can Test the connection and Save it.
  • Remember, you need to add a separate Data Source for each AI model provider you want to use (like OpenAI and Groq), as each has a different Default Virtual Key.
  • We have added openai and groq as Data Sources for this tutorial.

Step 2: Building the User Interface

Go to Applications and create a new application. You get a blank canvas when you initially launch a new application.

  • Add a Text Component and change its text to “AI Model Comparison App”. Make it Size 24, Weight bolder, and Color Blue. Feel free to style it to fit your needs.
  • Add a Container Component.
  • Drop a Text Component inside the Container with the text “System”. Make it bolder.
  • Place a Textarea Component in the remaining area.
  • Make a Copy of this Container and change the label of its Text Component from “System” to “User”.
  • Change the Placeholder for both Textarea Components to “Enter your prompt…”
  • Add two Button Components below the Containers. Label them “Add” and “Run”.
  • Add another Button Component and label it “Clear All”.

Let’s work on the comparison pane next.

  • Add a Dropdown Component and remove its label text.
  • In Properties, under Options, add the following Option labels – {{["OpenAI/gpt-3.5-turbo", "OpenAI/gpt-4o", "OpenAI/gpt-4o-mini", "OpenAI/gpt-4-turbo"]}}
  • Similarly, under Options, add the following Option values – {{["gpt-3.5-turbo", "gpt-4o", "gpt-4o-mini", "gpt-4-turbo"]}}
  • Change the Default value to {{"gpt-3.5-turbo"}}.
  • Add a Text Component that will show us the Model responses. Change its Data to Markdown and Line height to 2.
  • Add a Vertical Divider Component in the middle and increase its height to fit the canvas.
  • Make a copy of the Dropdown and Text Components and paste them on the other side of the Divider.
  • The new Dropdown is for Groq in our demo, so change the Option values to – {{["llama-3.1-8b-instant", "llama-3.1-70b-versatile", "mixtral-8x7b-32768", "llama3-8b-8192", "llama3-70b-8192", "gemma-7b-it"]}}
  • And the Option labels to – {{["Groq/llama-3.1-8b-instant", "Groq/llama-3.1-70b-versatile", "Groq/mixtral-8x7b-32768", "Groq/llama3-8b-8192", "Groq/llama3-70b-8192", "Groq/gemma-7b-it"]}}
  • Change the Default value to {{"llama-3.1-8b-instant"}}.
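Since each Option label is just the provider name prefixed to the model value, the two lists can also be generated from a single array of model names instead of being typed twice. A small sketch in plain JavaScript (the model list here is illustrative, trimmed from the Groq dropdown above):

```javascript
// Build matched Option labels and values from one list of model names,
// so the two dropdown properties can never drift out of sync.
const models = ["llama-3.1-8b-instant", "mixtral-8x7b-32768", "gemma-7b-it"];

const optionValues = models;                        // e.g. "gemma-7b-it"
const optionLabels = models.map(m => `Groq/${m}`);  // e.g. "Groq/gemma-7b-it"

console.log(optionLabels);
```

The same pattern works for the OpenAI dropdown with an `OpenAI/` prefix.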

We are done building the User interface. Feel free to make changes as per your needs.

Step 3: Adding Queries

Expand the Query Panel at the bottom of the screen. Click on the + Add button to create a new query. You can learn more about queries here.

Set up openai Query

  • Rename this query to openai.
  • Under portkey, choose openai as the Data Source.
  • For the Operation, choose Chat.
  • Add the Message in the required format – [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "How are you?"}]
  • We can get the Model from the Dropdown. We have OpenAI models in the first Dropdown so we will add {{components.dropdown1.value}}.
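The Message payload is a standard chat messages array of role/content objects. A minimal sketch of appending a user prompt to such an array without mutating it, using the same spread pattern the queries in this tutorial rely on (the buildMessages helper is hypothetical, not part of ToolJet or Portkey):

```javascript
// Hypothetical helper: return a new messages array with a user prompt
// appended, leaving the original array untouched.
const buildMessages = (messages, userPrompt) => [
    ...messages,
    { role: "user", content: userPrompt }
];

const base = [{ role: "system", content: "You are a helpful assistant." }];
const withPrompt = buildMessages(base, "How are you?");
console.log(withPrompt);
```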

Set up groq Query

  • Duplicate the openai query.
  • Rename this query to groq.
  • Under portkey, choose groq as the Data Source.
  • Add the Message in the required format – [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "How are you?"}]
  • We can get the Model from the Dropdown. We have Groq models in the second Dropdown so we will add {{components.dropdown2.value}}.

Set up onPageLoad Query

  • Choose Run JavaScript code.
  • Rename this query to onPageLoad.
  • Enter the following code in the editor:
actions.setVariable('messagesOpenAI', [{ "role": "system", "content": "You are a helpful assistant." }]);

actions.setVariable('messagesGroq', [{ "role": "system", "content": "You are a helpful assistant." }]);

Once the user enters a prompt in the Textarea Components, we want to save it in the variables. To do that we will add JS code in the following query.

Set up calculateTime Query

We need another JS Query to calculate the execution time of the models.

  • Choose Run JavaScript code.
  • Rename this query to calculateTime.
  • Enter the following code in the editor:
actions.setVariable('messagesOpenAI', [...variables.messagesOpenAI, { "role": "user", "content": components.textarea2.value }]);
actions.setVariable('messagesGroq', [...variables.messagesGroq, { "role": "user", "content": components.textarea2.value }]);

const startTime1 = performance.now();
await actions.runQuery('openai');
const endTime1 = performance.now();
const time1 = (endTime1 - startTime1) / 1000;

const startTime2 = performance.now();
await actions.runQuery('groq');
const endTime2 = performance.now();
const time2 = (endTime2 - startTime2) / 1000;

return { time1, time2 };
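The timing logic above is the standard performance.now() pattern. Here is a standalone version you can run outside ToolJet to see how it behaves (slowTask is an illustrative stand-in for actions.runQuery):

```javascript
// Time an async task in seconds with performance.now(),
// mirroring how calculateTime wraps each runQuery call.
const slowTask = () => new Promise(resolve => setTimeout(resolve, 50));

const timeIt = async (task) => {
    const start = performance.now();
    await task();
    const end = performance.now();
    return (end - start) / 1000; // seconds, like time1 and time2 above
};

const main = async () => {
    const seconds = await timeIt(slowTask);
    console.log(`took ${seconds.toFixed(2)}s`);
    return seconds;
};
main();
```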

We also would like to store the message received from OpenAI and Groq inside variables. Create another Run JavaScript code Query.

Set up addMessageToOpenai Query

  • Choose Run JavaScript code.
  • Rename this query to addMessageToOpenai.
  • Enter the following code in the editor:
let content = queries.openai.data.choices[0].message.content;
let tokens = queries.openai?.data?.usage?.total_tokens;

actions.setVariable('messagesOpenAI', [...variables.messagesOpenAI, { "role": "assistant", "content": content, "tokens": tokens }]);

Duplicate this query for saving messages from Groq.

Set up addMessageToGroq Query

  • Duplicate the addMessageToOpenai Query.
  • Rename this query to addMessageToGroq.
  • Enter the following code in the editor:
let content = queries.groq.data.choices[0].message.content;
let tokens = queries.groq?.data?.usage?.total_tokens;

actions.setVariable('messagesGroq', [...variables.messagesGroq, { "role": "assistant", "content": content, "tokens": tokens }]);

Step 4: Adding Response to Text Components

We need to display the responses received from each provider in the Text Components, formatted as Markdown. Create another Run JavaScript code query that converts the stored messages to Markdown.

Set up convertToMarkdown Query

  • Choose Run JavaScript code.
  • Rename this query to convertToMarkdown.
  • Enter the following code in the editor:
const formatConversation = (messages, model) => {
    let output = '';
    messages.forEach(message => {
        output += `**${message.role.charAt(0).toUpperCase() + message.role.slice(1)}**\n\n`;
        output += `${message.content}\n\n\n`;
        if (message.role === "assistant") {
            output += `**${message.tokens} tokens | ${model === "openAi" ?
                (queries?.calculateTime?.data?.time1?.toFixed(2) || "calculating") + "s" :
                (queries?.calculateTime?.data?.time2?.toFixed(2) || "calculating") + "s"}**\n\n`;
        }
    });
    return output.trim();
};

let openAiMessages = formatConversation(variables.messagesOpenAI, "openAi");
let groqMessages = formatConversation(variables.messagesGroq, "groq");

return { openAiMessages, groqMessages };
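A simplified, standalone version of this formatter can be run outside ToolJet to see the Markdown it produces. In this sketch the elapsed time is passed in as a plain argument instead of being read from ToolJet's queries object:

```javascript
// Simplified sketch of formatConversation: timing is passed in directly
// rather than read from `queries.calculateTime`.
const formatConversation = (messages, seconds) => {
    let output = '';
    messages.forEach(message => {
        // Capitalize the role for the Markdown heading, e.g. "**User**"
        output += `**${message.role.charAt(0).toUpperCase() + message.role.slice(1)}**\n\n`;
        output += `${message.content}\n\n\n`;
        if (message.role === "assistant") {
            output += `**${message.tokens} tokens | ${seconds.toFixed(2)}s**\n\n`;
        }
    });
    return output.trim();
};

const markdown = formatConversation(
    [
        { role: "user", content: "How are you?" },
        { role: "assistant", content: "I'm fine, thanks!", tokens: 42 }
    ],
    1.23
);
console.log(markdown);
```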

Add Message Response to Text Components

  • Click on the Text Component of the Groq model.
  • Click on Markdown under Data and enter {{queries.convertToMarkdown.data.groqMessages}}.
  • Click on the Text Component of the OpenAI model.
  • Click on Markdown under Data and enter {{queries.convertToMarkdown.data.openAiMessages}}.

Connecting everything

  • Click on the Run Button and add an On Click Event Handler with Action Run Query and choose calculateTime query.
  • Under Properties of the Run Button, set the Loading state to {{queries.calculateTime.isLoading}}. This will display a loader while the queries are running.
  • Go to openai query and edit the Message to – {{variables.messagesOpenAI}}.
  • Add another Event Handler with a Query Success Event with Action as Run Query and it runs addMessageToOpenai Query.
  • Go to groq query and edit the Message to: 
{{variables.messagesGroq.map(message => {
    const { tokens, ...rest } = message;
    return rest;
})}}
  • Add another Event Handler with a Query Success Event with Action as Run Query and it runs addMessageToGroq Query.
  • Go to addMessageToGroq query and add another Event Handler with a Query Success Event with Action as Run Query and it runs convertToMarkdown Query.
  • Go to addMessageToOpenai query and add another Event Handler with a Query Success Event with Action as Run Query and it runs convertToMarkdown Query.
  • Click on the Clear All Button and add two Event Handlers:
    • An On Click Event Handler with Action Run Query and choose onPageLoad Query.
    • An On Click Event Handler with Action Run Query and choose convertToMarkdown Query.
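The map used in the groq Message above strips out the extra tokens field we stored for display (the chat API only expects role and content) via rest destructuring. A standalone sketch of the same pattern:

```javascript
// Remove the bookkeeping `tokens` field before sending messages to the API;
// rest destructuring copies every other property into `rest`.
const messages = [
    { role: "system", content: "You are a helpful assistant." },
    { role: "assistant", content: "Hi!", tokens: 42 }
];

const cleaned = messages.map(message => {
    const { tokens, ...rest } = message;
    return rest;
});

console.log(cleaned);
```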

Set up addSystemPrompt Query

  • Choose Run JavaScript code.
  • Rename this query to addSystemPrompt.
  • Enter the following code in the editor:
actions.setVariable('messagesGroq', [...variables.messagesGroq, { "role": "system", "content": components.textarea1.value }]);

actions.setVariable('messagesOpenAI', [...variables.messagesOpenAI, { "role": "system", "content": components.textarea1.value }]);
  • Click on the Add Button and add an On Click Event Handler with Action Run Query and choose addSystemPrompt query.

Final Application Overview

In this tutorial, we have successfully built an AI Model Comparison App using ToolJet and Portkey. This application allows users to compare outputs from different AI models side-by-side, providing a seamless experience for evaluating their responses.

Conclusion

Congratulations on following this tutorial to build your own AI Model Comparison App! You have learned how to integrate ToolJet with Portkey, set up data sources, create a user-friendly interface, and implement the necessary logic to compare AI models.

If you have any doubts or questions as you continue your journey with ToolJet and Portkey, feel free to reach out to us on Slack.

The post Build an AI Model Comparison App with ToolJet & Portkey appeared first on ToolJet.

