IBM watsonx.ai Flows Engine

Getting Started

Before you can start building AI applications using watsonx.ai Flows Engine:

  • 1. Sign up for a free account

  • 2. Download & install the CLI for Python or Node.js

  • 3. Authenticate your account

You can then start building your own tool by following the instructions below.


  • 1. Initialize a new project in a new directory on your machine:

    wxflows init --endpoint=api/getting-started

  • 2. Build or import a tool:

    By importing your own data source (REST, GraphQL, databases):

    wxflows import curl https://jsonplaceholder.typicode.com/todos/ --name todos --query-name todos --query-type Todos

    See here for instructions on importing other data sources as tools.

    Or, by importing a prebuilt tool:

    wxflows import tool https://raw.githubusercontent.com/IBM/wxflows/refs/heads/main/tools/wikipedia.zip

  • 3. Register your tool(s):

    When you generate a GraphQL schema for your data source, a placeholder tool definition is added to a file in your project. Open that file and uncomment the generated tool definition, or create your own tool definition from scratch. For example:

    extend type Query {
      todos: TC_GraphQL
        @supplies(query: "tc_tools")
        @materializer(
          query: "tc_graphql_tool"
          arguments: [
            { name: "name", const: "todos" }
            { name: "description", const: "Retrieve a list of todos" }
            { name: "fields", const: "todos" }
          ]
        )
    }

    Importing a tool from the directory of prebuilt tools will automatically generate the tool definition based on the contents of the tool. You can edit this definition in the same file in your project.

  • 4. Deploy a watsonx.ai Flows Engine endpoint with your tool definitions:

    wxflows deploy

    The deployed endpoint will be shown in your terminal and looks something like https://YOUR_USERNAME.REGION.ibm.stepzen.net/api/getting-started/.
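    Once deployed, the endpoint speaks GraphQL over HTTP, so you can call it without the SDK. The sketch below is a minimal illustration, not the documented client: the `Authorization: apikey …` header format is an assumption based on StepZen-style endpoints, and the `todos` field names are assumptions based on the example data source, so verify both against your own deployment output.

```javascript
// Sketch: calling the deployed endpoint as plain GraphQL over HTTP.
// Assumptions (verify against your deployment): the endpoint accepts a
// POST with a JSON body of the form { query }, and authenticates with
// a StepZen-style "Authorization: apikey ..." header.

// Build a request for a simple query against the imported `todos` tool.
function buildRequest(endpoint, apikey, query) {
  return {
    url: endpoint,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `apikey ${apikey}`,
      },
      body: JSON.stringify({ query }),
    },
  };
}

const { url, options } = buildRequest(
  "https://YOUR_USERNAME.REGION.ibm.stepzen.net/api/getting-started/",
  "YOUR_WXFLOWS_APIKEY",
  "query { todos { title completed } }" // field names follow the imported schema
);

// Uncomment to execute with the built-in fetch in Node.js 18+:
// fetch(url, options)
//   .then((res) => res.json())
//   .then((result) => console.log(result.data));
```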

  • 5. Integrate into libraries such as LangChain:

    import { WatsonxAI } from "@langchain/community/llms/watsonx_ai";
    import {
      HumanMessage,
      SystemMessage,
    } from "@langchain/core/messages";
    import wxflows from "@wxflows/sdk/langchain";
    
    const toolClient = new wxflows({
      endpoint: "YOUR_WXFLOWS_ENDPOINT",
      apikey: "YOUR_WXFLOWS_APIKEY",
    });
    
    const tools = await toolClient.lcTools;
    
    const model = new WatsonxAI({
      ibmCloudApiKey: "YOUR_IBM_CLOUD_APIKEY",
      projectId: "YOUR_WATSONXAI_PROJECTID",
      modelId: "meta-llama/llama-3-2-3b-instruct",
    }).bindTools(tools);
    
    // Create message thread
    const messages = [
      new SystemMessage(
        "You are a helpful assistant that will only use the tools available and doesn't answer the question based on pre-trained data. Only perform a single tool call to retrieve all the information you need."
      ),
      new HumanMessage(
        "Search information about the book Escape from James Patterson"
      ),
    ];
    
    const aiMessage = await model.invoke(messages);
    console.log({ aiMessage });
    // should print a suggested tool call you can execute using the wxflows SDK
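    The suggested tool call surfaces on the returned message's `tool_calls` field, following the standard LangChain AIMessage shape. The snippet below is a sketch with a mocked response, since the actual tool name and arguments depend on the tools you deployed:

```javascript
// Sketch: reading the tool call the model suggested. A real `aiMessage`
// comes from model.invoke(messages); the object below only mocks the
// LangChain AIMessage `tool_calls` shape for illustration.
const aiMessage = {
  tool_calls: [
    { name: "todos", args: {}, id: "call_0" }, // mocked suggestion
  ],
};

const [toolCall] = aiMessage.tool_calls ?? [];
if (toolCall) {
  // You would pass this name and args on to the wxflows SDK to execute the tool.
  console.log(`suggested tool: ${toolCall.name}`, toolCall.args);
}
```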
    

    See here for instructions on integrating with other AI libraries and agentic frameworks.

    Or, use the application below in StackBlitz. You only have to add your credentials to the project's configuration file:

    StackBlitz

Continue building

You can now continue building your application by exploring the rest of our documentation or by following one of our tutorials.

More tutorials are coming soon. Have a request for a specific topic? Let us know on Discord.