Why Vercel AI SDK + Alloy?

The Vercel AI SDK is designed for building streaming AI interfaces in Next.js and React. Combined with Alloy’s MCP Server, you can:
  • Create AI chatbots that can book meetings, send emails, and update CRMs
  • Build dashboards that pull real-time data from multiple platforms
  • Stream responses while executing background tasks across services
  • Deploy to Vercel with edge functions for global performance

Installation

npm install ai @ai-sdk/openai zod

How It Works

1. Set up the MCP client to communicate with your Alloy server
2. Define tools that wrap MCP functionality for the AI
3. Stream responses back to your React components
4. Execute actions seamlessly across connected platforms
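
Step 1 is mostly configuration: each Alloy MCP server is a plain HTTPS endpoint that speaks JSON-RPC, with the server ID and access token embedded in the URL (the same format the client class below uses). Before wiring anything into the SDK, you can ask the server which tools it exposes as a quick sanity check. A minimal sketch, assuming the MCP_SERVER_ID and MCP_ACCESS_TOKEN environment variables used later in this guide:

// Quick sanity check: ask the MCP server which tools it exposes (JSON-RPC "tools/list")
const url = `https://mcp.runalloy.com/mcp/${process.env.MCP_SERVER_ID}/${process.env.MCP_ACCESS_TOKEN}`;

const res = await fetch(url, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' },
  body: JSON.stringify({ jsonrpc: '2.0', method: 'tools/list', id: 1 })
});

console.log(await res.text()); // event-stream payload listing the available tools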

Basic Example

Here’s a Next.js route handler that creates an AI assistant with platform integration capabilities:
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText, tool } from 'ai';
import { z } from 'zod';

class AlloyMCPClient {
  private baseUrl: string;

  constructor(serverId: string, accessToken: string) {
    this.baseUrl = `https://mcp.runalloy.com/mcp/${serverId}/${accessToken}`;
  }

  async callTool(name: string, args: Record<string, any> = {}): Promise<any> {
    const response = await fetch(this.baseUrl, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', 'Accept': 'application/json, text/event-stream' },
      body: JSON.stringify({
        jsonrpc: '2.0',
        method: 'tools/call',
        params: { name, arguments: args },
        id: Date.now()
      })
    });

    const text = await response.text();
    // The server replies as a server-sent event stream; the JSON-RPC result is on a "data:" line
    for (const line of text.split('\n')) {
      if (line.startsWith('data: ')) {
        return JSON.parse(line.slice(6)).result;
      }
    }
    throw new Error(`Unexpected MCP response: ${text}`);
  }
}

// Initialize once
const mcp = new AlloyMCPClient(
  process.env.MCP_SERVER_ID!,
  process.env.MCP_ACCESS_TOKEN!
);

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4'),
    messages,
    tools: {
      listConnectors: tool({
        description: 'Get available platform integrations',
        parameters: z.object({
          category: z.string().optional().describe('Filter by category')
        }),
        execute: async ({ category }) => {
          const result = await mcp.callTool('list_connectors_alloy', { category });
          return `Available connectors: ${JSON.stringify(result)}`;
        }
      }),
      executeAction: tool({
        description: 'Execute an action on a platform',
        parameters: z.object({
          connectorId: z.string().describe('Platform identifier'),
          actionId: z.string().describe('Action to execute'),
          parameters: z.record(z.any()).describe('Action parameters')
        }),
        execute: async ({ connectorId, actionId, parameters }) => {
          const result = await mcp.callTool('execute_action_alloy', {
            connectorId,
            actionId,
            parameters
          });
          return `Action completed: ${JSON.stringify(result)}`;
        }
      })
    },
    system: `You are a helpful assistant that can interact with various platforms.
             You can list available integrations and execute actions on them.
             Always explain what you're doing before taking actions.`
  });

  return result.toDataStreamResponse();
}
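
The route handler reads the MCP server ID and access token from environment variables. In a Next.js project these typically live in .env.local (the values below are placeholders):

# .env.local
MCP_SERVER_ID=your-server-id
MCP_ACCESS_TOKEN=your-access-token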

React Component

// app/chat.tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <div className="flex flex-col h-screen">
      <div className="flex-1 overflow-y-auto p-4">
        {messages.map(m => (
          <div key={m.id} className="mb-4">
            <span className="font-bold">{m.role}: </span>
            <span>{m.content}</span>
          </div>
        ))}
      </div>

      <form onSubmit={handleSubmit} className="p-4 border-t">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask me to do something..."
          className="w-full p-2 border rounded"
        />
      </form>
    </div>
  );
}
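
Depending on your AI SDK version, the messages returned by useChat also carry the tool calls the assistant makes, which is handy for showing progress while an action executes. A rough sketch you could drop inside the messages.map callback above (the toolInvocations shape assumes the v4 ai/react types):

{m.toolInvocations?.map(t => (
  <div key={t.toolCallId} className="text-sm text-gray-500">
    {t.toolName}({JSON.stringify(t.args)})
    {t.state === 'result' ? ` → ${JSON.stringify(t.result)}` : ' (running…)'}
  </div>
))}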

Real-World Use Cases

  • Sales Assistant: “Find all Shopify orders from today and create tasks in Linear for fulfillment”
  • HR Automation: “Check Slack for new team members and add them to our Notion directory”
  • Customer Success Bot: “Pull usage data from our database and send weekly reports via SendGrid”
  • Content Manager: “Post this update to Twitter, LinkedIn, and our Discord community”

Production Tips

  • Store your MCP credentials in environment variables
  • Use Vercel KV for caching connector metadata (see the sketch after this list)
  • Implement rate limiting for API calls
  • Add error boundaries for graceful failures
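
Connector metadata changes rarely, so it is a good candidate for caching. A minimal sketch of the KV tip using @vercel/kv (the cache key and one-hour TTL are arbitrary choices, and mcp is the client instance from the route handler above):

import { kv } from '@vercel/kv';

// Cache the list_connectors_alloy result for an hour to avoid repeated MCP calls
async function getConnectors(category?: string) {
  const cacheKey = `connectors:${category ?? 'all'}`;
  const cached = await kv.get(cacheKey);
  if (cached) return cached;

  const connectors = await mcp.callTool('list_connectors_alloy', { category });
  await kv.set(cacheKey, connectors, { ex: 3600 }); // expire after one hour
  return connectors;
}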

Next Steps
