April 5, 2026
10 min read
Nikita Jerschow

LangChain iMessage Tool — Add Blue Bubble Messaging to AI Agents (2026)

LangChain agents can call any HTTP API as a tool. Wrap Sendblue's iMessage API in a custom LangChain tool and your agent can send blue bubble messages, receive replies, and carry on real conversations — all without leaving the LangChain framework.

Why connect LangChain to iMessage?

Most AI agent demos send responses to a chat widget or terminal. Real users live in their messaging apps. iMessage is the default communication channel for the 1.5 billion iPhone users worldwide — and blue bubbles get 30–45% response rates compared to 6% for email.

By giving your LangChain agent an iMessage tool, it can:

  • Send personalized outreach at scale without spam filters blocking it
  • Receive replies and continue multi-turn conversations
  • Escalate to a human when the conversation requires it
  • Send media, documents, and contact cards inline

The integration is a single Python class — no new infrastructure, no Apple credentials to manage.

Install dependencies

```bash
pip install langchain langchain-openai requests python-dotenv fastapi uvicorn
```

Create a .env file:

```
SENDBLUE_API_KEY_ID=your_key_id_here
SENDBLUE_API_SECRET_KEY=your_secret_key_here
OPENAI_API_KEY=your_openai_key_here
```

Get your Sendblue keys at dashboard.sendblue.com/company-signup — free sandbox, no credit card.
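python-dotenv does the loading for you, but the `.env` format is simple enough to sketch: one `KEY=VALUE` pair per line, loaded into the process environment. A minimal stdlib-only parser (hypothetical helper `load_env_text`, ignoring quoting and variable expansion that python-dotenv also supports) looks like:

```python
import os


def load_env_text(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # not a KEY=VALUE pair
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env


env = load_env_text("SENDBLUE_API_KEY_ID=abc123\n# comment\nOPENAI_API_KEY=sk-test")
os.environ.update(env)  # make the values visible to os.environ lookups
print(env["SENDBLUE_API_KEY_ID"])  # abc123
```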

Create the SendblueTool class

LangChain tools inherit from BaseTool and implement _run. Because the tool declares an args_schema, the agent passes validated keyword arguments matching that schema; _run assembles the request payload and calls the Sendblue API:

```python
import json
import os
from typing import Optional, Type

import requests
from langchain.tools import BaseTool
from pydantic import BaseModel, Field


class SendMessageInput(BaseModel):
    """Input schema for the SendblueTool."""

    number: str = Field(
        description="Recipient phone number in E.164 format, e.g. +14155551234"
    )
    content: str = Field(description="Text content of the iMessage to send")
    send_style: Optional[str] = Field(
        default=None,
        description="Optional iMessage effect: invisible, celebration, shooting_star, etc.",
    )
    media_url: Optional[str] = Field(
        default=None,
        description="Optional public HTTPS URL of an image, video, or .vcf contact card",
    )


class SendblueTool(BaseTool):
    """LangChain tool for sending iMessages via Sendblue."""

    name: str = "send_imessage"
    description: str = (
        "Send an iMessage (blue bubble) to a phone number. "
        "Use this when you need to contact a person via their iPhone. "
        "Provide the phone number and the message content. "
        "Returns the delivery status and message handle."
    )
    args_schema: Type[BaseModel] = SendMessageInput

    def _run(
        self,
        number: str,
        content: str,
        send_style: Optional[str] = None,
        media_url: Optional[str] = None,
        **kwargs,
    ) -> str:
        payload = {"number": number, "content": content}
        if send_style:
            payload["send_style"] = send_style
        if media_url:
            payload["media_url"] = media_url

        response = requests.post(
            "https://api.sendblue.co/api/send-message",
            json=payload,
            headers={
                "sb-api-key-id": os.environ["SENDBLUE_API_KEY_ID"],
                "sb-api-secret-key": os.environ["SENDBLUE_API_SECRET_KEY"],
            },
            timeout=30,
        )
        data = response.json()
        return json.dumps({
            "status": data.get("status"),
            "messageHandle": data.get("messageHandle"),
            "error": data.get("error"),
        })

    async def _arun(self, *args, **kwargs) -> str:
        raise NotImplementedError("Use async httpx version for async agents")
```
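The optional-field handling inside _run is the part worth unit-testing without hitting the network. A pure helper that mirrors that logic (hypothetical name `build_payload`, not part of the tool above) makes the behavior easy to check: unset optional fields are omitted from the request body rather than sent as null:

```python
from typing import Optional


def build_payload(
    number: str,
    content: str,
    send_style: Optional[str] = None,
    media_url: Optional[str] = None,
) -> dict:
    """Mirror of SendblueTool._run's request body: omit unset optional fields."""
    payload = {"number": number, "content": content}
    if send_style:
        payload["send_style"] = send_style
    if media_url:
        payload["media_url"] = media_url
    return payload


print(build_payload("+14155551234", "hi"))
# {'number': '+14155551234', 'content': 'hi'}
```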

Wire the tool into an agent

Pass the tool to a LangChain agent with GPT-4o or Claude:

```python
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

load_dotenv()

# Initialize the tool
sendblue_tool = SendblueTool()

# System prompt that instructs the agent when to use the tool
prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You are a helpful sales assistant. When asked to reach out to a prospect, "
     "use the send_imessage tool to send a personalized iMessage. "
     "Keep messages concise, friendly, and under 160 characters for best engagement. "
     "Always confirm the phone number before sending."),
    MessagesPlaceholder(variable_name="chat_history", optional=True),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

llm = ChatOpenAI(model="gpt-4o", temperature=0.3)
agent = create_openai_tools_agent(llm, [sendblue_tool], prompt)
agent_executor = AgentExecutor(agent=agent, tools=[sendblue_tool], verbose=True)

# Run the agent
result = agent_executor.invoke({
    "input": (
        "Send an iMessage to +14155551234 introducing our product. "
        "Mention we have a free trial and keep it casual."
    )
})
print(result["output"])
```
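The system prompt asks the model to stay under 160 characters, but prompts are suggestions, not guarantees. A defensive truncation helper applied before sending (hypothetical `clamp_message`, not part of LangChain or Sendblue) is cheap insurance against the model overshooting:

```python
def clamp_message(text: str, limit: int = 160) -> str:
    """Trim an outbound message to `limit` chars, cutting at a word boundary."""
    if len(text) <= limit:
        return text
    cut = text[: limit - 1].rsplit(" ", 1)[0]  # leave room for the ellipsis
    return cut.rstrip() + "…"


print(clamp_message("short message"))  # short message
print(len(clamp_message("word " * 50)) <= 160)  # True
```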

Handle replies via FastAPI webhook

Configure your webhook URL in the Sendblue dashboard. When a recipient replies, Sendblue POSTs to your server. Use FastAPI to receive the webhook and feed the reply back into the agent:

```python
from typing import Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# In production, use a database or Redis for conversation history
conversation_store: dict[str, list] = {}


class SendblueWebhook(BaseModel):
    accountEmail: Optional[str] = None
    content: Optional[str] = None
    isOutbound: bool
    status: str
    messageHandle: str
    fromNumber: str
    number: str
    mediaUrl: Optional[str] = None


@app.post("/webhook/sendblue")
async def handle_reply(payload: SendblueWebhook):
    # Only process inbound messages (replies from contacts)
    if payload.isOutbound or not payload.content:
        return {"status": "ok"}

    phone = payload.fromNumber
    reply_text = payload.content

    # Retrieve existing conversation history for this contact
    history = conversation_store.get(phone, [])
    history.append({"role": "human", "content": f"Reply from {phone}: {reply_text}"})

    # Run the agent with the reply as input
    result = agent_executor.invoke({
        "input": f"Contact {phone} replied: '{reply_text}'. Respond appropriately.",
        "chat_history": history,
    })

    # Store the updated history
    history.append({"role": "assistant", "content": result["output"]})
    conversation_store[phone] = history

    return {"status": "ok"}

# Run with: uvicorn main:app --reload --port 8000
# Expose locally: ngrok http 8000
```
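Webhook providers commonly retry deliveries on timeouts, so the same payload can arrive more than once (whether Sendblue retries is an assumption here, but guarding costs little). A bounded idempotency cache keyed on messageHandle (hypothetical `SeenHandles` class) lets the handler skip duplicates without growing memory forever:

```python
from collections import OrderedDict


class SeenHandles:
    """Fixed-size record of recently processed message handles (oldest evicted first)."""

    def __init__(self, max_size: int = 10_000):
        self._seen: OrderedDict[str, None] = OrderedDict()
        self._max = max_size

    def check_and_add(self, handle: str) -> bool:
        """Return True if this handle was already processed; otherwise record it."""
        if handle in self._seen:
            return True
        self._seen[handle] = None
        if len(self._seen) > self._max:
            self._seen.popitem(last=False)  # evict the oldest handle
        return False


seen = SeenHandles(max_size=2)
print(seen.check_and_add("h1"))  # False — first delivery, process it
print(seen.check_and_add("h1"))  # True — duplicate, skip it
```

In the FastAPI handler, an early `if seen.check_and_add(payload.messageHandle): return {"status": "ok"}` before the agent call is enough.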

Full agent that sends iMessages end to end

A complete example combining the tool, agent, and a simple outreach loop:

```python
from dotenv import load_dotenv

load_dotenv()

# Prospect list (in practice, loaded from a CRM CSV export: name, phone, company)
prospects = [
    {"name": "Alice", "phone": "+14155551234", "company": "Acme Corp"},
    {"name": "Bob", "phone": "+14155555678", "company": "Widget Co"},
]

for prospect in prospects:
    result = agent_executor.invoke({
        "input": (
            f"Send a personalized iMessage to {prospect['name']} at {prospect['phone']}. "
            f"They work at {prospect['company']}. "
            "Introduce Sendblue, mention the 30-45% response rate advantage over SMS, "
            "and ask if they have 15 minutes this week."
        )
    })
    print(f"Sent to {prospect['name']}: {result['output']}")
```

This pattern processes a CRM export and sends individualized outreach at scale — each message is generated by the LLM with context about that specific prospect, not a template blast.
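The hardcoded list stands in for a real export; reading one with the stdlib csv module is a few lines (assuming the export has name, phone, and company columns, which is an assumption about your CRM's schema):

```python
import csv
import io


def load_prospects(csv_text: str) -> list[dict]:
    """Parse a CRM export with name, phone, company columns."""
    return [
        {"name": row["name"], "phone": row["phone"], "company": row["company"]}
        for row in csv.DictReader(io.StringIO(csv_text))
    ]


export = "name,phone,company\nAlice,+14155551234,Acme Corp\n"
print(load_prospects(export))
# [{'name': 'Alice', 'phone': '+14155551234', 'company': 'Acme Corp'}]
```

For a file on disk, swap the `io.StringIO` for `open("prospects.csv", newline="")`.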

Next steps

Ready to add iMessage to your AI agent?

Free sandbox, no credit card required. Get API keys in minutes.

Get API Access