
Luis Majano

March 18, 2025


Welcome to the Future of AI in BoxLang

We are thrilled to introduce BoxLang Gen AI, a powerful and intuitive module that brings seamless AI generation capabilities to your BoxLang applications. With a fluent, functional API, this module allows developers to interact with multiple AI providers in a consistent and abstracted manner.

If you're looking for even more power, check out our bx-aiplus module (Coming Soon), available as part of our BoxLang +/++ subscriptions, which extends this module with additional AI providers, enhanced capabilities, and advanced features.

Open Source & Easy to Get Started

BoxLang AI is open-source, licensed under the Apache 2.0 license, and easy to install:

install-bx-module bx-ai

For CommandBox-based web applications, add it to your server.json or use:

box install bx-ai

Once installed, you can start leveraging built-in functions (BIFs) for AI interactions.

Quick Example

// chat.bxs
answer = aiChat( "How amazing is BoxLang?" )
println( answer )


Supported AI Providers

BoxLang AI supports multiple Large Language Model (LLM) providers:

  • OpenAI
  • DeepSeek
  • Gemini
  • Grok
  • Perplexity

More providers are available in the bx-aiplus module.

Key Features

  • Multi-provider integration
  • Fluent API for composing AI interactions
  • Asynchronous chat requests
  • Global defaults for streamlined requests
  • Raw chat request composition
  • Support for AI service and tool creation
  • System message support

Configuration

You can configure the module by adding settings to boxlang.json:

{
    "modules": {
        "bxai": {
            "provider": "openai",
            "apiKey": "your-api-key",
            "defaultParams": {
                "model": "gpt-3.5-turbo",
                "temperature": 0.5
            }
        }
    }
}
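
Anything placed in defaultParams is applied to every request, so a per-call struct only needs the options that differ. As a minimal sketch, assuming per-call params take precedence over these global defaults (the same params struct shown in the aiChatRequest() example below):

// Uses the global model and temperature from boxlang.json
answer = aiChat( "Summarize BoxLang in one sentence." )

// Overrides just the temperature for this single call
creative = aiChat( "Summarize BoxLang in one sentence.", { temperature: 0.9 } )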

Global Functions (BIFs)

BoxLang AI exposes several global functions (BIFs) for AI interactions:

aiChat() - Simple AI Chat

aiChat( "Write a haiku about recursion in programming." )

aiChatAsync() - Asynchronous Chat

For non-blocking AI interactions, use:

future = aiChatAsync( "Write a haiku about recursion." )
    .then( result -> println( "AI Response: " + result ) )
    .onError( error -> writeLog( text: "AI Chat failed: " + error.getMessage(), type: "error" ) )

// Optionally block until the future completes
future.get()

aiChatRequest() - Compose a Raw Chat Request

chatRequest = aiChatRequest(
    "Write a haiku about recursion in programming.",
    { model: "gpt-3.5-turbo", temperature: 0.5 },
    { provider: "grok", timeout: 10 }
);
response = aiService().invoke( chatRequest );

aiMessage() - Build AI Messages

aiChat(
    aiMessage()
        .system( "You are a helpful assistant." )
        .user( "Write a haiku about recursion in programming." )
);

aiService() - AI Service Objects

Create an AI Service reference for reusable interactions:

service = aiService( "openai" ).configure( "myApiKey" );
response = service.invoke( aiChatRequest( "Explain quantum computing." ) );
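
Because the configured service object holds the provider and API key, it can be reused across many requests. A minimal sketch, assuming (as in the example above) that "myApiKey" is a placeholder for your real key and that aiChatRequest() accepts just a message when no extra params are needed:

service = aiService( "openai" ).configure( "myApiKey" );

// Reuse one configured service for several independent requests
intro    = service.invoke( aiChatRequest( "Explain quantum computing." ) );
followUp = service.invoke( aiChatRequest( "Now explain it to a five-year-old." ) );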

aiTool() - AI Tool Creation

Define tools the AI can invoke to fetch real-time data from your own systems:

tool = aiTool( "get_weather", "Fetches real-time temperature.", location -> {
    return (location contains "San Francisco") ? "68F" : "unknown";
});
result = aiChat( "What's the temperature in San Francisco?", { tools: [ tool ] } );
println( result );
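
Since each tool is just a name, a description, and a closure, several tools can be registered on a single call. A hedged sketch following only the pattern shown above (the get_time tool and its behavior are hypothetical, for illustration):

weatherTool = aiTool( "get_weather", "Fetches real-time temperature.", location -> {
    return ( location contains "San Francisco" ) ? "68F" : "unknown";
});
timeTool = aiTool( "get_time", "Returns the current server time.", zone -> {
    return now().toString();
});
result = aiChat( "What's the weather and time in San Francisco?", { tools: [ weatherTool, timeTool ] } );
println( result );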

Resources

Looking for more information, support, or community engagement? Check out the BoxLang documentation and community channels.

Conclusion

With BoxLang Gen AI, integrating Large Language Models into your applications has never been easier. Whether you're interacting with OpenAI, DeepSeek, Gemini, or other providers, this fluent and functional API empowers you to build powerful AI-driven applications effortlessly.

Get started today with BoxLang AI and take your applications to the next level!


