AI Orchestrator Beyond LangChain šŸ¦œā›“ļø

Angelina Yang
3 min read Ā· Aug 21

By now, most AI practitioners are likely familiar with LangChain or have had some experience using it. But is there something new and even more promising on the horizon?

Introducing Semantic Kernel

Microsoft’s Semantic Kernel presents a concept quite similar to LangChain. It is a Software Development Kit (SDK) that integrates Large Language Models (LLMs) from providers such as OpenAI, Azure OpenAI, and Hugging Face with conventional programming languages like C#, Python, and Java. The real magic happens when you use Semantic Kernel to define plugins that can be chained together with just a few lines of code.
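As a rough illustration, here is what wiring up a kernel and defining a small semantic function could look like with the Python SDK. This is a minimal sketch based on the mid-2023 semantic-kernel package; method names may differ in later versions, and the prompt, model name, and API key are placeholders.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Create the kernel and register an OpenAI chat model as its LLM backend.
kernel = sk.Kernel()
kernel.add_chat_service(
    "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
)

# Define a semantic function from a prompt template in a few lines of code.
summarize = kernel.create_semantic_function(
    "Summarize the following text in one sentence:\n{{$input}}",
    max_tokens=128,
    temperature=0.2,
)
```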

What makes Semantic Kernel special, however, is its ability to automatically orchestrate plugins with AI. With Semantic Kernel planners, you can ask an LLM to generate a plan that achieves a user’s unique goal. Afterwards, Semantic Kernel will execute the plan for the user.

Kernel and Plugins

Within this framework, the term ā€œkernelā€ refers to an instance of the processing engine that fulfills a user’s request by chaining together a set of plugins.

[Image: a kernel combining two plugins to handle a user’s request (Source)]

In the above example, the kernel fuses together two plugins, each serving a different function. The idea is similar to a UNIX kernel, except that this time we are chaining together AI prompts and native functions rather than programs.
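To make the chaining idea concrete, below is a hedged sketch of how two plugin functions might be piped together through the kernel. It assumes the mid-2023 Python SDK; the `./skills` folder, the EditSkill plugin, and its Rewrite function are hypothetical placeholders, while WriteSkill and ShortPoem come from the example.

```python
import asyncio
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()
kernel.add_chat_service(
    "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
)

# Load two plugins (skills) from prompt-template directories on disk;
# the folder and function names here are placeholders.
write_skill = kernel.import_semantic_skill_from_directory("./skills", "WriteSkill")
edit_skill = kernel.import_semantic_skill_from_directory("./skills", "EditSkill")

async def main():
    # The kernel pipes the output of one function into the next,
    # much like chaining programs through a UNIX pipe.
    result = await kernel.run_async(
        write_skill["ShortPoem"],
        edit_skill["Rewrite"],
        input_str="a rainy day in Seattle",
    )
    print(result)

asyncio.run(main())
```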

Within the WriteSkill plugin, there can be multiple functions, each accompanied by its own semantic description, exemplified as follows:

[Image: functions inside the WriteSkill plugin (e.g. ShortPoem, StoryGen), each with its own semantic description (Source)]
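One way to attach such semantic descriptions in code, rather than loading them from a plugin folder on disk, is sketched below. This assumes the mid-2023 Python SDK; the prompts and description strings are invented for illustration, and only the WriteSkill, ShortPoem, and StoryGen names come from the example.

```python
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()
kernel.add_chat_service(
    "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
)

# Each function carries a natural-language description that the planner
# can later read when deciding which function fits a user's request.
short_poem = kernel.create_semantic_function(
    "Write a short, four-line poem about: {{$input}}",
    function_name="ShortPoem",
    skill_name="WriteSkill",
    description="Turns a topic into a short poem.",
)
story_gen = kernel.create_semantic_function(
    "Write a brief story about: {{$input}}",
    function_name="StoryGen",
    skill_name="WriteSkill",
    description="Generates a short story from a topic.",
)
```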

Planner

The Planner decides which functions to call for a specific user task. Just as its name suggests, it builds a plan for each request. In the above example, the Planner would likely employ the ShortPoem and StoryGen functions, using their semantic descriptions to fulfill the user's query.
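A sketch of that planner flow with the Python SDK's BasicPlanner is below. It assumes the mid-2023 semantic-kernel package (planner classes have been reorganized in later versions), and the goal string and skills folder are placeholders.

```python
import asyncio
import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.planning.basic_planner import BasicPlanner

async def main():
    kernel = sk.Kernel()
    kernel.add_chat_service(
        "chat", OpenAIChatCompletion("gpt-3.5-turbo", "YOUR_OPENAI_API_KEY")
    )
    # Register the plugin whose function descriptions the planner will read;
    # the folder name is a placeholder.
    kernel.import_semantic_skill_from_directory("./skills", "WriteSkill")

    planner = BasicPlanner()

    # The LLM composes a plan out of the available functions, guided by
    # their semantic descriptions, and then the plan is executed step by step.
    goal = "Write a short poem about the sea, then turn it into a story."
    plan = await planner.create_plan_async(goal, kernel)
    print(plan.generated_plan)

    result = await planner.execute_plan_async(plan, kernel)
    print(result)

asyncio.run(main())
```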

For those who are familiar with the earlier era of chatbot development predating ChatGPT, the term ā€œintentā€ may ring a bell.
