
Introducing Our Workflow Framework

Micaela Conners · 7 min read

Agent Brain

“Thoughts without content are empty, intuitions without concepts are blind.”

– Immanuel Kant, Critique of Pure Reason

It’s a great time to be a philosophy nerd. There’s no single right way to use Large Language Models for intelligent systems, so engineers must develop their own frameworks. And unlike the coding frameworks that many of us use – React, Django, Flutter, etc. – frameworks for building AI workflows are non-deterministic; they require both precise logic and higher-level abstract concepts to achieve the right outcomes.

Since the 1940s, one of the most popular conceptual frameworks for AI has been the human brain. The analogy remains popular today, and for good reason. Like the brain, AI works best when it blends multiple intelligent components: reason is shaped by experience, and learnings are encoded in memory. The same is true for LLMs.

At Bartleby.dev, we’ve developed a framework for AI-powered workflows. By structuring our workflows around reasoning, memory, and learning, we can create adaptable and powerful intelligent systems that grow with your organization. Here’s how we do it:

Reasoning

Reasoning is the backbone of any intelligent system. For our workflows, we’ve distilled logical processes into a few core pathways:

LLM Chain: two or more LLMs collaborate sequentially to complete a task
LLM Loop: two LLMs are assigned specific roles (e.g. instructor and assistant) and iterate on a task until it is complete or a set limit is reached
Parallelization: multiple LLMs tackle subtasks simultaneously, then combine outputs in a later step
Human Interrupt: an LLM works in concert with a human-in-the-loop for iterative improvement
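
To make the first two pathways concrete, here’s a minimal Python sketch. The `call_llm` helper is a hypothetical stand-in for whichever model API a workflow actually uses; the point is the shape of the pattern, not a production implementation.

```python
# Minimal sketch of two reasoning pathways.
# `call_llm` is a hypothetical stand-in for your model provider of choice.

def call_llm(prompt: str) -> str:
    raise NotImplementedError("wire this up to your LLM provider")

def llm_chain(task: str) -> str:
    """LLM Chain: two LLMs collaborate sequentially on one task."""
    plan = call_llm(f"Draft a step-by-step plan for: {task}")
    return call_llm(f"Execute this plan and return the result:\n{plan}")

def llm_loop(task: str, max_rounds: int = 5) -> str:
    """LLM Loop: instructor and assistant iterate until done or a set limit is reached."""
    draft = call_llm(f"Produce a first attempt at: {task}")
    for _ in range(max_rounds):
        critique = call_llm(f"As the instructor, critique this attempt:\n{draft}\n"
                            "Reply DONE if it fully completes the task.")
        if "DONE" in critique:
            break
        draft = call_llm(f"As the assistant, revise the attempt using this critique:\n{critique}\n\n{draft}")
    return draft
```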

Just as the brain integrates various inputs to support reasoning, so do the LLMs in our workflows. Knowledge Graphs provide enrichments along our pathways, as either inputs or outputs. Inputs provide organizational context for a task (see our piece here for a deep dive), whereas outputs store the results of a task or context for future tasks.
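
As a hedged illustration of how those enrichments might slot into a pathway step: the `KnowledgeGraph` class and its methods below are placeholders for an actual graph store, and `call_llm` is the same hypothetical helper from the sketch above.

```python
# Hypothetical knowledge-graph interface: inputs enrich a task with
# organizational context, outputs persist results for future tasks.

class KnowledgeGraph:
    def __init__(self):
        self._facts: dict[str, list[str]] = {}

    def query_context(self, topic: str) -> list[str]:
        """Input enrichment: fetch stored context relevant to a task."""
        return self._facts.get(topic, [])

    def store_result(self, topic: str, result: str) -> None:
        """Output enrichment: persist a result for later workflows."""
        self._facts.setdefault(topic, []).append(result)

def enriched_step(kg: KnowledgeGraph, topic: str, task: str) -> str:
    context = "\n".join(kg.query_context(topic))
    result = call_llm(f"Context:\n{context}\n\nTask: {task}")  # call_llm from the sketch above
    kg.store_result(topic, result)
    return result
```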

Here’s an example that puts pathways and knowledge graphs together into a robust workflow for a marketing campaign:

Reasoning Framework Example

LLM Chain: One LLM drafts the campaign strategy; another generates initial content such as social media posts and ad copy based on the strategy

LLM Loop: The content creation LLM iterates with another LLM acting as a branding expert, refining the tone, style, and messaging

Parallelization: Multiple LLMs generate tailored marketing materials for different platforms (e.g., email, Instagram, and blog posts) simultaneously

Human Interrupt: A marketing manager reviews key elements, such as the campaign theme and high-priority ads, providing critical feedback

Knowledge Graph Input Context: The knowledge graph provides detailed insights into the brand’s guidelines, target audience personas, and previous successful campaigns

Knowledge Graph Output Storage: Finalized materials and campaign metrics are stored, enabling continuous learning and performance benchmarking for future campaigns
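
The Parallelization and Human Interrupt steps above could look roughly like the following sketch. The platform list, prompts, and console-based review are illustrative assumptions, reusing the hypothetical `call_llm` helper from earlier.

```python
# Sketch of the Parallelization and Human Interrupt steps from the
# marketing example. Platform names and prompts are illustrative only.
from concurrent.futures import ThreadPoolExecutor

PLATFORMS = ["email", "instagram", "blog"]

def draft_for_platform(platform: str, strategy: str) -> str:
    return call_llm(f"Write {platform} marketing copy for this strategy:\n{strategy}")

def parallel_drafts(strategy: str) -> dict[str, str]:
    """Parallelization: generate platform-specific materials simultaneously."""
    with ThreadPoolExecutor() as pool:
        futures = {p: pool.submit(draft_for_platform, p, strategy) for p in PLATFORMS}
        return {p: f.result() for p, f in futures.items()}

def human_interrupt(draft: str) -> str:
    """Human Interrupt: a reviewer approves, or returns feedback for another pass."""
    print(draft)
    feedback = input("Press Enter to approve, or type feedback: ").strip()
    if not feedback:
        return draft
    return call_llm(f"Revise the draft using this feedback:\n{feedback}\n\n{draft}")
```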

Additionally, we programmatically explore different reasoning pathways, testing multiple prompts and enrichment combinations to optimize results. We’ll discuss running multiple pathways at once in more detail in the Learning section below.
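
As a rough illustration of that exploration, one might grid over prompt variants and enrichment sets and keep the best-scoring combination. The candidate lists and the `score_output` stub are assumptions, reusing the hypothetical `KnowledgeGraph` and `call_llm` sketches from earlier.

```python
# Illustrative sketch of exploring prompt/enrichment combinations.
from itertools import product

prompt_variants = ["Summarize the brief and draft copy...", "Act as a strategist and draft copy..."]
enrichment_sets = [["brand_guidelines"], ["brand_guidelines", "past_campaigns"]]

def score_output(output: str) -> float:
    """Stand-in scorer; in practice this would use the metrics described in the Learning section."""
    return float(len(output) > 0)

def explore(kg: KnowledgeGraph, task: str):
    best, best_score = None, float("-inf")
    for prompt, enrichments in product(prompt_variants, enrichment_sets):
        context = "\n".join(c for e in enrichments for c in kg.query_context(e))
        output = call_llm(f"{prompt}\nContext:\n{context}\nTask: {task}")
        score = score_output(output)
        if score > best_score:
            best, best_score = (prompt, enrichments), score
    return best
```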

Memory

Embedding memory into workflows ensures high-quality, context-aware performance. We segment memory into short-term and long-term components.

Short-Term Memory: Information passed directly between steps in a workflow (e.g. outputs from one LLM becoming inputs for another)
Long-Term Memory: Persistent storage of data in knowledge graphs, enabling workflows to reference past decisions and outputs
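
A minimal sketch of the two layers, using a JSON file as a stand-in for the knowledge graph that backs long-term memory; the step prompts and the hypothetical `call_llm` helper are illustrative.

```python
# Sketch of the two memory layers. The JSON file stands in for the
# knowledge graph that backs long-term memory in practice.
import json
from pathlib import Path

def run_workflow(query: str, store: Path) -> str:
    # Short-term memory: intermediate results passed directly between steps.
    scratch: dict[str, str] = {"query": query}
    scratch["analysis"] = call_llm(f"Analyze this request: {query}")
    response = call_llm(f"Respond using this analysis:\n{scratch['analysis']}")

    # Long-term memory: persist the exchange so future workflows can reference it.
    history = json.loads(store.read_text()) if store.exists() else []
    history.append({"query": query, "response": response})
    store.write_text(json.dumps(history, indent=2))
    return response
```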

Here we use customer support automation as an example:

Memory Framework Example

Short-Term Memory: A customer query and a real-time response log are passed through workflow steps

Long-Term Memory: Historical interaction data informs personalized responses and predictive suggestions
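
Building on the sketch above, a customer-support step might look something like this; the `customer_id` key and the three-interaction context window are illustrative choices, not a prescribed design.

```python
# Illustrative customer-support step: past interactions inform the reply.
import json
from pathlib import Path

def support_reply(query: str, customer_id: str, store: Path) -> str:
    history = json.loads(store.read_text()) if store.exists() else []
    past = [h for h in history if h.get("customer_id") == customer_id]
    context = "\n".join(f"Q: {h['query']}\nA: {h['response']}" for h in past[-3:])
    reply = call_llm(f"Past interactions:\n{context}\n\nNew query: {query}")
    history.append({"customer_id": customer_id, "query": query, "response": reply})
    store.write_text(json.dumps(history, indent=2))
    return reply
```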

By layering long-term memory with reasoning, our workflows are equipped with semantic understanding tailored to your organization.

Learning

For workflows to improve, they must learn from outcomes. We achieve this through direct feedback and iterative optimization. There are three key components to our learning framework:

Human Feedback Loop: Outputs adjusted by experts are stored in the knowledge graph for future reference

Performance Metrics: Every pathway in each of our workflows is scored across the following metrics (see the sketch below):
- Completion Rate: Percentage of tasks the workflow completes fully
- Acceptance Rate: Frequency of outputs accepted without changes
- Modification Intensity: Degree of human adjustments needed
- User Satisfaction: Qualitative feedback from end users

Pathway Optimization: Using the performance metrics, our learning framework identifies the most effective logic pathways for accurately completing workflows. By tracking performance across iterations, we evaluate and refine different pathway combinations. Underperforming pathways are gradually eliminated, while high-performing ones are enhanced through iterative improvements, aiming to maximize overall workflow efficiency.
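
To make the scoring and pruning concrete, here is a hedged sketch covering both; the metric weights and the 0.5 threshold are illustrative assumptions, not the values used in production.

```python
# Sketch of pathway scoring and pruning. Weights and threshold are illustrative.
from dataclasses import dataclass

@dataclass
class PathwayMetrics:
    completion_rate: float         # share of tasks the pathway completes fully
    acceptance_rate: float         # share of outputs accepted without changes
    modification_intensity: float  # 0 = untouched, 1 = fully rewritten by a human
    user_satisfaction: float       # normalized qualitative score from end users

    def score(self) -> float:
        return (0.3 * self.completion_rate
                + 0.3 * self.acceptance_rate
                + 0.2 * (1 - self.modification_intensity)
                + 0.2 * self.user_satisfaction)

def prune_pathways(metrics: dict[str, PathwayMetrics], threshold: float = 0.5) -> list[str]:
    """Keep pathways whose composite score clears the threshold, best first."""
    survivors = {name: m.score() for name, m in metrics.items() if m.score() >= threshold}
    return sorted(survivors, key=survivors.get, reverse=True)
```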

Here’s an example for an engineering workflow:

Learning Framework Example

Human Feedback: An engineer corrects generated code

Metrics: The Modification Intensity metric reveals common errors

Pathway Optimization: The workflow is refined by removing underperforming pathways and systematically tweaking successful ones for increased accuracy and efficiency

Through our client portal, users can monitor these metrics for all workflows in production, providing both transparency and control. Because our optimizations happen in deterministic iterations, there’s no black box to contend with.

Let’s Connect

Kant’s insight—that intuition and logic must work together—resonates in the era of LLMs. By structuring AI-powered workflows around reasoning, memory, and learning, we create systems that optimize and evolve as your organization grows.

At Bartleby.dev, we’re excited to share how our frameworks can empower your organization. Let’s build the future of intelligent systems together.