Building AI Agents That Actually Work: My Journey with Agno and Crawl4ai

How I built a team of AI agents that can extract information from websites and create engaging blog posts

The Problem: AI Agents That Don’t Deliver

I’ve been fascinated by AI agents for years. The promise of having AI assistants that can actually do things for you – not just chat – is incredibly compelling. But like many developers, I’ve been frustrated by the reality: most AI agent frameworks are too complex to set up, too limited in what they can do, or just plain unreliable.

That’s why I was excited when I discovered Agno, a framework for building AI agents that actually work. And when I combined it with Crawl4ai, a tool for extracting information from websites, I found myself with a powerful system for creating content.

In this post, I’ll share my experience building a team of AI agents that can extract information from websites and create engaging blog posts. I’ll show you the code, explain how it works, and share some practical tips for building your own AI agent teams.

The Solution: A Team of Specialized Agents

The key insight I had was that instead of trying to build one super-agent that can do everything, it’s better to create a team of specialized agents that work together. Each agent has a specific role and set of tools, and they coordinate to accomplish a larger task.

For my blog post generator, I created two agents:

  1. A Crawler agent that extracts information from websites
  2. A Writer agent that crafts the final blog post

Here’s how I set it up:

from agno.agent import Agent
from agno.team import Team
from agno.models.openai import OpenAIChat
from agno.tools.crawl4ai import Crawl4aiTools

# Create web crawler agent to extract information
crawler = Agent(
    name="Crawler",
    role="Website information extractor",
    tools=[Crawl4aiTools(max_length=None)],
    model=OpenAIChat("gpt-4o"),
    instructions="""
    You are an expert at extracting comprehensive information from websites.
    1. Extract all relevant content, features, and capabilities
    2. Identify key components and use cases
    3. Organize information logically
    4. Preserve technical details accurately
    """
)

# Create writer agent to craft the final blog post
writer = Agent(
    name="Writer",
    role="Creative content writer",
    model=OpenAIChat("gpt-4o"),
    instructions="""
    You are an expert blog writer who crafts engaging, personal content.
    
    Use this style guide for the blog post:
    1. Write in an easy, conversational, and personal language style
    2. Structure content as a story with beginning, middle, and end
    3. Use a friendly, first-person tone that connects with readers
    4. Include practical insights that readers can apply
    5. Balance technical information with accessible explanations
    6. Incorporate storytelling elements to keep readers engaged
    7. Use analogies to explain complex concepts
    8. Include some personal reflections or questions that engage readers
    9. Keep paragraphs relatively short and digestible
    10. Use headings to break up content into logical sections
    11. End with a compelling call to action or thought-provoking conclusion
    """
)

# Create a team with these specialized agents
blog_team = Team(
    name="Blog Creation Team",
    mode="coordinate",
    members=[crawler, writer],
    instructions="""
    You are a specialized blog creation team that works together to:
    1. Extract comprehensive information from websites
    2. Create engaging blog posts with the requested style
    3. Structure content as a compelling story with beginning, middle, and end
    
    Follow this process:
    1. Crawler extracts all relevant information from the target website
    2. Writer creates a final blog post that applies the style instructions
    """,
    model=OpenAIChat("gpt-4o"),
    markdown=True,
)

How It Works: The Magic Behind the Scenes

The beauty of this approach is that each agent has a specific role and set of tools. The Crawler agent uses the Crawl4aiTools toolkit to extract information from websites, while the Writer agent focuses on crafting engaging content.

When you run the team with a task, it follows this process:

  1. The Crawler agent visits the target website and extracts all relevant information
  2. The Writer agent takes that information and crafts a blog post according to the style guide
  3. The team coordinates to ensure the final output meets all requirements

Here’s how you would run the team:

# Run the team with the blog creation task
blog_team.print_response("""
Create a blog post about https://docs.crawl4ai.com/ introducing the reader to the project and giving first practical insights.
The blog post must be in easy & personal language and follow the style of https://www.oneusefulthing.org/p/no-elephants-breakthroughs-in-image.
The blog post should be in the form of a story, with a beginning, middle and end.
""")

Beyond Blog Posts: Other Use Cases

Once I had this basic setup working, I started exploring other use cases. One of my favorites is a website-to-markdown converter that extracts content from any website and saves it as a well-formatted markdown file:

import os
from datetime import datetime

def extract_and_save_website(url):
    """Extract website content and save it as markdown."""
    # Create the website extractor agent
    extractor = Agent(
        name="WebsiteExtractor",
        role="Website content extractor",
        tools=[Crawl4aiTools(max_length=100000)],
        model=OpenAIChat("gpt-4o"),
        instructions="""
        You are an expert at extracting and formatting website content.
        
        Your task is to:
        1. Access the provided URL
        2. Extract all relevant content from the website
        3. Format the content in clean, well-structured markdown
        4. Preserve the original structure, headings, and organization
        5. Include all important text, lists, and tables
        6. Maintain proper heading hierarchy
        7. Remove unnecessary elements like ads, navigation menus, footers
        8. Ensure images are properly referenced with alt text when possible
        9. Format code blocks properly if present
        10. Convert any complex elements like interactive widgets into static descriptions
        11. Make sure to extract the COMPLETE content of the page, including lengthy text sections
        """,
        markdown=True
    )
    
    # Extract the content
    response = extractor.run(f"""
    Extract the complete content from this website: {url}
    
    Format the entire content in clean, well-structured markdown that preserves:
    - The original document structure
    - All headings with proper hierarchy
    - All paragraphs, lists, and tables
    - Any code blocks with appropriate formatting
    - References to images
    
    IMPORTANT: Make sure to extract ALL content from the page, including any lengthy sections of text.
    """)
    
    # Save the content to a file
    output_dir = ensure_output_directory()
    filename = sanitize_filename(url)
    filepath = os.path.join(output_dir, filename)
    
    with open(filepath, 'w', encoding='utf-8') as f:
        f.write(f"# Content from {url}\n\n")
        f.write(f"*Extracted on: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}*\n\n")
        f.write(f"---\n\n")
        f.write(response.content)
    
    return filepath
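
The function above leans on two small helpers, ensure_output_directory and sanitize_filename, that I didn’t show. Here’s a minimal sketch of how they could look – the names come from the code above, but the exact behavior is my assumption, so adapt it to your project:

import os
import re

def ensure_output_directory(path="output"):
    """Create the output directory if it doesn't already exist and return its path."""
    os.makedirs(path, exist_ok=True)
    return path

def sanitize_filename(url):
    """Turn a URL into a filesystem-safe markdown filename, e.g. docs-crawl4ai-com.md."""
    stripped = re.sub(r"^https?://", "", url).rstrip("/")
    safe = re.sub(r"[^A-Za-z0-9]+", "-", stripped).strip("-")
    return f"{safe}.md"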

I’ve also built an article summarizer that can extract key information from news articles:

# Create article summarizer agent
summarizer = Agent(
    name="ArticleSummarizer",
    role="Financial news analyst and summarizer",
    tools=[Crawl4aiTools(max_length=None)],
    model=OpenAIChat("gpt-4o"),
    instructions="""
    You are an expert financial news analyst specializing in creating comprehensive, detailed summaries.
    
    When summarizing the article:
    
    1. Extract and present all key information, arguments, and data points
    2. Maintain the original context and nuance of the article
    3. Organize information into logical sections with clear headings
    4. Include relevant quotes from key figures mentioned in the article
    5. Provide context for economic terms and policies mentioned
    6. Analyze implications of the statements and policies discussed
    7. Include any relevant market reactions or forecasts mentioned
    8. Structure the summary with:
       - Executive summary at the top (key points)
       - Main body with detailed breakdown of the article's content
       - Conclusion with implications or next steps mentioned
    9. Ensure all factual claims and statistics are accurately represented
    10. Format the output in clean, professional markdown
    """,
    markdown=True,
    show_tool_calls=True
)

# Run the agent with an article URL
article_url = "https://www.bloomberg.com/news/articles/2025-04-06/bessent-strikes-defiant-tariff-tone-as-he-rejects-us-recession"
summarizer.print_response(f"""
Create an extensive and detailed summary of the content from this Bloomberg article: {article_url}

The summary should:
1. Capture all key information, arguments, and data points
2. Preserve the original context and nuance
3. Include relevant quotes from key figures
4. Analyze the implications of the statements and policies discussed
5. Be structured with clear sections and professional formatting
""")

Lessons Learned: Building Effective AI Agent Teams

After spending time building these systems, I’ve learned a few key lessons about creating effective AI agent teams:

  1. Specialize your agents: Give each agent a specific role and set of tools. Don’t try to make one agent do everything.

  2. Be explicit with instructions: The more detailed your instructions, the better the results. Include examples, style guides, and specific requirements.

  3. Coordinate effectively: Make sure your agents know how to work together. The Team class in Agno makes this easy.

  4. Iterate and refine: Start with a simple setup and gradually add complexity as you learn what works.

  5. Monitor and debug: Use the show_tool_calls=True parameter to see which tools your agents are calling and how they’re reasoning, as in the small example below.
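
Here’s how I typically flip that on while iterating on a single agent. It reuses the same show_tool_calls flag from the summarizer above (and the imports from the first code block); the exact console output will depend on your Agno version:

debug_agent = Agent(
    name="DebugCrawler",
    role="Website information extractor",
    tools=[Crawl4aiTools(max_length=None)],
    model=OpenAIChat("gpt-4o"),
    instructions="Extract the main features from the target website.",
    show_tool_calls=True,  # print each tool call as the agent makes it
    markdown=True,
)

debug_agent.print_response("Extract the main features from https://docs.crawl4ai.com/")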

Getting Started: Your First AI Agent Team

If you’re interested in building your own AI agent teams, here’s how to get started:

  1. Install Agno and Crawl4ai:

    pip install agno crawl4ai
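    # Crawl4ai also documents a one-time post-install step that sets up its
    # headless browser; if plain crawling fails out of the box, run this
    # (check the Crawl4ai docs in case the command has changed in your version):
    crawl4ai-setup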
    
  2. Set up your OpenAI API key:

    export OPENAI_API_KEY=your_api_key_here
    
  3. Create a simple agent:

    from agno.agent import Agent
    from agno.models.openai import OpenAIChat
    from agno.tools.crawl4ai import Crawl4aiTools
    
    my_agent = Agent(
        name="MyAgent",
        role="A helpful assistant",
        tools=[Crawl4aiTools()],
        model=OpenAIChat("gpt-4o"),
        instructions="You are a helpful assistant that can extract information from websites."
    )
    
    # Run the agent
    my_agent.print_response("Extract the main features from https://docs.crawl4ai.com/")
    
  4. Build a team of agents:

    from agno.team import Team
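
    # "another_agent" below stands in for a second Agent defined the same
    # way as my_agent above (for example, a writer agent with no tools)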
    
    my_team = Team(
        name="My Team",
        mode="coordinate",
        members=[my_agent, another_agent],
        instructions="Work together to accomplish the task.",
        model=OpenAIChat("gpt-4o"),
        markdown=True,
    )
    
    # Run the team
    my_team.print_response("Create a summary of the features of Crawl4ai and explain how to use them.")
    

Conclusion: The Future of AI Agents

I believe we’re just scratching the surface of what’s possible with AI agents. As the technology continues to improve, we’ll see more sophisticated agents that can handle increasingly complex tasks.

The key to unlocking this potential is frameworks like Agno that make it easy to build and coordinate teams of specialized agents. By combining these frameworks with powerful tools like Crawl4ai, we can create systems that are greater than the sum of their parts.