    Implementing an LLM Agent with Tool Access Using MCP-Use

    By MathsXP.com, 13/05/2025

    MCP-Use is an open-source library that lets you connect any LLM to any MCP server, giving your agents tool access like web browsing, file operations, and more — all without relying on closed-source clients. In this tutorial, we’ll use langchain-groq and MCP-Use’s built-in conversation memory to build a simple chatbot that can interact with tools via MCP. 

    Installing uv package manager

    We will first set up our environment, starting with the uv package manager. For macOS or Linux:
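The install command itself is missing from the page; assuming the standard standalone installer script that uv documents at astral.sh, it is:

```shell
# Download and run the official uv installer script
curl -LsSf https://astral.sh/uv/install.sh | sh
```
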

    For Windows (PowerShell):

    powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

    Creating a new directory and activating a virtual environment

    We will then create a new project directory and initialize it with uv:

    uv init mcp-use-demo
    cd mcp-use-demo

    We can now create and activate a virtual environment. For Mac or Linux:

    uv venv
    source .venv/bin/activate

    For Windows:

    uv venv
    .venv\Scripts\activate

    Installing Python dependencies

    We will now install the required dependencies:

    uv add mcp-use langchain-groq python-dotenv

    Groq API Key

    To use Groq’s LLMs:

    1. Visit the Groq Console and generate an API key.
    2. Create a .env file in your project directory and add the following line:

    GROQ_API_KEY=your_api_key_here

    Replace your_api_key_here with the key you just generated.

    Brave Search API Key

    This tutorial uses the Brave Search MCP Server.

    1. Get your Brave Search API key from: Brave Search API
    2. Create a file named mcp.json in the project root with the following content:
    {
      "mcpServers": {
        "brave-search": {
          "command": "npx",
          "args": [
            "-y",
            "@modelcontextprotocol/server-brave-search"
          ],
          "env": {
            "BRAVE_API_KEY": ""
          }
        }
      }
    }

    Replace the empty BRAVE_API_KEY string with your actual Brave API key.
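As a quick sanity check that the configuration is valid JSON and names the server you expect, you can parse it with the standard library (a small standalone sketch; it inlines the config above rather than reading mcp.json from disk):

```python
import json

# The same configuration shown above, inlined as a string
config_text = """
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {"BRAVE_API_KEY": ""}
    }
  }
}
"""

# json.loads raises ValueError on malformed JSON, so this doubles as validation
config = json.loads(config_text)
servers = list(config["mcpServers"].keys())
print(servers)  # → ['brave-search']
```
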

    Node.js

    Some MCP servers (including Brave Search) require npx, which comes with Node.js.

    • Download the latest version of Node.js from nodejs.org.
    • Run the installer.
    • Leave all settings as default and complete the installation.
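Once installed, you can confirm that both node and npx are on your PATH before continuing:

```shell
# Both commands should print a version number, e.g. v22.x.x
node --version
npx --version
```
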

    Using other servers

    If you’d like to use a different MCP server, simply replace the contents of mcp.json with the configuration for that server.
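The configuration shape is the same for any server. For example, a sketch of mcp.json for the reference filesystem server (the package name follows the modelcontextprotocol servers repository; the directory path is a placeholder you would point at a real folder):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/dir"
      ]
    }
  }
}
```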

    Create an app.py file in the directory and add the following content:

    Importing the libraries

    from dotenv import load_dotenv
    from langchain_groq import ChatGroq
    from mcp_use import MCPAgent, MCPClient
    import os
    import sys
    import warnings
    
    warnings.filterwarnings("ignore", category=ResourceWarning)

    This section loads environment variables and imports required modules for LangChain, MCP-Use, and Groq. It also suppresses ResourceWarning for cleaner output.

    Setting up the chatbot

    async def run_chatbot():
        """ Running a chat using MCPAgent's built in conversation memory """
        load_dotenv()
        os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")
    
        configFile = "mcp.json"
        print("Starting chatbot...")
    
        # Creating MCP client and LLM instance
        client = MCPClient.from_config_file(configFile)
        llm = ChatGroq(model="llama-3.1-8b-instant")
    
        # Creating an agent with memory enabled
        agent = MCPAgent(
            llm=llm,
            client=client,
            max_steps=15,
            memory_enabled=True,
            verbose=False
        )

    This section loads the Groq API key from the .env file and initializes the MCP client using the configuration provided in mcp.json. It then sets up the LangChain Groq LLM and creates a memory-enabled agent to handle conversations.

    Implementing the chatbot

    # Add this in the run_chatbot function
        print("\n-----Interactive MCP Chat----")
        print("Type 'exit' or 'quit' to end the conversation")
        print("Type 'clear' to clear conversation history")
    
        try:
            while True:
                user_input = input("\nYou: ")
    
                if user_input.lower() in ["exit", "quit"]:
                    print("Ending conversation....")
                    break
               
                if user_input.lower() == "clear":
                    agent.clear_conversation_history()
                    print("Conversation history cleared....")
                    continue
               
                print("\nAssistant: ", end="", flush=True)
    
                try:
                    response = await agent.run(user_input)
                    print(response)
               
                except Exception as e:
                    print(f"\nError: {e}")
    
        finally:
            if client and client.sessions:
                await client.close_all_sessions()

    This section enables interactive chatting, allowing the user to input queries and receive responses from the assistant. It also supports clearing the chat history when requested. The assistant’s responses are displayed in real-time, and the code ensures that all MCP sessions are closed cleanly when the conversation ends or is interrupted.

    Running the app

    if __name__ == "__main__":
        import asyncio
        try:
            asyncio.run(run_chatbot())
        except KeyboardInterrupt:
            print("Session interrupted. Goodbye!")
       
        finally:
            # Redirect stderr to devnull to hide noisy event-loop cleanup messages on exit
            sys.stderr = open(os.devnull, "w")

    This section runs the asynchronous chatbot loop, managing continuous interaction with the user. It also handles keyboard interruptions gracefully, ensuring the program exits without errors when the user terminates the session.

    You can find the entire code here

    To run the app, run the following command
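The command itself is missing from the page; in a uv-managed project like this one, the usual invocation is:

```shell
# Run app.py inside the project's virtual environment
uv run app.py
```
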

    This will start the app; you can then chat with the assistant, which can call the configured MCP server's tools during the session.

    I am a Civil Engineering Graduate (2022) from Jamia Millia Islamia, New Delhi, and I have a keen interest in Data Science, especially Neural Networks and their application in various areas.

