# MCP Wolfram Alpha (Server + Client)

Seamlessly integrate Wolfram Alpha into your chat applications.

This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.

Included is an MCP client example utilizing Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interactions with Wolfram Alpha's knowledge engine.

[DeepWiki](https://deepwiki.com/akalaric/mcp-wolframalpha)

---

## Features

- **Wolfram|Alpha Integration** for math, science, and data queries.
- **Modular Architecture**: easily extendable to support additional APIs and functionalities.
- **Multi-Client Support**: seamlessly handles interactions from multiple clients or interfaces.
- **MCP Client Example** using Gemini (via LangChain).
- **UI Support**: a Gradio-based web interface for interacting with Google AI and the Wolfram Alpha MCP server.

---

## Installation

### Clone the Repo

```bash
git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha
```

### Set Up Environment Variables

Create a `.env` file based on the example:

- `WOLFRAM_API_KEY=your_wolframalpha_appid`
- `GeminiAPI=your_google_gemini_api_key` *(only required for the client examples below)*

### Install Requirements

Install the dependencies with pip:

```bash
pip install -r requirements.txt
```

Alternatively, install them with [`uv`](https://github.com/astral-sh/uv) (ensure `uv` is installed first):

```bash
uv sync
```

### Configuration

To use with the VSCode MCP Server:

1. Create a configuration file at `.vscode/mcp.json` in your project root.
2. Use the example provided in `configs/vscode_mcp.json` as a template.
3. For more details, refer to the [VSCode MCP Server Guide](https://sebastian-petrus.medium.com/vscode-mcp-server-42286eed3ee7).
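For orientation, a `.vscode/mcp.json` along these lines should work. This is a sketch only: the server name and absolute path are placeholders, the top-level `servers` key is what recent VSCode releases expect, and `configs/vscode_mcp.json` in the repository is the authoritative template.

```json
{
  "servers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```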
To use with Claude Desktop, add the server to your Claude configuration:

```json
{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": [
        "/path/to/src/core/server.py"
      ]
    }
  }
}
```

## Client Usage Example

This project includes an LLM client that communicates with the MCP server.

#### Run with Gradio UI

- Required: `GeminiAPI`
- Provides a local web interface to interact with Google AI and Wolfram Alpha.
- To launch the web interface from the command line:

```bash
python main.py --ui
```

#### Docker

To build and run the UI client inside a Docker container:

```bash
docker build -t wolframalphaui -f .devops/ui.Dockerfile .

docker run wolframalphaui
```

#### UI

- Intuitive interface built with Gradio for interacting with both Google AI (Gemini) and the Wolfram Alpha MCP server.
- Allows users to switch between Wolfram Alpha, Google AI (Gemini), and query history.

#### Run as CLI Tool

- Required: `GeminiAPI`
- To run the client directly from the command line:

```bash
python main.py
```

#### Docker

To build and run the CLI client inside a Docker container:

```bash
docker build -t wolframalpha -f .devops/llm.Dockerfile .

docker run -it wolframalpha
```

## Contact

Feel free to give feedback. The e-mail address is shown if you execute this in a shell:

```sh
printf "\x61\x6b\x61\x6c\x61\x72\x69\x63\x31\x40\x6f\x75\x74\x6c\x6f\x6f\x6b\x2e\x63\x6f\x6d\x0a"
```
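As background on what the server abstracts away: a Wolfram|Alpha query ultimately reduces to an authenticated HTTP GET against the Wolfram|Alpha API, using the app ID from `WOLFRAM_API_KEY`. The sketch below is illustrative only, using the public Short Answers endpoint and hypothetical function names; the project's actual implementation lives in `src/core/server.py`.

```python
import os
import urllib.parse
import urllib.request

# Wolfram|Alpha Short Answers API: returns a plain-text answer for a query.
API_URL = "https://api.wolframalpha.com/v1/result"


def build_query_url(query: str, app_id: str) -> str:
    """Build the GET URL for a Short Answers request (parameters are URL-encoded)."""
    params = urllib.parse.urlencode({"appid": app_id, "i": query})
    return f"{API_URL}?{params}"


def ask_wolfram(query: str) -> str:
    """Send the query and return Wolfram|Alpha's plain-text answer.

    Reads the app ID from the WOLFRAM_API_KEY environment variable,
    matching the .env setup described above.
    """
    app_id = os.environ["WOLFRAM_API_KEY"]
    with urllib.request.urlopen(build_query_url(query, app_id)) as resp:
        return resp.read().decode("utf-8")
```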