<div align="center">
<h1>HarmonyOS MCP Server</h1>

  <a href='LICENSE'><img src='https://img.shields.io/badge/License-MIT-orange'></a>
  <a><img src='https://img.shields.io/badge/python-3.13-blue'></a>
</div>

<div align="center">
  <img style="max-width: 500px; width: 60%;" width="1111" alt="image" src="https://github.com/user-attachments/assets/7c2e6879-f583-48d7-b467-c4c6d99c5fab" />
</div>

## Intro

This is an MCP server for manipulating HarmonyOS devices.

https://github.com/user-attachments/assets/7af7f5af-e8c6-4845-8d92-cd0ab30bfe17

## Quick Start

### Installation

1. Clone this repo

```bash
git clone https://github.com/XixianLiang/HarmonyOS-mcp-server.git
cd HarmonyOS-mcp-server
```

2. Set up the environment.

```bash
uv python install 3.13
uv sync
```

### Usage

#### 1. Claude Desktop

You can use [Claude Desktop](https://modelcontextprotocol.io/quickstart/user) to try our tool.

#### 2. OpenAI SDK

You can also use the [openai-agents SDK](https://openai.github.io/openai-agents-python/mcp/) to try the MCP server.
Here's an example:

```python
"""
Example: Use the openai-agents SDK to call HarmonyOS-mcp-server
"""
import asyncio

from agents import Agent, Runner, gen_trace_id, trace
from agents.mcp import MCPServerStdio, MCPServer


async def run(mcp_server: MCPServer):
    agent = Agent(
        name="Assistant",
        instructions="Use the tools to manipulate the HarmonyOS device and finish the task.",
        mcp_servers=[mcp_server],
    )

    message = "Launch the app `settings` on the phone"
    print(f"Running: {message}")
    result = await Runner.run(starting_agent=agent, input=message)
    print(result.final_output)


async def main():
    # Use an async context manager to initialize the server
    async with MCPServerStdio(
        params={
            "command": "<...>/bin/uv",
            "args": [
                "--directory",
                "<...>/harmonyos-mcp-server",
                "run",
                "server.py"
            ]
        }
    ) as server:
        trace_id = gen_trace_id()
        with trace(workflow_name="MCP HarmonyOS", trace_id=trace_id):
            print(f"View trace: https://platform.openai.com/traces/trace?trace_id={trace_id}\n")
            await run(server)

if __name__ == "__main__":
    asyncio.run(main())
```

#### 3. LangChain

You can use [LangGraph](https://langchain-ai.github.io/langgraph/concepts/why-langgraph/), a flexible LLM agent framework, to design your workflows.
Here's an example:

```python
"""
langgraph_mcp.py
"""
import asyncio
import logging
from typing import Annotated, List, TypedDict

from langchain_core.messages import AnyMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnableConfig
from langchain_mcp_adapters.prompts import load_mcp_prompt
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_ollama import ChatOllama
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="/home/chad/.local/bin/uv",
    args=["--directory",
          ".",
          "run",
          "server.py"],
)


# This function uses LangGraph to build your own agent workflow.
async def create_graph(session):
    llm = ChatOllama(model="qwen2.5:7b", temperature=0)
    # load_mcp_tools (from langchain-mcp-adapters) integrates the MCP tools into LangChain.
    # bind_tools enables your LLM to call your MCP tools.
    tools = await load_mcp_tools(session)
    llm_with_tool = llm.bind_tools(tools)

    system_prompt = await load_mcp_prompt(session, "system_prompt")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_prompt[0].content),
        MessagesPlaceholder("messages")
    ])
    chat_llm = prompt_template | llm_with_tool

    # State management
    class State(TypedDict):
        messages: Annotated[List[AnyMessage], add_messages]

    # Nodes
    def chat_node(state: State) -> State:
        state["messages"] = chat_llm.invoke({"messages": state["messages"]})
        return state

    # Build the graph: the graph is the workflow of your agent.
    # For more LangGraph basics, see
    # https://langchain-ai.github.io/langgraph/tutorials/get-started/1-build-basic-chatbot/#3-add-a-node
    graph_builder = StateGraph(State)
    graph_builder.add_node("chat_node", chat_node)
    graph_builder.add_node("tool_node", ToolNode(tools=tools))
    graph_builder.add_edge(START, "chat_node")
    graph_builder.add_conditional_edges("chat_node", tools_condition, {"tools": "tool_node", "__end__": END})
    graph_builder.add_edge("tool_node", "chat_node")
    graph = graph_builder.compile(checkpointer=MemorySaver())
    return graph


async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # The checkpointer looks up thread_id under the "configurable" key.
            config = RunnableConfig(configurable={"thread_id": "1234"}, recursion_limit=15)
            # Use the MCP server in the graph
            agent = await create_graph(session)

            while True:
                message = input("User: ")
                try:
                    response = await agent.ainvoke({"messages": message}, config=config)
                    print("AI: " + response["messages"][-1].content)
                except RecursionError:
                    logging.error("Graph recursion limit reached.")


if __name__ == "__main__":
    asyncio.run(main())
```

Write the system prompt in `server.py`:

```python
"""
server.py
"""
@mcp.prompt()
def system_prompt() -> str:
    """System prompt description"""
    return """
    You are an AI assistant. Use the tools if needed.
    """
```

Use the `load_mcp_prompt` function to get your prompt from the MCP server:

```python
"""
langgraph_mcp.py
"""
prompts = await load_mcp_prompt(session, "system_prompt")
```
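The compiled graph above alternates between `chat_node` and `tool_node` until the model stops emitting tool calls (or the recursion limit is hit). That control flow can be sketched in plain Python as a toy stand-in — no LangGraph, no device, and the "LLM" and "tool" below are fakes invented purely for illustration:

```python
# Toy simulation of the chat_node -> tools_condition -> tool_node loop.
# fake_chat and fake_tool are hypothetical stand-ins, not real MCP calls.

def fake_chat(state):
    """Pretend LLM: request a tool once, then produce a final answer."""
    if not any(m.startswith("tool:") for m in state["messages"]):
        return {"tool_call": "launch_app", "arg": "settings"}
    return {"answer": "Launched settings."}

def fake_tool(call):
    """Pretend MCP tool execution: record the call result as a message."""
    return f"tool:{call['tool_call']}({call['arg']}) -> ok"

def run_graph(user_message, recursion_limit=15):
    state = {"messages": [user_message]}
    for _ in range(recursion_limit):
        out = fake_chat(state)                    # chat_node
        if "tool_call" not in out:                # tools_condition -> __end__
            state["messages"].append(out["answer"])
            return state
        state["messages"].append(fake_tool(out))  # tool_node, then loop back
    raise RecursionError("Graph recursion limit reached.")

final = run_graph("Launch the app `settings` on the phone")
print(final["messages"][-1])  # -> Launched settings.
```

The bounded `for` loop plays the role of `recursion_limit`: if the model kept requesting tools forever, the real graph raises the same `RecursionError` caught in `main()` above.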
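For the Claude Desktop option in the Usage section, the server is registered in `claude_desktop_config.json`. A minimal sketch — the server name is our choice, and the `<...>` paths are placeholders you must fill in with your own `uv` binary and repo location:

```json
{
  "mcpServers": {
    "harmonyos-mcp-server": {
      "command": "<...>/bin/uv",
      "args": [
        "--directory",
        "<...>/harmonyos-mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```

After editing the config, restart Claude Desktop and the server's tools should appear in the tools menu.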