LNR-server-02-cascading-failure-scenario-simulatio
This server processes files for LNR.
README
⚠️ Important Notice ⚠️
As the paper is under review, all contents in this repository are currently not permitted for reuse by anyone until this announcement is removed. Thank you for your understanding!
1. Overview & Objectives
This repository contains the complete implementation, experimental data, and supplementary results for the paper XXX, developed by XXX University in China.
Pending publication, the code is shared under a restrictive license. Once the paper is accepted, the repository will transition to an MIT license. Please contact the corresponding author for any inquiries regarding academic use during the review period.
2. Videos of Agent Operation
2.1 Operation of the developed prototype
- A snippet of using the developed prototype to run the TS-ReAct-based agents driven by GPT-4o
- A snippet of updating the tool kit in the prototype
The full video showcasing the prototype and tool kit updating can be found at:
2.2 Operation of agents based on the ReAct pattern
- A snippet of running the ReAct-based agents driven by GPT-4o, GPT-4, and GPT-3.5 Turbo.
The full video can be found here ()
- A snippet of running the ReAct-based agents driven by Qwen2.5, Deepseek-V3, Gemma-2, Llama-3.1, and Mixtral MoE.
The full video can be found here ()
2.3 Operation of agents based on the TS-ReAct pattern
- A snippet of running the TS agent based on the TS-ReAct pattern.
The full video can be found here ()
- A snippet of running the ReAct agent based on the TS-ReAct pattern.
The full video can be found here ()
3. Repository Structure
4. Acknowledgments
This work heavily relies on excellent open-source projects, including but not limited to:
- LangGraph & LangChain
- Hugging Face MTEB leaderboard
- NetworkX, PyTorch Geometric, and numerous LLM providers (OpenAI, Anthropic, Qwen, Llama, etc.)
We are deeply grateful to all contributors to these foundational works.
5. How to Reuse This Repository
5.1 Importing the Lifeline Recovery Tool Set
- Copy all tool definition files from `tools/` into your target agent directory.
- Import the tools using the standardized registry pattern shown in the example notebooks (a minimal sketch follows below).
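A minimal sketch of the registry pattern, assuming LangChain-style tool definitions; the tool names, docstrings, and return values below are hypothetical placeholders, not the actual tools shipped in `tools/`.

```python
# Hypothetical sketch of the registry pattern used to expose the lifeline
# recovery tools to an agent. Tool names and logic are placeholders.
from langchain_core.tools import tool


@tool
def inspect_substation(substation_id: str) -> str:
    """Return the post-event condition report of a power substation."""
    return f"Substation {substation_id}: status unknown (placeholder)."


@tool
def dispatch_repair_crew(component_id: str) -> str:
    """Dispatch a repair crew to a damaged lifeline component."""
    return f"Crew dispatched to {component_id} (placeholder)."


# Central registry: agents import TOOL_REGISTRY instead of individual tools,
# so a new tool file only needs to add its entries here.
TOOL_REGISTRY = {t.name: t for t in [inspect_substation, dispatch_repair_crew]}
```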
5.2 Running Baseline ReAct Agents
- Directory: `agents_reAct/`
- Supports 8 different LLMs (GPT-4o, Claude-3, Llama-3.1-405B, Qwen2.5, etc.)
- Ready-to-run scripts with configuration YAMLs (a minimal launch sketch follows below)
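A minimal launch sketch, assuming a YAML file that names the backbone LLM and the tools to load; the file name `config_gpt4o.yaml` and its keys are assumptions for illustration, not the repository's actual schema.

```python
# Hypothetical launcher for a baseline ReAct agent; the config file name and
# keys are assumptions about the repository layout, not its actual schema.
import yaml
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

from tools import TOOL_REGISTRY  # registry sketched in Section 5.1

with open("agents_reAct/config_gpt4o.yaml") as f:
    cfg = yaml.safe_load(f)  # e.g. {"model": "gpt-4o", "temperature": 0, "tools": [...]}

llm = ChatOpenAI(model=cfg["model"], temperature=cfg.get("temperature", 0))
tools = [TOOL_REGISTRY[name] for name in cfg["tools"]]

# create_react_agent wires the LLM and the tools into a ReAct-style loop.
agent = create_react_agent(llm, tools)
result = agent.invoke(
    {"messages": [("user", "Assess the damaged substations and plan the recovery order.")]}
)
print(result["messages"][-1].content)
```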
5.3 Running the Proposed GraphRAG + MCP Agents
- Directory: `agents_graphRAG_MCP/`
- Same 8 backbone LLMs
- Includes GraphRAG index construction scripts and MCP search configurations (see the sketch below)
5.4 Running the Interactive Prototype
- Directory: `prototype/`
- Dynamic tool registration/hot-reloading (see the sketch below)
- Web-based GUI + terminal interface
- Supports on-the-fly addition of new recovery actions
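A minimal sketch of how on-the-fly tool registration can work, assuming the tools live in a module that exposes the registry from Section 5.1; the module path and rebuild step are assumptions about the prototype, not its exact implementation.

```python
# Hypothetical hot-reload step for the prototype: when a tool file changes,
# re-import the tools module so the refreshed registry can be handed to the agent.
import importlib

import tools  # module exposing TOOL_REGISTRY (see Section 5.1)


def reload_tools():
    """Re-import the tools module so newly added recovery actions
    become available without restarting the prototype."""
    importlib.reload(tools)
    return list(tools.TOOL_REGISTRY.values())


current_tools = reload_tools()
print(f"{len(current_tools)} recovery tools registered:",
      [t.name for t in current_tools])
```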