Financial Analyst's Research Tool: Multi-Server MCP Integration

1. Introduction
This tutorial demonstrates how financial analysts specializing in cryptocurrency and DeFi can leverage multiple MCP servers to create a comprehensive market analysis workflow. By integrating Dune Queries, Excel, Bright Data, MongoDB, and Notion servers, analysts can extract blockchain data, scrape market information, perform complex calculations, maintain historical records, and produce professional investment reports.

**Key Benefits:**

- Direct access to on-chain data via Dune Analytics queries
- Automated market data collection through Bright Data scraping
- Advanced financial modeling and analysis in Excel
- Historical data preservation in MongoDB for trend analysis
- Professional report generation in Notion with live data
- Integrated workflow from data collection to investment recommendations

**Target Users:**

- Cryptocurrency analysts
- DeFi researchers
- Investment managers
- Hedge fund analysts
- Financial advisors specializing in digital assets
- Blockchain data analysts

2. System Prompt
You are an advanced AI financial analyst specializing in cryptocurrency markets, DeFi protocols, and blockchain analytics. You have integrated access to on-chain data, market information, analytical tools, and reporting platforms through MCP servers.

Your capabilities include:

**Blockchain Data Analysis:**

- Querying on-chain metrics via Dune Analytics
- Analyzing DeFi protocol performance and TVL
- Tracking wallet behaviors and token flows
- Monitoring smart contract interactions
- Identifying market trends and opportunities

**Market Data Collection:**

- Scraping real-time price data from exchanges
- Gathering governance token information
- Collecting trading volumes and liquidity metrics
- Monitoring market sentiment indicators
- Tracking competitor protocols

**Financial Modeling:**

- Building complex Excel models for valuation
- Calculating financial ratios (P/TVL, P/E equivalents)
- Creating sensitivity analyses and scenarios
- Developing investment scoring models
- Generating performance attribution reports

**Historical Analysis:**

- Storing time-series data in MongoDB
- Building historical performance databases
- Creating backtesting frameworks
- Tracking prediction accuracy
- Maintaining audit trails

**Investment Reporting:**

- Creating professional investment memos in Notion
- Building interactive dashboards with charts
- Generating executive summaries
- Documenting investment theses
- Tracking portfolio recommendations

When conducting financial analysis, always:

- Verify data from multiple sources
- Consider market volatility and risks
- Maintain professional skepticism
- Document all assumptions clearly
- Follow regulatory compliance guidelines
- Provide balanced risk/reward assessments

3. MCP Servers in this Agent Flow

1. **Dune Queries MCP Server**

- **Purpose**: Access to blockchain analytics and on-chain data
- **Features**: SQL queries on indexed blockchain data, DeFi metrics, wallet analytics
- **Coverage**: Ethereum, Polygon, Arbitrum, Optimism, and other chains

2. **Excel MCP Server**

- **Purpose**: Advanced financial modeling and calculations
- **Capabilities**: Formula execution, pivot tables, charts, scenario analysis
- **Integration**: Real-time data updates, model automation

3. **Bright Data MCP Server**

- **Purpose**: Web scraping for market data collection
- **Features**: Exchange price scraping, rate limit handling, proxy rotation
- **Coverage**: Major crypto exchanges, DeFi aggregators, data providers

4. **MongoDB MCP Server**

- **Purpose**: Historical data storage and time-series analysis
- **Database**: NoSQL storage for flexible data structures
- **Analytics**: Aggregation pipelines, trend analysis, backtesting

5. **Notion MCP Server**

- **Purpose**: Investment reporting and documentation
- **Features**: Rich text editing, embedded charts, collaborative workspaces
- **Output**: Investment memos, research reports, portfolio tracking

4. MCP Server Setup

Prerequisites

  • Claude AI access (Claude 3.5 Sonnet or newer recommended)
  • Node.js 18+ installed
  • Dune Analytics account and API key
  • Bright Data account with crypto scraping permissions
  • MongoDB database instance
  • Notion workspace with API access
  • Excel/Office 365 subscription

Environment Preparation

  1. API Keys and Authentication: collect the credentials listed under Prerequisites before configuring the servers.
  2. MCP Server Installation

# Install all required MCP servers
npm install -g @dune/mcp-server
npm install -g @excel/mcp-server
npm install -g @brightdata/mcp-server
npm install -g @mongodb/mcp-server
npm install -g @notion/mcp-server

Claude Configuration

Add the following to your Claude MCP configuration:

{
  "servers": {
    "dune": {
      "command": "npx",
      "args": ["@dune/mcp-server"],
      "env": {
        "DUNE_API_KEY": "your_dune_api_key"
      }
    },
    "excel": {
      "command": "npx",
      "args": ["@excel/mcp-server"],
      "env": {
        "EXCEL_CLIENT_ID": "your_microsoft_client_id",
        "EXCEL_CLIENT_SECRET": "your_microsoft_secret",
        "EXCEL_TENANT_ID": "your_tenant_id"
      }
    },
    "brightdata": {
      "command": "npx",
      "args": ["@brightdata/mcp-server"],
      "env": {
        "BRIGHTDATA_USERNAME": "your_username",
        "BRIGHTDATA_PASSWORD": "your_password",
        "BRIGHTDATA_ZONE": "your_zone_id"
      }
    },
    "mongodb": {
      "command": "npx",
      "args": ["@mongodb/mcp-server"],
      "env": {
        "MONGODB_URI": "mongodb+srv://user:pass@cluster.mongodb.net/crypto_analysis"
      }
    },
    "notion": {
      "command": "npx",
      "args": ["@notion/mcp-server"],
      "env": {
        "NOTION_API_KEY": "your_notion_api_key",
        "NOTION_DATABASE_ID": "your_research_database_id"
      }
    }
  }
}

5. Step-by-step User Guide

Step 1: Initial Setup and Testing

  1. Verify All Servers Are Connected
"Please verify that all MCP servers are connected: Dune Queries, Excel, Bright Data, MongoDB, and Notion."

  2. Test Each Server Individually
    • Dune: "Query the current TVL for Uniswap"
    • Excel: "Create a simple test spreadsheet with formulas"
    • Bright Data: "Scrape current BTC price from Binance"
    • MongoDB: "List collections in the crypto_analysis database"
    • Notion: "Create a test page in my research workspace"

Step 2: Blockchain Data Collection

Query DeFi Protocol Metrics:

"Use Dune to query the top 10 DeFi protocols by TVL:
- Protocol name and contract addresses
- Current TVL in USD
- 24h, 7d, and 30d TVL changes
- Number of unique users
- Transaction count last 24h"

Advanced Protocol Analysis:

"Create a Dune query that analyzes the following for each top protocol:
1. TVL breakdown by asset type
2. User retention metrics (DAU/MAU)
3. Average transaction size
4. Fee revenue generated
5. Liquidity concentration (Gini coefficient)"
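
The liquidity-concentration metric in item 5 reduces to a short calculation once the query returns per-address LP balances. A minimal Python sketch of the underlying math (the function name and the rank-based formula are illustrative choices, not part of any MCP server API):

```python
def gini(balances):
    """Gini coefficient of LP balances: 0 = perfectly equal,
    approaching 1 = liquidity held by a single address."""
    xs = sorted(b for b in balances if b > 0)
    n = len(xs)
    if n == 0:
        return 0.0
    total = sum(xs)
    # Rank-weighted sum over the ascending-sorted balances (1-indexed ranks)
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Four LPs, one holding 97% of the pool: highly concentrated
concentration = gini([1, 1, 1, 97])
```

A Gini near 0.7 or above would flag the pool as dominated by a few addresses.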

Cross-Chain Analysis:

"Query DeFi metrics across multiple chains:
- Ethereum mainnet TVL and activity
- Arbitrum protocol deployments
- Polygon user adoption rates
- Optimism transaction costs
- Compare same protocols across chains"

Step 3: Market Data Scraping

Token Price Collection:

"Use Bright Data to scrape governance token prices:
1. UNI from Binance, Coinbase, Kraken
2. AAVE from major exchanges
3. CRV, COMP, MKR prices
4. Include 24h volume for each
5. Get order book depth at ±2%"

Historical Price Data:

"Scrape historical price data for analysis:
- Daily close prices for past 90 days
- High/low/open/close for each day
- Volume-weighted average prices
- Market cap calculations
- Circulating supply changes"

Market Sentiment Indicators:

"Collect additional market data:
1. Social sentiment scores from LunarCrush
2. GitHub development activity
3. Twitter mention volumes
4. Google Trends data
5. Fear & Greed Index readings"

Step 4: Excel Analysis

Create Valuation Models:

"Build Excel model for DeFi protocol valuation:
1. Import TVL and token price data
2. Calculate P/TVL ratios for each protocol
3. Create sector average benchmarks
4. Build relative valuation matrix
5. Generate buy/sell/hold signals"
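
Steps 2 and 5 of this model reduce to a ratio and a banded comparison against the sector benchmark. A Python sketch of the logic the Excel model would encode (the 20% band, the protocol figures, and the function names are illustrative assumptions, not market data):

```python
def p_tvl(market_cap, tvl):
    """Price-to-TVL: market cap per dollar of value locked (lower = cheaper)."""
    return market_cap / tvl

def signal(ratio, sector_avg, band=0.2):
    """Relative-value signal: 'buy' if the protocol trades more than `band`
    below the sector-average P/TVL, 'sell' if more than `band` above."""
    if ratio < sector_avg * (1 - band):
        return "buy"
    if ratio > sector_avg * (1 + band):
        return "sell"
    return "hold"

# Hypothetical market caps and TVLs in USD
ratios = {
    "protocol_a": p_tvl(4_000_000_000, 3_200_000_000),  # 1.25
    "protocol_b": p_tvl(1_500_000_000, 5_000_000_000),  # 0.30
}
sector_avg = sum(ratios.values()) / len(ratios)
signals = {name: signal(r, sector_avg) for name, r in ratios.items()}
```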

Financial Ratio Analysis:

"Develop comprehensive ratio analysis:
- P/TVL (Price to Total Value Locked)
- Fee Revenue / Market Cap
- Active Users / Market Cap
- TVL Growth Rate vs Token Performance
- Protocol Efficiency Metrics"

Scenario Analysis:

"Create scenario models in Excel:
1. Bull case: 50% TVL growth projections
2. Base case: Current growth trends
3. Bear case: 30% TVL contraction
4. Impact on token valuations
5. Risk-adjusted return calculations"
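
The three cases above can be expressed as one projection function, here assuming the token simply tracks TVL at a chosen sensitivity (`beta = 1.0` means a constant P/TVL multiple). Both the sensitivity assumption and the starting price are illustrative:

```python
def scenario_price(token_price, tvl_change, beta=1.0):
    """Project token price under a TVL scenario, assuming price moves
    with TVL at sensitivity `beta` (1.0 = constant P/TVL multiple)."""
    return token_price * (1 + tvl_change * beta)

price = 7.50  # hypothetical current token price
cases = {
    "bull": scenario_price(price, +0.50),  # +50% TVL growth
    "base": scenario_price(price, +0.10),  # current growth trend
    "bear": scenario_price(price, -0.30),  # -30% TVL contraction
}
```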

Step 5: Historical Data Management

Store Analysis Results:

"Save analysis data to MongoDB:
1. Create collection 'protocol_analyses'
2. Store daily TVL snapshots
3. Save calculated ratios
4. Track recommendation history
5. Include metadata and timestamps"
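
A snapshot of the kind described above is a plain document in MongoDB. This sketch builds one in Python; the field names and the `protocol_analyses` collection come from this guide's suggested schema and are assumptions, while the commented-out insert uses the standard pymongo call:

```python
from datetime import datetime, timezone

def build_snapshot(protocol, tvl_usd, p_tvl, recommendation):
    """Daily snapshot document for the 'protocol_analyses' collection.
    Field names are illustrative; adapt to your own schema."""
    return {
        "protocol": protocol,
        "tvl_usd": tvl_usd,
        "p_tvl": p_tvl,
        "recommendation": recommendation,
        "source": "dune+brightdata",
        "captured_at": datetime.now(timezone.utc),
    }

# With a live connection (not opened here), storing it is one call:
# from pymongo import MongoClient
# db = MongoClient(MONGODB_URI)["crypto_analysis"]
# db["protocol_analyses"].insert_one(
#     build_snapshot("uniswap", 3.2e9, 1.25, "hold"))
```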

Build Time Series Database:

"Structure MongoDB for historical tracking:
Collections:
- daily_tvl_snapshots
- token_price_history
- ratio_calculations
- market_sentiment_scores
- analysis_reports"

Performance Tracking:

"Track investment recommendation performance:
1. Store initial recommendations with rationale
2. Update with actual performance
3. Calculate accuracy metrics
4. Identify successful patterns
5. Learn from missed opportunities"

Step 6: Investment Memo Creation

Executive Summary:

"Create investment memo in Notion:
1. Executive Summary with key findings
2. Market Overview section
3. Top 3 investment opportunities
4. Risk analysis for each
5. Recommended portfolio allocation"

Detailed Protocol Analysis:

"For each recommended protocol, create:
- Protocol overview and mechanism
- Financial metrics dashboard
- Competitive positioning
- Growth drivers and risks
- Price target and timeline"

Visual Reporting:

"Add charts and visualizations:
1. TVL growth charts
2. P/TVL ratio comparisons
3. Market share evolution
4. Correlation matrices
5. Risk/return scatter plots"

Step 7: Comprehensive Analysis Workflow

Weekly DeFi Market Report:

"Execute complete weekly analysis:
1. Query Dune for latest protocol metrics
2. Scrape current market prices
3. Update Excel valuation models
4. Compare to historical data in MongoDB
5. Generate weekly report in Notion"

Deep Dive Protocol Research:

"Conduct detailed analysis of [Protocol Name]:
1. Query all on-chain metrics from Dune
2. Scrape token metrics and competition
3. Build detailed Excel model
4. Analyze historical performance
5. Create investment thesis document"

Advanced Workflows

Arbitrage Opportunity Detection:

"Identify cross-exchange arbitrage:
1. Scrape prices from 5+ exchanges
2. Calculate spreads in Excel
3. Query on-chain liquidity depths
4. Store opportunities in MongoDB
5. Create alert system in Notion"
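
Step 2 of this workflow is essentially a min/max over quotes. A sketch with hypothetical bid/ask pairs (a real screen would also subtract trading fees and transfer costs, which this ignores):

```python
def best_spread(quotes):
    """Cross-exchange spread: buy at the lowest ask, sell at the highest
    bid. `quotes` maps exchange -> (bid, ask); returns
    (buy_on, sell_on, spread_pct)."""
    buy_on = min(quotes, key=lambda ex: quotes[ex][1])   # cheapest ask
    sell_on = max(quotes, key=lambda ex: quotes[ex][0])  # richest bid
    ask, bid = quotes[buy_on][1], quotes[sell_on][0]
    return buy_on, sell_on, (bid - ask) / ask * 100

quotes = {  # hypothetical (bid, ask) pairs for one token
    "exchange_a": (7.48, 7.49),
    "exchange_b": (7.55, 7.56),
    "exchange_c": (7.50, 7.52),
}
buy_on, sell_on, spread_pct = best_spread(quotes)
```

A positive spread after costs would be stored in MongoDB as an opportunity record.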

Risk Management Framework:

"Build risk monitoring system:
1. Query protocol vulnerability metrics
2. Track concentration risks
3. Monitor liquidation levels
4. Calculate VaR in Excel
5. Create risk dashboard in Notion"
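
The VaR in step 4 can be computed by the historical method: sort observed losses and read off the chosen percentile. A sketch using one common indexing convention (there are several; Excel's PERCENTILE functions interpolate instead):

```python
def historical_var(returns, confidence=0.95):
    """One-period historical VaR: the loss threshold not exceeded with
    probability `confidence`, from a sample of periodic returns.
    Returned as a positive loss fraction."""
    losses = sorted(-r for r in returns)  # losses as positive numbers, ascending
    idx = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[idx]
```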

Portfolio Optimization:

"Optimize DeFi portfolio allocation:
1. Query correlation data from Dune
2. Scrape risk-free rates
3. Build optimization model in Excel
4. Backtest with MongoDB data
5. Document strategy in Notion"

Real-Time Monitoring

Live Dashboard Creation:

"Build real-time monitoring system:
1. Set up recurring Dune queries
2. Automate price scraping every 5 min
3. Update Excel dashboards
4. Stream to MongoDB
5. Sync key metrics to Notion"

Alert System:

"Create alert conditions:
- TVL drops >10% in 24h
- P/TVL ratio below historical average
- Unusual on-chain activity
- Price divergence from fundamentals
- New protocol launches"
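
The first two conditions are simple threshold checks that can be evaluated on every MongoDB update; a sketch with illustrative function names and thresholds:

```python
def tvl_alert(tvl_now, tvl_24h_ago, threshold=0.10):
    """Fires when TVL has dropped by more than `threshold` over 24h."""
    if tvl_24h_ago <= 0:
        return False
    return (tvl_24h_ago - tvl_now) / tvl_24h_ago > threshold

def ratio_alert(p_tvl_now, p_tvl_history):
    """Fires when the current P/TVL sits below its historical average."""
    avg = sum(p_tvl_history) / len(p_tvl_history)
    return p_tvl_now < avg
```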

Compliance and Documentation

Audit Trail:

"Maintain compliance records:
1. Document all data sources used
2. Save query timestamps
3. Track model assumptions
4. Record recommendation changes
5. Archive all reports"

Regulatory Reporting:

"Prepare regulatory-compliant reports:
1. Include proper disclaimers
2. Document methodology
3. Disclose potential conflicts
4. Maintain data accuracy
5. Follow investment advisor guidelines"

Best Practices

  1. Daily Analysis Routine:
    • Morning: Check overnight DeFi changes
    • Run standardized Dune queries
    • Update price data via scraping
    • Refresh Excel models
    • Review alerts and anomalies
  2. Weekly Deep Dives:
    • Comprehensive protocol reviews
    • Model recalibration
    • Historical performance analysis
    • Strategy adjustment
    • Client report preparation
  3. Data Quality Standards:
    • Cross-verify from multiple sources
    • Document data anomalies
    • Maintain consistent methodologies
    • Regular model validation
    • Clear assumption documentation
  4. Risk Management:
    • Never rely on single data source
    • Consider smart contract risks
    • Account for liquidity constraints
    • Stress test all models
    • Maintain conservative assumptions

Troubleshooting Common Issues

Data Discrepancies:

  • Verify blockchain finality
  • Check for reorgs or forks
  • Confirm exchange API status
  • Validate scraping accuracy
  • Cross-reference multiple sources

Model Errors:

  • Debug Excel formulas
  • Check data type consistency
  • Verify calculation logic
  • Test edge cases
  • Document known limitations

Performance Issues:

  • Optimize Dune query efficiency
  • Implement caching strategies
  • Use MongoDB indexes properly
  • Limit API call frequency
  • Batch process where possible

Example Analysis Scenarios

New Protocol Evaluation:

"Evaluate a newly launched DeFi protocol:
1. Query initial TVL and growth rate
2. Analyze tokenomics and distribution
3. Compare to successful launches
4. Model potential scenarios
5. Create investment recommendation"

Market Crash Analysis:

"Analyze DeFi market during downturn:
1. Query liquidation volumes
2. Track TVL flight patterns
3. Identify resilient protocols
4. Calculate drawdown metrics
5. Find investment opportunities"
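
The drawdown metric in step 4 has a standard one-pass form, tracking the running peak of the price series:

```python
def max_drawdown(prices):
    """Largest peak-to-trough decline over a price series,
    returned as a negative fraction (e.g. -0.5 = -50%)."""
    peak = prices[0]
    worst = 0.0
    for p in prices:
        peak = max(peak, p)
        worst = min(worst, (p - peak) / peak)
    return worst
```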

Yield Strategy Optimization:

"Optimize yield farming strategies:
1. Query current APY across protocols
2. Calculate risk-adjusted returns
3. Model impermanent loss scenarios
4. Track historical yield stability
5. Recommend optimal allocations"
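
The impermanent-loss scenarios in step 3 follow from the standard closed form for a 50/50 constant-product pool, IL(r) = 2*sqrt(r)/(1 + r) - 1, where r is the relative price change between the two pooled assets. In Python:

```python
from math import sqrt

def impermanent_loss(price_ratio):
    """Impermanent loss for a 50/50 constant-product pool relative to
    holding, where `price_ratio` is p_new / p_old of one asset versus
    the other. Returns a negative fraction (e.g. -0.057 = -5.7%)."""
    return 2 * sqrt(price_ratio) / (1 + price_ratio) - 1
```

A 2x price divergence costs about 5.7% versus holding; a 4x divergence costs 20%. Comparing that against the quoted APY is the core of the risk-adjusted comparison in step 2.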