
Overview

BeaconScore (Validator Efficiency) is a transparent, comprehensive metric that measures how well your validators perform their duties. It combines attestations, block proposals, and sync committee participation into a single score.
API Endpoints: This guide covers /api/v2/ethereum/validators/performance-aggregate for summary metrics and /api/v2/ethereum/validators/performance-list for per-epoch breakdowns.
BeaconScore is designed to normalize for luck—validators with fewer block proposals aren’t unfairly penalized. This makes it ideal for comparing performance across different validators, nodes, or client configurations.

Why Monitor Performance?

Identify Issues Early

Spot underperforming validators before they significantly impact your rewards.

Compare Configurations

Evaluate different client software, hardware setups, or network configurations.

Client Reporting

Provide transparent performance metrics to your staking customers.

Optimize Operations

Track improvements over time and validate infrastructure changes.

BeaconScore Components

BeaconScore integrates three components weighted by their contribution to validator rewards:
Component          Weight    Description
Attestations       84.4%     Head, source, and target votes each epoch (~6.4 min)
Block Proposals    12.5%     CL rewards from proposed blocks (luck-normalized)
Sync Committees    3.1%      Participation when elected to sync committee
Learn more about how each component is calculated in Efficiency / BeaconScore.
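As a rough illustration of how the weights combine, the snippet below assumes BeaconScore is a simple weighted sum of the three component scores; the precise formula is described in Efficiency / BeaconScore.

# Illustrative sketch only: assumes a plain weighted sum of the component
# scores using the weights from the table above.
WEIGHTS = {"attestation": 0.844, "proposal": 0.125, "sync_committee": 0.031}

def combined_beaconscore(components: dict) -> float:
    return sum(WEIGHTS[name] * score for name, score in components.items())

# Component scores taken from the example response in the Quick Start below
components = {"attestation": 0.9952, "proposal": 0.9876, "sync_committee": 0.9991}
print(f"{combined_beaconscore(components):.4f}")
# prints 0.9944 -- close to the 0.9945 total reported by the API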

Benchmark Values

BeaconScore        Rating       Action
≥ 99.5%            Excellent    Optimal performance
99.0% - 99.5%      Good         Within acceptable range
98.0% - 99.0%      Fair         Minor issues, monitor closely
< 98.0%            Poor         Investigate immediately
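If you want to map scores to these ratings in your own tooling, a small helper like this one (thresholds taken directly from the table above) is enough:

def rate_beaconscore(score: float) -> str:
    """Map a BeaconScore in the range 0.0-1.0 to the benchmark rating above."""
    if score >= 0.995:
        return "Excellent"
    if score >= 0.990:
        return "Good"
    if score >= 0.980:
        return "Fair"
    return "Poor"

print(rate_beaconscore(0.9945))  # Good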

Quick Start: Check Your Performance

Get the aggregated BeaconScore for your validators over the last 30 days:
curl --request POST \
  --url https://beaconcha.in/api/v2/ethereum/validators/performance-aggregate \
  --header 'Authorization: Bearer <YOUR_API_KEY>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "chain": "mainnet",
  "validator": {
    "dashboard_id": 123
  },
  "range": {
    "evaluation_window": "30d"
  }
}
'

Response

{
  "data": {
    "beaconscore": {
      "total": 0.9945,
      "attestation": 0.9952,
      "proposal": 0.9876,
      "sync_committee": 0.9991
    },
    "duties": {
      "attestation": {
        "included": 7983797,
        "assigned": 7985250,
        "missed": 1453
      },
      "proposal": {
        "successful": 229,
        "assigned": 231,
        "missed": 2
      },
      "sync_committee": {
        "successful": 134833,
        "assigned": 135818,
        "missed": 985
      }
    }
  },
  "range": {
    "epoch": { "start": 407453, "end": 414202 }
  }
}
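Besides the headline score, the duties object lets you derive raw fulfilment rates yourself. A minimal sketch using the counts from the example response above:

# Duty counts copied from the example response above; attestations report
# "included" while proposals and sync committees report "successful".
duties = {
    "attestation":    {"included": 7983797, "assigned": 7985250},
    "proposal":       {"successful": 229, "assigned": 231},
    "sync_committee": {"successful": 134833, "assigned": 135818},
}

for duty, counts in duties.items():
    fulfilled = counts.get("included", counts.get("successful", 0))
    rate = fulfilled / counts["assigned"]
    print(f"{duty}: {fulfilled}/{counts['assigned']} ({rate:.2%} fulfilled)")
# attestation ≈ 99.98%, proposal ≈ 99.13%, sync_committee ≈ 99.27%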

Compare Performance Across Groups

Use Dashboard Groups to compare performance across different nodes, clients, or configurations:
import requests

API_KEY = "<YOUR_API_KEY>"
DASHBOARD_ID = 123

# Define groups (e.g., by client or node)
GROUPS = {
    "Lighthouse Node": 1,
    "Prysm Node": 2,
    "Teku Node": 3
}

def get_performance(dashboard_id: int, group_id: int, window: str = "30d"):
    response = requests.post(
        "https://beaconcha.in/api/v2/ethereum/validators/performance-aggregate",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json"
        },
        json={
            "chain": "mainnet",
            "validator": {
                "dashboard_id": dashboard_id,
                "group_id": group_id
            },
            "range": {"evaluation_window": window}
        }
    )
    return response.json()

# Compare all groups
print("30-Day BeaconScore Comparison")
print("=" * 40)

results = []
for name, group_id in GROUPS.items():
    data = get_performance(DASHBOARD_ID, group_id)
    score = data.get("data", {}).get("beaconscore", {}).get("total", 0) * 100
    results.append((name, score))
    print(f"{name}: {score:.2f}%")

# Find best performer
results.sort(key=lambda x: x[1], reverse=True)
print(f"\nBest: {results[0][0]} ({results[0][1]:.2f}%)")

Per-Epoch Performance History

For detailed analysis, query performance for specific epochs:
curl --request POST \
  --url https://beaconcha.in/api/v2/ethereum/validators/performance-list \
  --header 'Authorization: Bearer <YOUR_API_KEY>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "chain": "mainnet",
  "validator": {
    "validator_identifiers": [1, 2, 3]
  },
  "epoch": 413950,
  "page_size": 100
}
'
The epoch parameter is required for the performance-list endpoint. Use this to investigate specific time periods or track performance over time.
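To build a short per-epoch history, repeat the same request over a range of epochs. A minimal sketch (the shape of the list response is not reproduced here, so inspect response.json() for the fields you need):

import requests

API_KEY = "<YOUR_API_KEY>"
URL = "https://beaconcha.in/api/v2/ethereum/validators/performance-list"

def get_epoch_performance(epoch: int, validators: list, page_size: int = 100) -> dict:
    """Fetch the per-epoch performance breakdown for a single epoch."""
    response = requests.post(
        URL,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        json={
            "chain": "mainnet",
            "validator": {"validator_identifiers": validators},
            "epoch": epoch,
            "page_size": page_size,
        },
    )
    response.raise_for_status()
    return response.json()

# Roughly the last ten epochs (~64 minutes at ~6.4 min per epoch)
history = {epoch: get_epoch_performance(epoch, [1, 2, 3]) for epoch in range(413941, 413951)}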

Available Evaluation Windows

Window      Description
24h         Last 24 hours (rolling)
7d          Last 7 days (rolling)
30d         Last 30 days (rolling)
90d         Last 90 days (rolling)
all_time    Since validator activation
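To see how the same validator set trends across horizons, repeat the aggregate request from the Quick Start once per window (only evaluation_window changes):

import requests

API_KEY = "<YOUR_API_KEY>"
WINDOWS = ["24h", "7d", "30d", "90d", "all_time"]

for window in WINDOWS:
    response = requests.post(
        "https://beaconcha.in/api/v2/ethereum/validators/performance-aggregate",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        json={
            "chain": "mainnet",
            "validator": {"dashboard_id": 123},
            "range": {"evaluation_window": window},
        },
    )
    score = response.json().get("data", {}).get("beaconscore", {}).get("total", 0)
    print(f"{window:>8}: {score:.2%}")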

Best Practices

Use 30-Day Windows

Short windows are noisy due to luck. Use 30d or longer for meaningful comparisons.

Group by Infrastructure

Create dashboard groups for each node, client, or geographic location.

Set Alerts

Configure notifications for performance drops (a minimal threshold-check sketch follows these practices).

Regular Reviews

Schedule weekly or monthly performance reviews to catch gradual degradation.
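The alert check referenced under "Set Alerts" can be as simple as comparing the 24h aggregate against a threshold. The sketch below reuses the get_performance helper from the comparison script above; send_alert is a hypothetical stand-in for whatever notifier (email, Slack, PagerDuty) you actually use:

THRESHOLD = 0.98  # the "Poor" boundary from the benchmark table

def send_alert(message: str) -> None:
    # Hypothetical placeholder: wire this up to your own notification channel
    print(f"ALERT: {message}")

data = get_performance(DASHBOARD_ID, group_id=1, window="24h")
score = data.get("data", {}).get("beaconscore", {}).get("total", 0)
if score < THRESHOLD:
    send_alert(f"BeaconScore dropped to {score:.2%} over the last 24h")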

Deep Dive Guides


For detailed API specifications, see the BeaconScore & Performance section in the V2 API Docs sidebar.