MarioGPT Quick Reference Guide

🚀 Quick Start Commands

Installation

# Basic installation
pip install -r requirements.txt

# With extras
pip install -e ".[gradio,mcp]"

# Development mode
pip install -e ".[dev,gradio,mcp]"

Running

Gradio Web Interface:

python app.py
# Visit: http://localhost:7860

MCP Server:

python mcp_server.py

Tests:

python test_mcp_server.py

๐Ÿ“ Level Generation Parameters

Prompt Components

| Component | Options | Description |
|-----------|---------|-------------|
| Pipes | no, little, some, many | Number of pipes in level |
| Enemies | no, little, some, many | Enemy density |
| Blocks | little, some, many | Platform/block count |
| Elevation | low, high | Vertical platforming |

Example Prompts

"many pipes, some enemies, low elevation"
"no pipes, many enemies, high elevation"  
"some pipes, some enemies, some blocks, low elevation"

Advanced Parameters

| Parameter | Range | Default | Description |
|-----------|-------|---------|-------------|
| temperature | 0.1-2.0 | 2.0 | Higher = more diverse |
| level_size | 100-2799 | 1399 | Level length in tokens |
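
For programmatic use, the sketch below shows how these parameters could be passed to the bundled model. It assumes a `sample()` method with the same keyword arguments as upstream MarioGPT (`prompts`, `num_steps`, `temperature`); check `supermariogpt/lm.py` for the exact signature in this package.

```python
from supermariogpt.lm import MarioLM

# Hypothetical usage, mirroring upstream MarioGPT's sampling API.
mario_lm = MarioLM()
generated = mario_lm.sample(
    prompts=["many pipes, some enemies, low elevation"],
    num_steps=1399,      # level_size equivalent (assumed to map to num_steps)
    temperature=1.5,     # higher values produce more diverse layouts
)
```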

🔌 MCP Tools

1. generate_mario_level

{
  "prompt": "many enemies, high elevation",
  "temperature": 1.5,
  "level_size": 1000
}

Returns: PNG image + text description

2. get_level_suggestions

{}

Returns: List of example prompts
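
To exercise these tools outside HuggingChat, the sketch below uses the MCP Python SDK's stdio client to spawn `mcp_server.py` and call `generate_mario_level`. Tool names and arguments come from this guide; the rest is generic MCP client boilerplate, so adapt the result handling to how the server actually returns the image.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the local server over stdio, the same way an MCP host would.
    server = StdioServerParameters(command="python", args=["mcp_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "generate_mario_level",
                arguments={
                    "prompt": "many enemies, high elevation",
                    "temperature": 1.5,
                    "level_size": 1000,
                },
            )
            print(result.content)

asyncio.run(main())
```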

🎮 Keyboard Controls

In the playable demo:

  • Arrow Keys - Move left/right
  • A - Run
  • S - Jump
  • D - Shoot fireballs

๐Ÿ› Common Issues & Fixes

CUDA Out of Memory

# Use smaller level_size
level_size = 500

# Or force CPU mode
export CUDA_VISIBLE_DEVICES=""
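
If memory pressure builds up between generations, explicitly releasing PyTorch's cached allocations can also help (plain PyTorch calls, not specific to this repo):

```python
import gc

import torch

# Drop Python references to previous outputs, then release cached GPU memory.
gc.collect()
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```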

MCP Server Won't Start

# Check dependencies
pip install mcp pydantic

# Verify Python path
which python

Static Files Missing

# Create directory
mkdir -p static

# Set permissions
chmod 755 static
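
If you would rather handle this at startup than in the shell, a standard-library Python equivalent is:

```python
from pathlib import Path

# Create static/ if it does not exist; harmless when it already does.
Path("static").mkdir(parents=True, exist_ok=True)
```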

Import Errors

# Install in editable mode
pip install -e .

# Or add to PYTHONPATH
export PYTHONPATH="${PYTHONPATH}:$(pwd)"
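
As an in-script alternative to setting PYTHONPATH, you can prepend the project root to `sys.path` before importing the package:

```python
import os
import sys

# Make the repository root importable without installing the package.
sys.path.insert(0, os.getcwd())

from supermariogpt.lm import MarioLM  # noqa: E402
```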

📊 Temperature Guide

| Temperature | Behavior | Use Case |
|-------------|----------|----------|
| 0.1-0.5 | Very consistent | Reproducible levels |
| 0.5-1.0 | Balanced | Production quality |
| 1.0-1.5 | Creative | Interesting variety |
| 1.5-2.0 | Wild | Experimental levels |
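
A quick way to compare these bands is to sample the same prompt at several temperatures. As in the earlier sketch, the `sample()` call assumes an upstream-MarioGPT-style API:

```python
from supermariogpt.lm import MarioLM

mario_lm = MarioLM()
prompt = "some pipes, some enemies, some blocks, low elevation"

# Sample one level per temperature band and keep the results for comparison.
levels = {
    temp: mario_lm.sample(prompts=[prompt], num_steps=800, temperature=temp)
    for temp in (0.5, 1.0, 1.5, 2.0)
}
```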

๐Ÿ—๏ธ Project Structure

mario-gpt/
├── app.py              # Gradio web interface
├── mcp_server.py       # MCP server for HuggingChat
├── requirements.txt    # Python dependencies
├── setup.py            # Package configuration
├── README.md           # Documentation
├── INSTALLATION.md     # Deployment guide
├── BUGFIXES.md         # Change log
├── test_mcp_server.py  # Test suite
├── mcp_config.json     # MCP configuration
├── static/             # Generated HTML files
├── data/
│   └── tiles/          # Mario tile assets
└── supermariogpt/      # Core package
    ├── lm.py           # Language model
    ├── dataset.py      # Data handling
    ├── prompter.py     # Prompt engineering
    └── utils.py        # Utilities

🔧 Environment Variables

# Force CPU mode
export CUDA_VISIBLE_DEVICES=""

# Set tile directory
export TILE_DIR="/path/to/tiles"

# Logging level
export LOG_LEVEL="DEBUG"

# Python path
export PYTHONPATH="."
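
Inside Python, these variables can be read with fallbacks; the defaults below are illustrative, not the package's actual ones:

```python
import logging
import os

# Fall back to bundled tiles and INFO logging when the variables are unset (assumed defaults).
tile_dir = os.environ.get("TILE_DIR", "data/tiles")
log_level = os.environ.get("LOG_LEVEL", "INFO")

logging.basicConfig(level=getattr(logging, log_level.upper(), logging.INFO))
```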

📦 Dependencies Overview

Core

  • torch - Deep learning framework
  • transformers - GPT-2 model
  • numpy - Numerical operations
  • scipy - Scientific computing

Web Interface

  • gradio - Web UI framework
  • fastapi - API framework
  • uvicorn - ASGI server
  • spaces - HuggingFace decorator

Image Processing

  • pillow - Image manipulation

MCP Support

  • mcp - Model Context Protocol
  • pydantic - Data validation

Development

  • pytest - Testing framework
  • black - Code formatting
  • flake8 - Linting
  • isort - Import sorting

🎯 Best Practices

For Development

  1. Use virtual environment
  2. Install in editable mode: pip install -e .
  3. Run tests before committing
  4. Format code with black
  5. Check with flake8

For Deployment

  1. Use specific package versions
  2. Enable GPU for production
  3. Set up monitoring/logging
  4. Configure persistent storage
  5. Use environment variables

For MCP Integration

  1. Test server independently first
  2. Verify configuration file syntax
  3. Check Python path in config
  4. Monitor server logs
  5. Handle errors gracefully

📚 Additional Resources

Documentation

Repositories

💡 Pro Tips

  1. Faster Generation: Use smaller level_size during testing
  2. Better Quality: Use temperature 1.0-1.5 for best results
  3. Diverse Levels: Combine different prompt styles
  4. GPU Optimization: Enable TF32 for Ampere GPUs
  5. Memory Management: Clear CUDA cache between generations (tips 4 and 5 are sketched below)
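
A minimal sketch for tips 4 and 5, using standard PyTorch switches (verify they suit your hardware):

```python
import torch

# Tip 4: allow TF32 matmuls/convolutions on Ampere or newer GPUs for faster inference.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

# Tip 5: release cached GPU memory between generations.
if torch.cuda.is_available():
    torch.cuda.empty_cache()
```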

๐Ÿ” Debugging

Enable Verbose Logging

import logging
logging.basicConfig(level=logging.DEBUG)

Check CUDA Availability

import torch
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"CUDA device: {torch.cuda.get_device_name(0)}")

Test Model Loading

from supermariogpt.lm import MarioLM
mario_lm = MarioLM()
print("Model loaded successfully!")

Test MCP Server

# Run test suite
python test_mcp_server.py

# Manual test
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' | python mcp_server.py

🎨 Example Use Cases

1. Easy Tutorial Level

Prompt: "little pipes, no enemies, little blocks, low elevation"
Temperature: 1.0
Level Size: 800

2. Challenging Action Level

Prompt: "many pipes, many enemies, some blocks, high elevation"
Temperature: 1.5
Level Size: 1500

3. Platform-Heavy Level

Prompt: "no pipes, little enemies, many blocks, high elevation"
Temperature: 1.2
Level Size: 1200

4. Experimental Level

Prompt: "many pipes, many enemies, many blocks, high elevation"
Temperature: 2.0
Level Size: 2000
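
To generate all four presets in one pass, the configurations above can be looped over; as before, `sample()` is assumed to follow the upstream MarioGPT signature:

```python
from supermariogpt.lm import MarioLM

# (prompt, temperature, level_size) triples taken from the use cases above.
PRESETS = [
    ("little pipes, no enemies, little blocks, low elevation", 1.0, 800),
    ("many pipes, many enemies, some blocks, high elevation", 1.5, 1500),
    ("no pipes, little enemies, many blocks, high elevation", 1.2, 1200),
    ("many pipes, many enemies, many blocks, high elevation", 2.0, 2000),
]

mario_lm = MarioLM()
for prompt, temperature, level_size in PRESETS:
    level = mario_lm.sample(prompts=[prompt], num_steps=level_size, temperature=temperature)
```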

Last Updated: 2024 | Version: 1.0.0