GraphNN-VR: Graph Neural Network for VR Teaching System

Built with Python, PyTorch, and PyTorch Geometric.

A Graph Neural Network (GNN) based VR teaching system that enables collaborative control of virtual characters between teachers and students. The system dynamically adjusts weights to optimize student action following and improve action accuracy.

🎯 Project Overview

This project implements a real-time VR teaching system using Graph Neural Networks to predict teacher movements based on student input data. The system features dynamic weight adjustment that optimizes the balance between teacher guidance and student autonomy.

Key Features

  • Real-time Prediction: Predicts the teacher's future positions and rotations from the student's past 1 second (50 frames) of data
  • Dynamic Weight Adjustment: Automatically adjusts weights based on the hand-distance offset from baseline
  • High Accuracy: Achieves ≥95% accuracy with <40 ms inference delay
  • Multi-model Support: Includes GNN, LSTM, and GCN baseline implementations
  • Comprehensive Visualization: Trajectory comparison plots and ablation-study heatmaps

📊 Data Specifications

Input Data Format

  • Sampling Rate: 50Hz
  • Frame Data: 32-dimensional tensor per frame
    • 0-5: Hand positions (6D) - [Left X, Y, Z, Right X, Y, Z]
    • 6-13: Hand rotations (8D) - [Left quaternion x,y,z,w, Right quaternion x,y,z,w]
    • 14-19: Hand velocities (6D) - [Left velocity X,Y,Z, Right velocity X,Y,Z]
    • 20-25: Current target ball distances (6D) - [Left to ball X,Y,Z, Right to ball X,Y,Z]
    • 26-31: Next target ball offsets (6D) - [Left offset X,Y,Z, Right offset X,Y,Z]
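For quick reference, the 32-dimensional frame layout above can be captured as named slices. This is a sketch: the constant names are illustrative, not taken from the codebase.

```python
# Named index ranges for one 32-dimensional frame,
# following the layout documented above.
HAND_POS    = slice(0, 6)    # [Left X, Y, Z, Right X, Y, Z]
HAND_ROT    = slice(6, 14)   # two quaternions in [x, y, z, w] order
HAND_VEL    = slice(14, 20)  # left/right hand velocities
BALL_DIST   = slice(20, 26)  # distances to the current target ball
NEXT_OFFSET = slice(26, 32)  # offsets to the next target ball

frame = list(range(32))         # stand-in for one real frame
left_quat = frame[HAND_ROT][:4] # first quaternion (left hand)
```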

Data Normalization

  • All positions, velocities, distances, and offsets are normalized by dividing by RADIUS (150cm)
  • Euler angles are converted to quaternions in [x, y, z, w] order
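The two steps above can be sketched as follows. The function names are illustrative, and the Euler rotation order (roll-pitch-yaw, radians) is an assumption; the project's actual convention may differ.

```python
import math

RADIUS = 150.0  # cm, normalization radius from the README

def normalize(values, radius=RADIUS):
    """Scale positions/velocities/distances/offsets into radius units."""
    return [v / radius for v in values]

def euler_to_quat_xyzw(roll, pitch, yaw):
    """Convert Euler angles (radians) to a quaternion in [x, y, z, w] order."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    w = cr * cp * cy + sr * sp * sy
    return [x, y, z, w]
```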

Dataset Structure

data/
├── processed/          # Processed .pt files
│   ├── Data_CYJ_J/
│   │   ├── player_0_All.pt  # Teacher data
│   │   └── player_1_All.pt  # Student data
│   └── ... (25 datasets)
└── raw/                # Original JSON data
    ├── Data_CYJ_J/
    ├── Data_GWX_J/
    └── ... (25 participants)

🚀 Installation

Prerequisites

  • Python 3.10+
  • CUDA-compatible GPU (optional, for acceleration)

Using Poetry (Recommended)

# Install Poetry if not already installed
curl -sSL https://install.python-poetry.org | python3 -

# Clone the repository
git clone <repository-url>
cd GraphNN-VR-v5.2

# Install dependencies
poetry install

# Activate virtual environment
poetry shell

Using pip

# Clone the repository
git clone <repository-url>
cd GraphNN-VR-v5.2

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install torch torch-geometric numpy tqdm hydra-core matplotlib

๐Ÿ“ Project Structure

GraphNN-VR-v5.2/
├── graphnn/                    # Main package
│   ├── data/                   # Data processing modules
│   │   ├── dataset.py          # Dataset classes
│   │   ├── preprocess.py       # Data preprocessing
│   │   └── gen_mean_std.py     # Statistics generation
│   ├── train/                  # Training scripts
│   │   ├── gnn_vr_2.py         # Main GNN training
│   │   ├── improved_train_gnn_vr.py
│   │   ├── gru.py              # GRU baseline
│   │   └── lstm.py             # LSTM baseline
│   ├── engine/                 # Model and inference
│   │   ├── model.py            # Model architecture
│   │   ├── eval.py             # Evaluation metrics
│   │   └── infer_sliding_gnn.py  # Inference engine
│   ├── server/                 # Real-time server
│   │   ├── gnn_socket.py       # GNN inference server
│   │   └── ue_socket.py        # UE4 communication
│   ├── utils/                  # Utilities
│   │   ├── metrics.py          # Performance metrics
│   │   └── smooth.py           # Data smoothing
│   └── viz/                    # Visualization
│       ├── plot_traj_all.py    # Trajectory plotting
│       └── plot_convergence.py # Training convergence
├── data/                       # Data directory
├── docs/                       # Documentation
├── logs/                       # Training logs
├── outputs/                    # Model outputs
└── scripts/                    # Utility scripts

🎮 Usage

Data Preprocessing

# Preprocess raw data
python -m graphnn.data.preprocess

# Generate statistics for normalization
python -m graphnn.data.gen_mean_std

Training Models

GNN Model (Main)

# Train GNN model
python -m graphnn.train.gnn_vr_2

# Train improved GNN
python -m graphnn.train.improved_train_gnn_vr

Baseline Models

# Train LSTM baseline
python -m graphnn.train.lstm

# Train GRU baseline
python -m graphnn.train.gru

Evaluation

# Evaluate model performance
python -m graphnn.engine.eval

# Run inference with sliding window
python -m graphnn.engine.infer_sliding_gnn
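Sliding-window inference keeps only the most recent 50 frames (1 second at 50 Hz) as model input. A minimal sketch of that buffering, with illustrative names (the real implementation lives in graphnn/engine/infer_sliding_gnn.py):

```python
from collections import deque

SEQ_LEN = 50  # 1 second of input at 50 Hz, per the README

class SlidingWindow:
    """Keep the most recent SEQ_LEN frames for inference."""

    def __init__(self, seq_len=SEQ_LEN):
        self.frames = deque(maxlen=seq_len)

    def push(self, frame):
        # deque drops the oldest frame automatically once full
        self.frames.append(frame)

    def ready(self):
        return len(self.frames) == self.frames.maxlen

    def as_batch(self):
        # In the real pipeline this would be stacked into a tensor
        # and fed to the trained GNN.
        return list(self.frames)

window = SlidingWindow()
for t in range(60):
    window.push([float(t)] * 32)  # fake 32-dim frames
# Only the latest 50 frames (t = 10..59) are retained.
```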

Real-time Server

# Start GNN inference server
python -m graphnn.server.gnn_socket

# Start UE4 communication server
python -m graphnn.server.ue_socket
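The servers exchange per-frame data over sockets. One plausible wire format packs the 32 floats of a frame as little-endian float32 values; this is purely an assumption for illustration, not the project's actual protocol:

```python
import struct

FRAME_DIM = 32
FMT = f"<{FRAME_DIM}f"  # 32 little-endian float32 values = 128 bytes

def encode_frame(frame):
    """Serialize one 32-dim frame for transmission."""
    return struct.pack(FMT, *frame)

def decode_frame(payload):
    """Deserialize a 128-byte payload back into a frame."""
    return list(struct.unpack(FMT, payload))

frame = [0.5] * FRAME_DIM       # 0.5 is exactly representable in float32
payload = encode_frame(frame)
```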

Visualization

# Plot trajectory comparisons
python -m graphnn.viz.plot_traj_all

# Plot training convergence
python -m graphnn.viz.plot_convergence

🎯 Performance Metrics

Metric           Target     Current
Accuracy         ≥95%       ✅
Inference Delay  <40 ms     ✅
Model Size       Optimized  ✅
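A quick way to check the <40 ms budget is to time the forward pass directly. The sketch below uses a stand-in for the model call; swap in the real inference function when profiling:

```python
import time

def timed_ms(fn, *args):
    """Run fn and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, (time.perf_counter() - start) * 1000.0

def fake_inference(window):
    # Stand-in for a real model forward pass:
    # per-dimension mean over the 50-frame window.
    return [sum(col) / len(window) for col in zip(*window)]

window = [[0.1] * 32 for _ in range(50)]
pred, ms = timed_ms(fake_inference, window)
```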

Baseline Comparison

  • GNN: Primary model with graph-based architecture
  • LSTM: Sequential baseline for comparison
  • GCN: Graph Convolutional Network baseline

🔧 Configuration

The project uses Hydra for configuration management. Key configuration files are located in the training scripts and can be customized for different experiments.

Key Parameters

  • Sequence Length: 50 frames (1 second)
  • Prediction Horizon: Future positions and rotations
  • Weight Adjustment: Dynamic based on offset from baseline
  • Normalization Radius: 150cm
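The weight-adjustment parameter above maps the hand-distance offset from baseline to a blend between teacher guidance and student autonomy. A minimal sketch of that idea, with an illustrative linear mapping (the actual rule lives in the training/inference code):

```python
def follow_weight(offset, baseline, scale=0.5, w_min=0.0, w_max=1.0):
    """Map the hand-distance offset from baseline to a guidance weight.

    Larger offsets (the student drifting from the teacher) increase the
    teacher's influence; the clamped linear mapping here is illustrative.
    """
    w = (offset - baseline) * scale
    return max(w_min, min(w_max, w))

def blend(teacher_pos, student_pos, w):
    """Weighted blend of teacher and student hand positions."""
    return [w * t + (1.0 - w) * s for t, s in zip(teacher_pos, student_pos)]
```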

📈 Results

Trajectory Prediction

The system successfully predicts teacher movements with high accuracy, enabling smooth collaborative control in VR environments.

Weight Adjustment

Dynamic weight adjustment ensures optimal balance between teacher guidance and student autonomy based on performance metrics.

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • VR teaching system research team
  • PyTorch Geometric community
  • All contributors and testers

📞 Contact

For questions and support, please open an issue on GitHub or contact the development team.


Note: This project is designed for VR teaching applications and requires appropriate VR hardware and software for full functionality.
