A Graph Neural Network (GNN)-based VR teaching system that enables collaborative control of virtual characters between teachers and students. The system dynamically adjusts weights to optimize student action following and improve action accuracy.
This project implements a real-time VR teaching system using Graph Neural Networks to predict teacher movements based on student input data. The system features dynamic weight adjustment that optimizes the balance between teacher guidance and student autonomy.
- Real-time Prediction: Predicts the teacher's future positions and rotations from the student's past 1 second (50 frames) of data
- Dynamic Weight Adjustment: Automatically adjusts weights based on hand distance offset from baseline
- High Accuracy: Achieves ≥95% accuracy with <40ms inference delay
- Multi-model Support: Includes GNN, LSTM, and GCN baseline implementations
- Comprehensive Visualization: Trajectory comparison plots and ablation study heatmaps
- Sampling Rate: 50Hz
- Frame Data: 32-dimensional tensor per frame
- 0-5: Hand positions (6D) - [Left X, Y, Z, Right X, Y, Z]
- 6-13: Hand rotations (8D) - [Left quaternion x,y,z,w, Right quaternion x,y,z,w]
- 14-19: Hand velocities (6D) - [Left velocity X,Y,Z, Right velocity X,Y,Z]
- 20-25: Current target ball distances (6D) - [Left to ball X,Y,Z, Right to ball X,Y,Z]
- 26-31: Next target ball offsets (6D) - [Left offset X,Y,Z, Right offset X,Y,Z]
- All positions, velocities, distances, and offsets are normalized by dividing by RADIUS (150cm)
- Euler angles are converted to quaternions in [x, y, z, w] order
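The per-frame layout above can be decoded with named index ranges. A minimal sketch, assuming the layout and normalization described here (the helper and dictionary names are illustrative, not from the codebase):

```python
RADIUS = 150.0  # normalization radius in cm

# Index ranges for one 32-dim frame, following the layout above
FRAME_SLICES = {
    "hand_pos":    slice(0, 6),    # left/right hand positions
    "hand_rot":    slice(6, 14),   # left/right quaternions [x, y, z, w]
    "hand_vel":    slice(14, 20),  # left/right hand velocities
    "ball_dist":   slice(20, 26),  # distances to the current target ball
    "next_offset": slice(26, 32),  # offsets to the next target ball
}

def decode_frame(frame):
    """Split a normalized 32-dim frame into named, de-normalized parts."""
    assert len(frame) == 32
    out = {}
    for name, sl in FRAME_SLICES.items():
        part = frame[sl]
        # Quaternions are unit-length and are not scaled by RADIUS
        out[name] = part if name == "hand_rot" else [v * RADIUS for v in part]
    return out
```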
```
data/
├── processed/                  # Processed .pt files
│   ├── Data_CYJ_J/
│   │   ├── player_0_All.pt    # Teacher data
│   │   └── player_1_All.pt    # Student data
│   └── ... (25 datasets)
└── raw/                        # Original JSON data
    ├── Data_CYJ_J/
    ├── Data_GWX_J/
    └── ... (25 participants)
```
- Python 3.10+
- CUDA-compatible GPU (optional, for acceleration)
```bash
# Install Poetry if not already installed
curl -sSL https://install.python-poetry.org | python3 -

# Clone the repository
git clone <repository-url>
cd GraphNN-VR-v5.2

# Install dependencies
poetry install

# Activate virtual environment
poetry shell
```

```bash
# Clone the repository
git clone <repository-url>
cd GraphNN-VR-v5.2

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install torch torch-geometric numpy tqdm hydra-core matplotlib
```

```
GraphNN-VR-v5.2/
├── graphnn/                        # Main package
│   ├── data/                       # Data processing modules
│   │   ├── dataset.py              # Dataset classes
│   │   ├── preprocess.py           # Data preprocessing
│   │   └── gen_mean_std.py         # Statistics generation
│   ├── train/                      # Training scripts
│   │   ├── gnn_vr_2.py             # Main GNN training
│   │   ├── improved_train_gnn_vr.py
│   │   ├── gru.py                  # GRU baseline
│   │   └── lstm.py                 # LSTM baseline
│   ├── engine/                     # Model and inference
│   │   ├── model.py                # Model architecture
│   │   ├── eval.py                 # Evaluation metrics
│   │   └── infer_sliding_gnn.py    # Inference engine
│   ├── server/                     # Real-time server
│   │   ├── gnn_socket.py           # GNN inference server
│   │   └── ue_socket.py            # UE4 communication
│   ├── utils/                      # Utilities
│   │   ├── metrics.py              # Performance metrics
│   │   └── smooth.py               # Data smoothing
│   └── viz/                        # Visualization
│       ├── plot_traj_all.py        # Trajectory plotting
│       └── plot_convergence.py     # Training convergence
├── data/                           # Data directory
├── docs/                           # Documentation
├── logs/                           # Training logs
├── outputs/                        # Model outputs
└── scripts/                        # Utility scripts
```
```bash
# Preprocess raw data
python -m graphnn.data.preprocess

# Generate statistics for normalization
python -m graphnn.data.gen_mean_std
```

```bash
# Train GNN model
python -m graphnn.train.gnn_vr_2

# Train improved GNN
python -m graphnn.train.improved_train_gnn_vr
```

```bash
# Train LSTM baseline
python -m graphnn.train.lstm

# Train GRU baseline
python -m graphnn.train.gru
```

```bash
# Evaluate model performance
python -m graphnn.engine.eval

# Run inference with sliding window
python -m graphnn.engine.infer_sliding_gnn
```

```bash
# Start GNN inference server
python -m graphnn.server.gnn_socket

# Start UE4 communication server
python -m graphnn.server.ue_socket
```

```bash
# Plot trajectory comparisons
python -m graphnn.viz.plot_traj_all

# Plot training convergence
python -m graphnn.viz.plot_convergence
```

| Metric | Target | Current |
|---|---|---|
| Accuracy | ≥95% | ✅ |
| Inference Delay | <40ms | ✅ |
| Model Size | Optimized | ✅ |
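One way to sanity-check the <40ms inference-delay target is to time the model's forward call directly. A framework-agnostic sketch; `measure_latency_ms` is an illustrative helper, not part of the codebase, and the stand-in workload below substitutes for a real model:

```python
import time

def measure_latency_ms(fn, example, runs=100, warmup=10):
    """Mean per-call latency of fn(example), in milliseconds."""
    for _ in range(warmup):          # warm-up runs to stabilize caches
        fn(example)
    start = time.perf_counter()
    for _ in range(runs):
        fn(example)
    return (time.perf_counter() - start) / runs * 1000.0

# Stand-in workload: sum over a flattened 50x32 input window
window = [0.0] * (50 * 32)
ms = measure_latency_ms(sum, window)
```

In practice `fn` would be the trained model's forward pass (wrapped in `torch.no_grad()`), timed on the deployment hardware.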
- GNN: Primary model with graph-based architecture
- LSTM: Sequential baseline for comparison
- GCN: Graph Convolutional Network baseline
The project uses Hydra for configuration management. Key configuration files are located in the training scripts and can be customized for different experiments.
- Sequence Length: 50 frames (1 second)
- Prediction Horizon: Future positions and rotations
- Weight Adjustment: Dynamic based on offset from baseline
- Normalization Radius: 150cm
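Under these parameters, real-time inference amounts to keeping a rolling 50-frame window that is refilled at 50Hz. A minimal sketch; the `SlidingWindow` class and its method names are illustrative, not the project's actual buffer:

```python
from collections import deque

SEQ_LEN = 50  # 1 second of frames at 50 Hz

class SlidingWindow:
    """Rolling buffer of the most recent SEQ_LEN frames (32-dim each)."""

    def __init__(self, seq_len: int = SEQ_LEN):
        # deque with maxlen drops the oldest frame automatically
        self.frames = deque(maxlen=seq_len)

    def push(self, frame):
        self.frames.append(frame)

    @property
    def ready(self) -> bool:
        """True once a full 1-second window is available."""
        return len(self.frames) == self.frames.maxlen

    def window(self):
        """Oldest-to-newest list of frames, ready for the model input."""
        return list(self.frames)

# Usage: push one frame per tick; predict once the window is full
buf = SlidingWindow()
for _ in range(60):
    buf.push([0.0] * 32)
assert buf.ready and len(buf.window()) == 50
```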
The system successfully predicts teacher movements with high accuracy, enabling smooth collaborative control in VR environments.
Dynamic weight adjustment ensures optimal balance between teacher guidance and student autonomy based on performance metrics.
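The dynamic weighting described above can be sketched as a blend whose teacher weight grows with the student's hand-distance offset from baseline. The mapping, constants, and function names here are illustrative assumptions, not the project's actual rule:

```python
def teacher_weight(offset_cm, baseline_cm=5.0, max_offset_cm=30.0):
    """Map the hand-distance offset from baseline to a teacher weight in [0, 1]."""
    if offset_cm <= baseline_cm:
        return 0.0  # student on track: full autonomy
    frac = (offset_cm - baseline_cm) / (max_offset_cm - baseline_cm)
    return min(frac, 1.0)  # large offset: full teacher guidance

def blend(teacher_pos, student_pos, w):
    """Weighted blend of teacher and student hand positions."""
    return [w * t + (1.0 - w) * s for t, s in zip(teacher_pos, student_pos)]

# Moderate offset -> partial guidance
w = teacher_weight(20.0)                        # -> 0.6
pos = blend([1.0, 1.0, 1.0], [0.0, 0.0, 0.0], w)
```

A linear ramp like this is only one choice; a smoother mapping (e.g. sigmoid) would avoid abrupt control handoffs at the thresholds.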
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- VR teaching system research team
- PyTorch Geometric community
- All contributors and testers
For questions and support, please open an issue on GitHub or contact the development team.
Note: This project is designed for VR teaching applications and requires appropriate VR hardware and software for full functionality.