This project is a web application that indexes the Altered TCG marketplace and tries to provide a more user-friendly interface to search for unique cards. It does not try to index prices for Rares and Commons, since those can be easily found on the Altered website.
You can find the live version of the app at https://market.sabotageafter.rest.
- Node.js environment (v22)
- A Docker executable (Docker Desktop for Windows/Mac, or podman on Linux)
- PostgreSQL CLI tools (`psql`, `createdb` commands). Download from postgresql.org/download.
- A text editor or IDE (I recommend VSCode)
Create a .env file in the root of the project. Paste the contents of this example file:
# used by Prisma
DATABASE_URL="postgresql://test:test@localhost:5411/test"
# used by docker-compose / pg-admin
POSTGRES_DB=test
POSTGRES_USER=test
POSTGRES_PASSWORD=test
# For the marketplace crawler - this is an internal ID for the Database, it can be anything
ALT_SESSION_NAME=my-dev-account
There is a docker-compose.yml file in the root of the project that will run a containerized PostgreSQL database along with a pgAdmin instance.
docker compose up -d
The database will run on port 5411, and the pgAdmin instance will run on port 5412. Note that this is not the standard port for PostgreSQL (which is 5432), so you will need to specify this port when connecting from the CLI.
When done, you can stop the containers with docker compose down.
createdb -p 5411 -U test -h localhost -W mydatabase
This will allow you to test the UI with a full dataset without having to run the crawler manually.
Download it from Google Drive: marketbot_export_2025-07-01_clean.7z and decompress it.
Once you have the file, you can import using the psql CLI:
psql -p 5411 -U test -h localhost -d mydatabase -f marketbot_export_2025-07-01_clean.sql
npm install
There might be some new migrations that have been added since this dump was created. You can run the following command to apply them:
npx prisma migrate dev
At this point, all the one-time setup is done. You should be able to run the app and crawler locally using the commands below.
Start the Remix dev server with:
npm run dev
The app will run on http://localhost:5173 by default, or on another port if that one is already in use; the actual URL is printed in the terminal.
Prisma Studio is a tool that allows you to view the data in your database. It uses the information from the schema.prisma file to generate a UI, making this more lightweight than using a Database GUI like pgAdmin.
You can start it with the following command:
npx prisma studio
Prisma is an ORM that generates a TypeScript library to interact with the database.
The schema.prisma file defines the structure of the database, and the relations between the tables.
The migrations folder contains the history of the changes made to the database, and any change to the schema.prisma file should be accompanied by a migration. See (Prisma Docs) Prototyping your schema for more information.
Run npx prisma migrate dev to generate a migration and apply your schema changes to the database.
To query the database, see (Prisma Docs) Querying the database or follow the examples in the codebase.
Both the Crawler and the Web UI use the same Prisma client to interact with the database.
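To illustrate the shared query style, here is a minimal sketch. The model and field names (`uniqueCard`, `priceCents`) are hypothetical, not taken from the actual schema.prisma, and a small stub stands in for the generated client so the snippet runs standalone; the real code would use `new PrismaClient()` instead.

```typescript
// Hypothetical example of the shared Prisma query style. The model and
// field names are illustrative, not the project's real schema.
type UniqueCard = { id: string; name: string; priceCents: number };

// Stub that mimics Prisma's findMany({ where: ... }) filtering shape.
const db = {
  uniqueCard: {
    findMany: async (args: {
      where: { priceCents: { lt: number } };
    }): Promise<UniqueCard[]> => {
      const rows: UniqueCard[] = [
        { id: "ALT_001", name: "Example Unique", priceCents: 350 },
        { id: "ALT_002", name: "Pricey Unique", priceCents: 9900 },
      ];
      return rows.filter((r) => r.priceCents < args.where.priceCents.lt);
    },
  },
};

// Both the crawler and the web UI would call helpers shaped like this one.
async function findUniquesUnder(maxCents: number): Promise<UniqueCard[]> {
  return db.uniqueCard.findMany({ where: { priceCents: { lt: maxCents } } });
}
```

Sharing one client (and one schema) this way keeps the crawler's writes and the UI's reads consistent with a single source of truth.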
The web UI is built with Remix, a React framework for server-side rendering (SSR). Everything related to the UI is in the app/ folder:
- `app/routes/` is the main entry point for the web UI. It lists all the URLs supported by the app.
- `app/components/` contains reusable React components.
- `app/components/ui/` contains Shadcn/ui components.
- `app/loaders` contains `loader` functions, which are only run on the server and get data from the database to render in the routes (pages). Currently some of the routes have the whole loader logic inside the route file, but they should eventually be extracted to the `loaders` folder.
- The root `/public/` folder contains static assets like images, fonts, and other files that are served as-is.
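As a rough sketch of the loader pattern, the function below returns the data a route would render. The route data shape and the injected fetch function are hypothetical; in the real app this would live in `app/loaders/`, take Remix's `LoaderFunctionArgs`, and query through the Prisma client.

```typescript
// Sketch of a server-only loader. The data source is injected so the
// snippet runs without a database; field names are illustrative.
type CardSummary = { id: string; name: string };

async function cardListLoader(
  fetchCards: () => Promise<CardSummary[]>
): Promise<{ cards: CardSummary[]; count: number }> {
  const cards = await fetchCards();
  // Remix serializes the returned object and hands it to the route
  // component via useLoaderData().
  return { cards, count: cards.length };
}
```

Keeping loaders as plain async functions like this makes them easy to unit-test without spinning up the dev server.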
This project uses Tailwind CSS for styling. The tailwind.config.ts file contains the configuration for the project, and the app/tailwind.css file contains some global styles.
This project uses Shadcn/ui for the UI components. The app/components/ui folder contains the components generated by Shadcn/ui. Prefer to use Shadcn/ui components whenever possible rather than rolling your own components. These components are based on Radix UI Primitives and provide good accessibility and consistent styling out of the box.
The market-crawler directory contains a simple Node.js application. It is broken down into several components which should each have a specific task:
- `GenericIndexer` is a class that provides a generic implementation of a crawler that can be used to enqueue and process requests. It is the base class for the other crawlers:
- `market` is the Marketplace crawler; it loads all the pages from `api.altered.gg/cards/stats` and records the Unique IDs and current offer price. This can be run regularly to get the latest offers from the Marketplace.
- `uniques` fetches Uniques' characteristics (name, effects, cost, power, etc.) from the Altered API using the `api.altered.gg/cards/<ID>` endpoint. Each Unique should only be fetched once, and the results are saved to the database.
- `refresh-token` provides an `AuthTokenService` that can be used to get a fresh token for the Altered API.
- `post-process` takes the Uniques data and breaks it down into sub-components (Trigger, Condition, Effect, etc.) so that querying the database is faster.
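The idea behind the post-process step can be sketched as follows. The `"Trigger : Effect"` text format and the function name are purely illustrative assumptions, not the crawler's actual parsing rules; see the `post-process` source for the real decomposition.

```typescript
// Hypothetical sketch: splitting an ability line into Trigger / Effect
// sub-components. Assumes a "Trigger : Effect" format with a single
// colon separator, which is an illustrative simplification.
type AbilityParts = { trigger: string; effect: string } | null;

function splitAbility(text: string): AbilityParts {
  const idx = text.indexOf(":");
  if (idx === -1) return null; // no recognizable separator
  return {
    trigger: text.slice(0, idx).trim(),
    effect: text.slice(idx + 1).trim(),
  };
}
```

Storing the parts in their own columns (or tables) lets the UI filter on, say, a trigger without scanning full ability text.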
The main files provide entry points for each of the tasks, and can be run from the command line using the corresponding npm scripts, for example:
# runs the market crawler
npm run crawler
# refreshes the Altered API token (this is also done by the crawler, so only useful for debugging)
npm run crawler-refresh-token
# fetches all the missing uniques from the Altered API
npm run crawler-get-all-uniques
# run the post-process step on all uniques in database
npm run crawler-post-process
The data/cards_min.json file contains the list of all cards in the game, for the purpose of directing the Crawler to do searches by family and faction. When a new set is released, it should be updated to include new cards so they too can be crawled.
The dev-utils/strip-card-data.ts script can help generate this file. The script uses the English files from https://github.com/PolluxTroy0/Altered-TCG-Card-Database/ and references their path at the top of the script. Download the files from that Git repository and place them in the data folder, then run the script via npm run dev:strip-card-data. It should update the cards_min.json file, which you can then commit into the repository.
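The stripping step might look like the sketch below, under the assumption that each source entry carries more fields than the crawler needs. The field names are hypothetical; see `dev-utils/strip-card-data.ts` for the real mapping.

```typescript
// Hypothetical sketch of stripping full card entries down to the minimal
// shape the crawler needs. Field names are illustrative.
type SourceCard = Record<string, unknown> & {
  id: string;
  name: string;
  faction: string;
  family: string;
};
type MinCard = { id: string; name: string; faction: string; family: string };

function stripCards(cards: SourceCard[]): MinCard[] {
  // Keep only the fields used to direct the crawler's searches.
  return cards.map(({ id, name, faction, family }) => ({
    id,
    name,
    faction,
    family,
  }));
}
```

Committing only the minimal file keeps the repository small and avoids redistributing the full upstream dataset.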