- This repo contains a simple example of how to leverage various web technologies to run a Small Language Model (SLM) right in your browser.
- You can read the accompanying Medium article here
- The tech stack includes:
- React, because it's the best frontend library ;)
- TypeScript, because I don't like JS.
- The WebGPU API, enabling the browser to tap into GPU power.
- Transformers.js for SLM inference
- Vite for bundling
- Clone this repo with:
  ```shell
  git clone --recursive [email protected]:Tarun047/linky.git
  ```
- Then we need to install and build the dependencies of Transformers.js v3 (pre-release). Once the stable version of the library is out this step won't be needed, but for now run:
  ```shell
  cd transformers.js && npm install && npm run build && cd ..
  ```
- Then we need to install the dependencies of this project by running:
  ```shell
  cd linky-ui && npm install
  ```
- Now we can launch the dev server by running:
  ```shell
  npm run dev
  ```
- Fire up a browser that supports WebGPU, such as Chrome, and head to http://localhost:5173 to experience the SLM.
- If you want to experience this straight away, you can head to the deployed site here: https://tarun047.github.io/slm-demo/