
Leveraging SLMs in your browser

  1. This repo contains a simple example of how to leverage various web technologies to run a Small Language Model (SLM) right in your browser.
  2. You can read the accompanying Medium article here.
  3. The tech stack includes:
    1. React, because it's the best frontend library ;)
    2. TypeScript, because I don't like plain JS.
    3. WebGPU APIs, which let the browser tap into GPU power.
    4. Transformers.js for SLM inference.
    5. Vite for bundling.
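To make the WebGPU + Transformers.js pairing concrete, here is a minimal TypeScript sketch: a feature check for WebGPU, plus (as comments) the general shape of a Transformers.js v3 text-generation call. The package name and model id in the comments are assumptions for illustration, not necessarily what this repo uses.

```typescript
// A navigator-like shape so the check can run (and be tested) outside a browser.
interface NavigatorLike {
  gpu?: unknown;
}

// Returns true when the WebGPU entry point is present.
// In a real page you would pass the global `navigator`.
function hasWebGPU(nav: NavigatorLike): boolean {
  return nav.gpu !== undefined;
}

// Sketch of the inference side (Transformers.js v3-style API; the model id
// below is hypothetical):
//
//   import { pipeline } from "@huggingface/transformers";
//   const generator = await pipeline(
//     "text-generation",
//     "HuggingFaceTB/SmolLM-135M-Instruct",  // hypothetical model id
//     { device: "webgpu" },
//   );
//   const out = await generator("Hello!", { max_new_tokens: 32 });
```

The guard matters because WebGPU is not yet universal: checking `navigator.gpu` first lets the app fall back (or show a message) instead of failing inside the model loader.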

How to run this locally

  1. Clone this repo with git clone --recursive git@github.com:Tarun047/linky.git
  2. Install and build the dependencies of Transformers.js v3 (pre-release). Once a stable version of the library is published this step won't be needed, but for now run cd transformers.js && npm install && npm run build && cd ..
  3. Install the dependencies of this project by running cd linky-ui && npm install
  4. Launch the dev server with npm run dev, then fire up a browser that supports WebGPU (such as Chrome) and head to http://localhost:5173 to experience the SLM.
  5. If you want to try it straight away, head to the deployed site here: https://tarun047.github.io/slm-demo/
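The steps above can be consolidated into a single shell session (a sketch, assuming an SSH GitHub remote and Vite's default port):

```shell
# Clone the repo together with the transformers.js submodule.
git clone --recursive git@github.com:Tarun047/linky.git
cd linky

# Build the pre-release Transformers.js v3 dependency from source.
cd transformers.js
npm install
npm run build
cd ..

# Install the UI's dependencies and start the dev server.
cd linky-ui
npm install
npm run dev   # then open http://localhost:5173 in a WebGPU-capable browser
```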
