Commit e66703b: add okai pages (1 parent 7b0eaa7)

4 files changed: +1169 -0 lines changed

Lines changed: 114 additions & 0 deletions

---
title: Free LLM Chat Prompts
---

## okai chat

As part of the development of [okai](/autoquery/okai-models) for generating [Blazor CRUD Apps from a text prompt](/autoquery/text-to-blazor) using your preferred AI Models, we've also made available a generic **chat** prompt that can be used as a convenient way to conduct personal research against many of the world's most popular Large Language Models - for Free!

You can start immediately using the `npx okai chat` script to ask LLMs for assistance:

:::sh
npx okai chat "command to copy a folder with rsync?"
:::

This will use the default model (currently codestral:22b) to answer your question.

### Select Preferred Model

You can also use your preferred model with the `-m <model>` flag with either the model **name** or its **alias**, e.g. you can use
[Microsoft's PHI-4 14B](https://techcommunity.microsoft.com/blog/aiplatformblog/introducing-phi-4-microsoft%E2%80%99s-newest-small-language-model-specializing-in-comple/4357090)
model with:

:::sh
npx okai -m phi chat "command to copy folder with rsync?"
:::

### List Available Models

We're actively adding more high-performing and leading experimental models as they're released.
You can view the list of available models with `ls models`:

:::sh
npx okai ls models
:::

Which at this time will return the following list of available models along with instructions for how to use them:

```txt
USAGE (5 models max):
a) OKAI_MODELS=codestral,llama3.3,flash
b) okai -models codestral,llama3.3,flash <prompt>
c) okai -m flash chat <prompt>

FREE MODELS:
claude-3-haiku (alias hakiu)
codestral:22b (alias codestral)
deepseek-r1:70b
deepseek-v3:671b (alias deepseek)
gemini-flash-1.5
gemini-flash-1.5-8b (alias flash-8b)
gemini-flash-2.0 (alias flash)
gemini-flash-lite-2.0 (alias flash-lite)
gemini-flash-thinking-2.0 (alias flash-thinking)
gemini-pro-2.0 (alias gemini-pro)
gemma2:9b (alias gemma)
gpt-3.5-turbo (alias gpt-3.5)
gpt-4o-mini
llama3.1:70b (alias llama3.1)
llama3.3:70b (alias llama3.3)
llama3:8b (alias llama3)
mistral-nemo:12b (alias mistral-nemo)
mistral-small:24b (alias mistral-small)
mistral:7b (alias mistral)
mixtral:8x22b
mixtral:8x7b (alias mixtral)
nova-lite
nova-micro
phi-4:14b (alias phi,phi-4)
qwen-plus
qwen-turbo
qwen2.5-coder:32b (alias qwen2.5-coder)
qwen2.5:72b (alias qwen2.5)
qwq:32b (alias qwq)
qwq:72b

PREMIUM MODELS: *
claude-3-5-haiku
claude-3-5-sonnet
claude-3-sonnet
deepseek-r1:671b (alias deepseek-r1)
gemini-pro-1.5
gpt-4
gpt-4-turbo
gpt-4o
mistral-large:123b
nova-pro
o1-mini
o1-preview
o3-mini
qwen-max

* requires valid license:
a) SERVICESTACK_LICENSE=<key>
b) SERVICESTACK_CERTIFICATE=<LC-XXX>
c) okai -models <premium,models> -license <license> <prompt>
```

You'll be able to use any of the high-performing, inexpensive models listed under `FREE MODELS` for Free,
whilst ServiceStack customers with an active commercial license can also use any of the more expensive
and better performing models listed under `PREMIUM MODELS` by either:

a) Setting the `SERVICESTACK_LICENSE` Environment Variable with your **License Key**
b) Setting the `SERVICESTACK_CERTIFICATE` Variable with your **License Certificate**
c) Inline using the `-license` flag with either the **License Key** or **Certificate**

### FREE for Personal Usage

To be able to maintain this as a free service we're limiting usage to a tool that developers can use for personal
assistance and research, capped at **60 requests/hour**, which should be more than enough for most
personal usage and research whilst deterring usage in automated tools.

:::tip info
Rate limiting is implemented with a sliding [Token Bucket algorithm](https://en.wikipedia.org/wiki/Token_bucket) that replenishes 1 additional request every 60s
:::
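
If you're curious how such a limit behaves, below is a minimal TypeScript sketch of a token bucket that holds 60 tokens and replenishes 1 token every 60 seconds. It's purely illustrative and assumes nothing about the actual server-side implementation:

```ts
// Minimal illustrative sketch of a token bucket limiter (not the actual okai
// server implementation): the bucket holds up to `capacity` tokens and
// restores 1 token per `refillMs` elapsed, so a full bucket allows short
// bursts while sustained usage is limited to ~60 requests/hour.
class TokenBucket {
    private tokens: number
    private lastRefill: number

    constructor(private capacity = 60, private refillMs = 60_000) {
        this.tokens = capacity
        this.lastRefill = Date.now()
    }

    tryConsume(): boolean {
        const now = Date.now()
        // Replenish 1 token for every full refill interval that has elapsed
        const refills = Math.floor((now - this.lastRefill) / this.refillMs)
        if (refills > 0) {
            this.tokens = Math.min(this.capacity, this.tokens + refills)
            this.lastRefill += refills * this.refillMs
        }
        if (this.tokens === 0) return false // rate limited
        this.tokens--
        return true
    }
}

// Usage: allow a request only when a token is available
const bucket = new TokenBucket()
if (!bucket.tryConsume()) {
    console.log('Rate limit exceeded, try again later')
}
```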

MyApp/_pages/autoquery/okai-db.md

Lines changed: 130 additions & 0 deletions

---
title: Generate CRUD APIs and UIs from existing DBs
---

A core piece of functionality in the [Text to Blazor CRUD App](/autoquery/text-to-blazor) feature is distilling an AI Prompt into TypeScript classes that can be [further customized](/autoquery/okai-models#customize-data-models)
to generate AutoQuery CRUD APIs and Admin UIs for managing the underlying RDBMS tables.

## TypeScript Data Models

Using TypeScript is an effortless way to define data models, offering a DSL-like, minimal-boilerplate format that's human-friendly to read and write. It leverages TypeScript's powerful Type System and is validated against the referenced [api.d.ts](https://okai.servicestack.com/api.d.ts) schema to provide a rich authoring experience
with strong typing and intellisense - containing all the C# Types, interfaces, and attributes used in defining APIs, DTOs and Data Models.
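
For illustration, a data model in this format is essentially a plain TypeScript class. The hypothetical `Todo` model below is illustrative only; the actual C# Types and attributes available for annotating models come from the referenced api.d.ts schema:

```ts
// Hypothetical example of a TypeScript Data Model (illustrative only)
export class Todo {
    id: number
    text: string
    isFinished: boolean
    createdDate?: Date
}
```
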
### Blueprint for Code Generation

The TypeScript Data Models serve as the blueprint for generating everything needed to support the feature
in your App, including the AutoQuery **CRUD APIs**, **Admin UIs** and **DB Migrations** that can re-create the necessary tables from scratch.

## 1. Generate RDBMS Metadata

The first step in generating TypeScript Data Models is to capture the metadata from the existing RDBMS tables, which
we can do with the `App.json` [AppTask](https://docs.servicestack.net/app-tasks) below which uses your App's configured
RDBMS connection to generate the Table Definitions for all tables in the specified RDBMS connection and schema
to the file of your choice (e.g. `App_Data/App.json`):

```csharp
AppTasks.Register("App.json", args =>
    appHost.VirtualFiles.WriteFile("App_Data/App.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:null, schema:null))));
```

This task can then be run from the command line with:

:::sh
dotnet run --AppTasks=App.json
:::

Which generates `App_Data/App.json` containing the table definition metadata for all tables in
the specified RDBMS, e.g:

```json
[
  {
    "name": "AspNetUserClaims",
    "columns": [
      {
        "columnName": "Id",
        "columnOrdinal": 0,
        "columnSize": -1,
        "numericPrecision": 0,
        "numericScale": 0,
        "isUnique": true,
        "isKey": true,
        "baseCatalogName": "techstacks",
        "baseColumnName": "Id",
        "baseSchemaName": "public",
        "baseTableName": "AspNetUserClaims",
        "dataType": "System.Int32",
        "allowDBNull": false,
        "providerType": 9,
        "isAliased": false,
        "isExpression": false,
        "isAutoIncrement": true,
        "isRowVersion": false,
        "isHidden": false,
        "isLong": false,
        "isReadOnly": false,
        "dataTypeName": "integer",
        "columnDefinition": "INTEGER PRIMARY KEY AUTOINCREMENT"
      },
    ],
  ...
]
```

### Different Connection or DB Schema

If you prefer to generate the metadata for a different connection or schema, you can create a new AppTask
with your preferred `namedConnection` and/or `schema`, e.g:

```csharp
AppTasks.Register("Sales.json", args =>
    appHost.VirtualFiles.WriteFile("Sales.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:"reports", schema:"sales"))));
```

Which you could then generate with:

:::sh
dotnet run --AppTasks=Sales.json
:::

## 2. Generate TypeScript Data Models

The next step is to generate TypeScript Data Models from the captured metadata, which can be done with the `okai` tool
by running the `convert` command with the path to the `App.json` JSON table definitions. This generates the
TypeScript Data Models to stdout, which can be redirected to a file in your **ServiceModel** project, e.g:

:::sh
npx okai convert App_Data/App.json > ../MyApp.ServiceModel/App.d.ts
:::
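
As a rough, purely illustrative sketch (the actual generated output may differ in naming and attribute annotations), the `AspNetUserClaims` table captured in the metadata above could surface in `App.d.ts` as something like:

```ts
// Illustrative sketch only; actual okai-generated models may differ
export class AspNetUserClaims {
    id: number           // "Id" INTEGER PRIMARY KEY AUTOINCREMENT
    userId: string       // remaining ASP.NET Identity columns, assumed for
    claimType?: string   // illustration (not shown in the metadata excerpt above)
    claimValue?: string
}
```
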
## 3. Generate CRUD APIs and Admin UIs

The data models defined in the `App.d.ts` TypeScript Declaration file are what drives the generation of the Data Models, APIs, DB Migrations and Admin UIs. These can be further customized by editing the TypeScript Declaration file and re-running the `okai` tool with just the filename, e.g:

:::sh
npx okai App.d.ts
:::

Which will re-generate the Data Models, APIs, DB Migrations and Admin UIs based on the updated Data Models.

![](/img/posts/okai-models/npx-okai-App.png)

:::tip
You only need to specify the `App.d.ts` TypeScript filename (i.e. not the filepath) from
anywhere within your .NET solution
:::

## Live Code Generation

If you'd prefer to see the generated code in real-time, you can add the `--watch` flag to watch the
TypeScript Declaration file for changes and automatically re-generate the files on Save:

:::sh
npx okai App.d.ts --watch
:::

<video autoplay="autoplay" loop="loop" controls>
    <source src="https://media.servicestack.com/videos/okai-watch.mp4" type="video/mp4">
</video>
