Commit 96be533

Fix Mermaid diagrams for GitHub compatibility
1 parent c21e01b

1 file changed: +24 −24 lines

docs/architecture.md

Lines changed: 24 additions & 24 deletions
@@ -36,19 +36,21 @@ Core design principles:
 
 ```mermaid
 flowchart LR
-    subgraph HW[Hardware]
-        CAM[Camera Sensor<br/>OV2640 / OV5640]
+    subgraph Hardware
+        CAM[Camera Sensor OV2640/OV5640]
     end
 
-    subgraph FW[Firmware Components]
-        CN[CameraNode<br/>components/camera_node]
-        CV[CvPipeline<br/>components/cv_pipeline]
-        SS[StreamServer (Planned)<br/>components/stream_server]
-        DRV[Drivers<br/>components/drivers]
-        UT[Utils<br/>components/utils]
+    subgraph Firmware
+        CN[CameraNode]
+        CV[CvPipeline]
+        SS[StreamServer (Planned)]
+        DRV[Drivers]
+        UT[Utils]
     end
 
-    CAM --> CN --> CV --> SS
+    CAM --> CN
+    CN --> CV
+    CV --> SS
     CV --> UT
     CN --> UT
 ```
@@ -148,12 +150,12 @@ This architecture mirrors higher‑power embedded vision stacks but optimized fo
 
 ```mermaid
 flowchart LR
-    A[Sensor Exposure] --> B[DMA Frame Buffer]
-    B --> C[CameraNode::capture_frame()]
-    C --> D[CvPipeline::process()]
-    D -->|Detections + FPS| E[UART Logs]
-    D -->|Future MJPEG| F[StreamServer]
-    E --> G[Developer / Host]
+    A[Sensor Exposure] --> B[DMA to Frame Buffer]
+    B --> C[CameraNode capture_frame]
+    C --> D[CvPipeline process]
+    D --> E[UART Logs]
+    D --> F[StreamServer]
+    E --> G[Developer or Host]
     F --> H[Remote Dashboard]
 ```
 

@@ -163,23 +165,22 @@ flowchart LR
 
 ```mermaid
 sequenceDiagram
-    participant APP as main.cpp
+    participant APP as Main
     participant CN as CameraNode
     participant CV as CvPipeline
     participant SS as StreamServer
 
     APP->>CN: init()
-    CN-->>APP: OK / Error
+    CN-->>APP: OK or Error
 
     loop Capture Loop
         APP->>CN: capture_frame()
-        CN-->>APP: frame*
-        APP->>CV: process(frame*)
+        CN-->>APP: frame
+        APP->>CV: process(frame)
         CV-->>APP: results
-        APP->>SS: (planned) pushFrame()
-        APP->>CN: release_frame(frame*)
+        APP->>SS: pushFrame() (planned)
+        APP->>CN: release_frame()
     end
-
 ```
 
 ---
@@ -250,7 +251,7 @@ Loadable pipeline configuration (JSON or struct):
 
 ## 10.3 TinyML Stage (Optional)
 Support for extremely compact embedded inference:
-- 96×96 CNNs
+- 96x96 CNNs
 - FOMO-style detection (Edge Impulse)
 - Post-threshold classification
 
@@ -261,4 +262,3 @@ Support for extremely compact embedded inference:
 This architecture enables the ESP32‑S3 to act as a **fully functional embedded vision node**, capable of
 preprocessing, analysis, and eventually streaming. It is modular, extensible, and designed for real-time
 operation under microcontroller constraints.
-
