13 Commits

Author SHA1 Message Date
Rene
efb7258bcc Comment out default environment and upload port in platformio.ini 2025-11-11 16:54:39 +01:00
Rene
c1beb4d6e3 Merge pull request #177 from csabyka/develop
Added S3 wroom n16r8 board
2025-11-11 16:51:38 +01:00
Csaba Mártha
e3a7a0c89b Added s3 wroom n16r8 2025-11-11 13:11:38 +01:00
Rene
346deda2bb Merge pull request #170 from StaticRocket/develop
platformio.ini: add m5stack-timer-cam
2025-04-20 21:09:24 +02:00
Randolph Sapp
e466d592be platformio.ini: add m5stack-timer-cam
Add the m5stack-timer-cam definitions following information from the
board documentation [1] and the board entry on platform.io [2]. Tweak
the upload speed to use a known working value. Mention the X version in
the readme as well since it uses the same board and is just a chassis
change.

[1] https://docs.m5stack.com/en/unit/timercam
[2] ec69109ed6/boards/m5stack-timer-cam.json

Signed-off-by: Randolph Sapp <rs@ti.com>
2025-04-20 02:12:52 -05:00
Rene
32cbc2479d Merge pull request #158 from rzeldent:133-esp32-cam-led-flash-not-working
133-esp32-cam-led-flash-not-working
2025-01-05 13:02:44 +01:00
Rene Zeldenthuis
e93511d0c8 Added functions for flash from API 2025-01-05 12:38:21 +01:00
Rene Zeldenthuis
8be1484386 Merge branch 'main' into develop 2024-12-29 13:27:29 +01:00
Rene Zeldenthuis
a4c6d60279 Moved dependabot 2024-12-29 11:45:53 +01:00
Rene
45d08d08ce Add files via upload 2024-12-29 11:42:17 +01:00
ColdLlama
0eb4ddfe69 Update README.md (#151)
corrected a tiny typo from "to" to "two"
2024-11-17 15:15:06 +01:00
Kaze
7a5abd3235 Update README.md (#148)
According to AI-Thinker officials, the ESP32-S is a module and not a CPU, so it was changed to the ESP32.
2024-11-12 16:35:02 +01:00
Dominik Rimpf
069e6ff2e7 Add support for M5PoECAM-W (#140)
* add support for M5PoECAM-W (U121-B)

* update documentation for newly added M5PoECAM-W

* update board values for m5poecam-w
2024-08-24 11:32:26 +02:00
43 changed files with 348 additions and 1356 deletions

View File

@@ -42,6 +42,8 @@ This software supports the following ESP32-CAM (and alike) modules:
- M5STACK_V2_PSRAM
- M5STACK_PSRAM
- M5STACK_WIDE
- M5STACK M5PoECAM-W
- M5STACK Timer CAM (Original and X)
- M5STACK
- Seeed Studio XIAO ESP32S3 SENSE
- TTGO T-CAMERA
@@ -97,10 +99,10 @@ There are a lot of boards available that are all called ESP32-CAM.
However, there are differences in CPU (type/speed/cores), how the camera is connected, presence of PSRAM or not...
To select the right board use the table below and use the configuration that is listed below for your board:
| Board | Image | CPU | SRAM | Flash | PSRAM | Camera | Extras | Manufacturer site |
| Board | Image | CPU | SRAM | Flash | PSRAM | Camera | | Site |
|--- |--- |--- |--- |--- | --- |--- |--- |--- |
| Espressif ESP32-Wrover CAM | ![img](assets/boards/esp32-wrover-cam.jpg) | ESP32 | 520KB | 4Mb | 4MB | OV2640 | | |
| AI-Thinker ESP32-CAM | ![img](assets/boards/ai-thinker-esp32-cam-ipex.jpg) ![img](assets/boards/ai-thinker-esp32-cam.jpg) | ESP32-S | 520KB | 4Mb | 4MB | OV2640 | | [https://docs.ai-thinker.com/esp32-cam](https://docs.ai-thinker.com/esp32-cam) |
| AI-Thinker ESP32-CAM | ![img](assets/boards/ai-thinker-esp32-cam-ipex.jpg) ![img](assets/boards/ai-thinker-esp32-cam.jpg) | ESP32 | 520KB | 4Mb | 4MB | OV2640 | | [https://docs.ai-thinker.com/esp32-cam](https://docs.ai-thinker.com/esp32-cam) |
| Espressif ESP-EYE | ![img](assets/boards/espressif-esp-eye.jpg) | ESP32 | 520KB | 4Mb | 4MB | OV2640 | | |
| Espressif ESP-S3-EYE | ![img](assets/boards/espressif-esps3-eye.jpg) | ESP32-S3 | 520KB | 4Mb | 4MB | OV2640 | | [https://www.espressif.com/en/products/devkits/esp-eye/overview](https://www.espressif.com/en/products/devkits/esp-eye/overview) |
| LilyGo camera module | ![img](assets/boards/lilygo-camera-module.jpg) | ESP32 Wrover | 520KB | 4Mb | 4MB | OV2640 / OV5640 | | |
@@ -151,7 +153,7 @@ cd esp32cam-rtsp
```
Next, the firmware has to be built and deployed to the ESP32.
There are to flavours to do this; using the command line or the graphical interface of Visual Studio Code.
There are two flavours to do this; using the command line or the graphical interface of Visual Studio Code.
### Using the command line
@@ -310,22 +312,23 @@ The availability of PSRAM can be seen in the HTML status overview.
Not all the boards are equipped with PSRAM:
| Board | PSRAM |
|--- |--- |
| WROVER_KIT | 8Mb |
| ESP_EYE | 8Mb |
| ESP32S3_EYE | 8Mb |
| M5STACK_PSRAM | 8Mb |
| M5STACK_V2_PSRAM | Version B only |
| M5STACK_WIDE | 8Mb |
| M5STACK_ESP32CAM | No |
| M5STACK_UNITCAM | No |
| M5STACK_UNITCAMS3 | 8Mb |
| AI_THINKER | 8Mb |
| TTGO_T_JOURNAL | No |
| ESP32_CAM_BOARD | ? |
| ESP32S2_CAM_BOARD | ? |
| ESP32S3_CAM_LCD | ? |
| Board | PSRAM |
|--------------------|----------------|
| WROVER_KIT | 8Mb |
| ESP_EYE | 8Mb |
| ESP32S3_EYE | 8Mb |
| M5STACK_PSRAM | 8Mb |
| M5STACK_V2_PSRAM | Version B only |
| M5STACK_WIDE | 8Mb |
| M5STACK_ESP32CAM | No |
| M5STACK_UNITCAM | No |
| M5STACK_UNITCAMS3 | 8Mb |
| M5STACK_M5PoECAM-W | 8MB |
| AI_THINKER | 8Mb |
| TTGO_T_JOURNAL | No |
| ESP32_CAM_BOARD | ? |
| ESP32S2_CAM_BOARD | ? |
| ESP32S3_CAM_LCD | ? |
Depending on the image resolution, framerate and quality, the PSRAM must be enabled and/or the number of frame buffers increased to keep up with the data generated by the sensor.
There are (a lot of?) boards around with faulty PSRAM. If the camera fails to initialize, this might be a reason. See on [Reddit](https://www.reddit.com/r/esp32/comments/z2hyns/i_have_a_faulty_psram_on_my_esp32cam_what_should/).
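As a side note (not part of the diff): these frame-buffer settings come from the `-D CAMERA_CONFIG_*` flags in the board definition files shown further down. Below is a minimal sketch of how they typically end up in the `esp_camera` configuration, assuming the macros from the selected board definition are available; pin assignments are omitted for brevity.

```cpp
#include <esp_camera.h>

// Illustrative sketch, not part of the diff: the -D CAMERA_CONFIG_* flags from the
// board definition decide how many frame buffers are allocated and whether they
// live in PSRAM or in internal DRAM (pin assignments omitted here).
static camera_config_t make_camera_config()
{
    camera_config_t config = {};
    config.pixel_format = PIXFORMAT_JPEG;            // sensor delivers JPEG frames
    config.frame_size = FRAMESIZE_SVGA;              // lower the resolution if PSRAM is absent or faulty
    config.jpeg_quality = 12;                        // 0-63, lower means higher quality
    config.fb_count = CAMERA_CONFIG_FB_COUNT;        // e.g. 2 on PSRAM boards, 1 on DRAM-only boards
    config.fb_location = CAMERA_CONFIG_FB_LOCATION;  // CAMERA_FB_IN_PSRAM or CAMERA_FB_IN_DRAM
    config.grab_mode = CAMERA_GRAB_LATEST;           // always grab the most recent frame
    return config;
}
```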
@@ -357,6 +360,8 @@ esp32cam-rtsp depends on PlatformIO, Bootstrap 5 and Micro-RTSP by Kevin Hester.
## Change history
- August 2024
- Added support for M5Stack M5PoECAM-W
- January 2024
- Moved settings to board definitions
- Added new boards

Binary file not shown (new image added, 36 KiB).

Binary file not shown (new image added, 18 KiB).

View File

@@ -9,6 +9,7 @@
"'-D ESP32CAM_AI_THINKER'",
"'-D BOARD_HAS_PSRAM'",
"'-mfix-esp32-psram-cache-issue'",
"'-D FLASH_LED_GPIO=4'",
"'-D USER_LED_GPIO=33'",
"'-D USER_LED_ON_LEVEL=LOW'",
"'-D CAMERA_CONFIG_PIN_PWDN=32'",
@@ -32,7 +33,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -32,7 +32,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",

View File

@@ -32,7 +32,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",

View File

@@ -34,7 +34,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",

View File

@@ -34,7 +34,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",

View File

@@ -1,6 +1,6 @@
{
"build": {
"arduino": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
@@ -32,7 +32,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=1'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -32,7 +32,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -1,6 +1,6 @@
{
"build": {
"arduino": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
@@ -28,7 +28,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=1'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_DRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D GROVE_SDA=13'",
"'-D GROVE_SCL=4'"
],

View File

@@ -30,7 +30,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D GROVE_SDA=13'",
"'-D GROVE_SCL=4'"
],

View File

@@ -1,6 +1,6 @@
{
"build": {
"arduino": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
@@ -32,7 +32,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D MICROPHONE_GPIO=32'",
"'-D GROVE_SDA=13'",
"'-D GROVE_SCL=4'"

View File

@@ -0,0 +1,68 @@
{
"build": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "default_8MB.csv"
},
"core": "esp32",
"extra_flags": [
"'-D ESP32CAM_M5STACK_M5POECAM'",
"'-D BOARD_HAS_PSRAM'",
"'-D USER_LED_GPIO=0'",
"'-D USER_LED_ON_LEVEL=LOW'",
"'-mfix-esp32-psram-cache-issue'",
"'-D CAMERA_CONFIG_PIN_PWDN=GPIO_NUM_NC'",
"'-D CAMERA_CONFIG_PIN_RESET=15'",
"'-D CAMERA_CONFIG_PIN_XCLK=27'",
"'-D CAMERA_CONFIG_PIN_SCCB_SDA=14'",
"'-D CAMERA_CONFIG_PIN_SCCB_SCL=12'",
"'-D CAMERA_CONFIG_PIN_Y9=19'",
"'-D CAMERA_CONFIG_PIN_Y8=36'",
"'-D CAMERA_CONFIG_PIN_Y7=18'",
"'-D CAMERA_CONFIG_PIN_Y6=39'",
"'-D CAMERA_CONFIG_PIN_Y5=5'",
"'-D CAMERA_CONFIG_PIN_Y4=34'",
"'-D CAMERA_CONFIG_PIN_Y3=35'",
"'-D CAMERA_CONFIG_PIN_Y2=32'",
"'-D CAMERA_CONFIG_PIN_VSYNC=22'",
"'-D CAMERA_CONFIG_PIN_HREF=26'",
"'-D CAMERA_CONFIG_PIN_PCLK=21'",
"'-D CAMERA_CONFIG_CLK_FREQ_HZ=20000000'",
"'-D CAMERA_CONFIG_LEDC_TIMER=LEDC_TIMER_0'",
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D GROVE_SDA=25'",
"'-D GROVE_SCL=33'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",
"flash_mode": "dio",
"mcu": "esp32",
"variant": "esp32"
},
"connectivity": [
"wifi",
"bluetooth",
"ethernet",
"can"
],
"debug": {
"openocd_board": "esp-wroom-32.cfg"
},
"frameworks": [
"arduino",
"espidf"
],
"name": "ESP32-CAM M5STACK M5PoECAM-W",
"upload": {
"flash_size": "16MB",
"maximum_ram_size": 327680,
"maximum_size": 16777216,
"require_upload_port": true,
"speed": 460800
},
"url": "https://docs.m5stack.com/en/unit/poecam-w",
"vendor": "M5STACK"
}

View File

@@ -1,6 +1,6 @@
{
"build": {
"arduino": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
@@ -30,7 +30,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=1'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_DRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -36,7 +36,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_DRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D I2C_MEMS_SDA=17'",
"'-D I2C_MEMS_SCL=41'",
"'-D TF_CS=9'",

View File

@@ -1,6 +1,6 @@
{
"build": {
"arduino": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
@@ -30,7 +30,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -0,0 +1,72 @@
{
"build": {
"arduino": {
"ldscript": "esp32s3_out.ld",
"partitions": "default_16MB.csv",
"memory_type": "qio_opi"
},
"core": "esp32",
"extra_flags": [
"'-D ARDUINO_ESP32S3_DEV'",
"'-D ARDUINO_USB_MODE=1'",
"'-D BOARD_HAS_PSRAM'",
"'-D ARDUINO_RUNNING_CORE=1'",
"'-D ARDUINO_EVENT_RUNNING_CORE=1'",
"'-D ARDUINO_USB_CDC_ON_BOOT=1'",
"'-D CAMERA_CONFIG_PIN_PWDN=GPIO_NUM_NC'",
"'-D CAMERA_CONFIG_PIN_RESET=GPIO_NUM_NC'",
"'-D CAMERA_CONFIG_PIN_XCLK=15'",
"'-D CAMERA_CONFIG_PIN_SCCB_SDA=4'",
"'-D CAMERA_CONFIG_PIN_SCCB_SCL=5'",
"'-D CAMERA_CONFIG_PIN_Y9=16'",
"'-D CAMERA_CONFIG_PIN_Y8=17'",
"'-D CAMERA_CONFIG_PIN_Y7=18'",
"'-D CAMERA_CONFIG_PIN_Y6=12'",
"'-D CAMERA_CONFIG_PIN_Y5=10'",
"'-D CAMERA_CONFIG_PIN_Y4=8'",
"'-D CAMERA_CONFIG_PIN_Y3=9'",
"'-D CAMERA_CONFIG_PIN_Y2=11'",
"'-D CAMERA_CONFIG_PIN_VSYNC=6'",
"'-D CAMERA_CONFIG_PIN_HREF=7'",
"'-D CAMERA_CONFIG_PIN_PCLK=13'",
"'-D CAMERA_CONFIG_CLK_FREQ_HZ=20000000'",
"'-D CAMERA_CONFIG_LEDC_TIMER=LEDC_TIMER_0'",
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "80000000L",
"flash_mode": "dio",
"psram_type": "opi",
"hwids": [
[
"0x303A",
"0x1001"
]
],
"mcu": "esp32s3",
"variant": "esp32s3"
},
"connectivity": [
"wifi"
],
"debug": {
"openocd_target": "esp32s3.cfg"
},
"frameworks": [
"arduino",
"espidf"
],
"name": "ESP32 S3 WROOM n16r8",
"upload": {
"flash_size": "16MB",
"maximum_ram_size": 327680,
"maximum_size": 16777216,
"require_upload_port": true,
"speed": 921600
},
"url": "https://docs.espressif.com/projects/esp-idf/en/latest/esp32s3/hw-reference/esp32s3/user-guide-devkitc-1.html",
"vendor": "Espressif"
}

View File

@@ -36,7 +36,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D I2C_MEMS_SDA=41'",
"'-D I2C_MEMS_SCL=42'",
"'-D TF_CS=21'",

View File

@@ -28,7 +28,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=1'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_DRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D LCD_SSD1306_PIN_SDA=21'",
"'-D LCD_SSD1306_PIN_SCL=22'",
"'-D BUTTON_RIGHT_PIN=34'",

View File

@@ -28,7 +28,7 @@
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=1'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_DRAM'",
"'-D CAMERA_CONFIG_SCCB_I2C_PORT=I2C_NUM_0'"
"'-D SCCB_I2C_PORT=I2C_NUM_0'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",

View File

@@ -0,0 +1,69 @@
{
"build": {
"arduino":{
"ldscript": "esp32_out.ld",
"partitions": "huge_app.csv"
},
"core": "esp32",
"extra_flags": [
"'-D ARDUINO_M5Stack_Timer_CAM'",
"'-D BOARD_HAS_PSRAM'",
"'-mfix-esp32-psram-cache-issue'",
"'-mfix-esp32-psram-cache-strategy=memw'",
"'-D USER_LED_GPIO=2'",
"'-D USER_LED_ON_LEVEL=HIGH'",
"'-D CAMERA_CONFIG_PIN_PWDN=GPIO_NUM_NC'",
"'-D CAMERA_CONFIG_PIN_RESET=15'",
"'-D CAMERA_CONFIG_PIN_XCLK=27'",
"'-D CAMERA_CONFIG_PIN_SCCB_SDA=25'",
"'-D CAMERA_CONFIG_PIN_SCCB_SCL=23'",
"'-D CAMERA_CONFIG_PIN_Y9=19'",
"'-D CAMERA_CONFIG_PIN_Y8=36'",
"'-D CAMERA_CONFIG_PIN_Y7=18'",
"'-D CAMERA_CONFIG_PIN_Y6=39'",
"'-D CAMERA_CONFIG_PIN_Y5=5'",
"'-D CAMERA_CONFIG_PIN_Y4=34'",
"'-D CAMERA_CONFIG_PIN_Y3=35'",
"'-D CAMERA_CONFIG_PIN_Y2=32'",
"'-D CAMERA_CONFIG_PIN_VSYNC=22'",
"'-D CAMERA_CONFIG_PIN_HREF=26'",
"'-D CAMERA_CONFIG_PIN_PCLK=21'",
"'-D CAMERA_CONFIG_CLK_FREQ_HZ=20000000'",
"'-D CAMERA_CONFIG_LEDC_TIMER=LEDC_TIMER_0'",
"'-D CAMERA_CONFIG_LEDC_CHANNEL=LEDC_CHANNEL_0'",
"'-D CAMERA_CONFIG_FB_COUNT=2'",
"'-D CAMERA_CONFIG_FB_LOCATION=CAMERA_FB_IN_PSRAM'",
"'-D SCCB_I2C_PORT=I2C_NUM_0'",
"'-D GROVE_SDA=4'",
"'-D GROVE_SCL=13'"
],
"f_cpu": "240000000L",
"f_flash": "40000000L",
"flash_mode": "dio",
"mcu": "esp32",
"variant": "m5stack_timer_cam"
},
"connectivity": [
"wifi",
"bluetooth",
"ethernet",
"can"
],
"debug": {
"openocd_board": "esp-wroom-32.cfg"
},
"frameworks": [
"arduino",
"espidf"
],
"name": "M5Stack Timer CAM",
"upload": {
"flash_size": "4MB",
"maximum_ram_size": 327680,
"maximum_size": 4194304,
"require_upload_port": true,
"speed": 115200
},
"url": "https://docs.m5stack.com/en/unit/timercam",
"vendor": "M5STACK"
}

Submodule dotnet_riscv deleted from 70e3cb657b

View File

@@ -9,12 +9,6 @@
#define OTA_PASSWORD "ESP32CAM-RTSP"
// Time servers
#define NTP_SERVER_1 "nl.pool.ntp.org"
#define NTP_SERVER_2 "europe.pool.ntp.org"
#define NTP_SERVER_3 "time.nist.gov"
#define NTP_SERVERS NTP_SERVER_1, NTP_SERVER_2, NTP_SERVER_3
#define RTSP_PORT 554
#define DEFAULT_FRAME_DURATION 200

View File

@@ -1,18 +0,0 @@
#include <stddef.h>
#include "jpg_section.h"
class jpg
{
public:
bool decode(const uint8_t *jpg, size_t size);
const jpg_section_dqt_t *quantization_table_luminance_;
const jpg_section_dqt_t *quantization_table_chrominance_;
const uint8_t *jpeg_data_start;
const uint8_t *jpeg_data_end;
private:
static const jpg_section_t *find_jpg_section(const uint8_t **ptr, const uint8_t *end, jpg_section_t::jpg_section_flag flag);
};

View File

@@ -1,107 +0,0 @@
#pragma once
#include <stddef.h>
#include <stdint.h>
// http://www.ietf.org/rfc/rfc2345.txt Each table is an array of 64 values given in zig-zag order, identical to the format used in a JFIF DQT marker segment.
constexpr size_t jpeg_quantization_table_length = 64;
typedef struct __attribute__((packed))
{
enum jpg_section_flag : uint8_t
{
DATA = 0x00,
SOF0 = 0xc0,
SOF1 = 0xc1,
SOF2 = 0xc2,
SOF3 = 0xc3,
DHT = 0xc4,
SOF5 = 0xc5,
SOF6 = 0xc6,
SOF7 = 0xc7,
JPG = 0xc8,
SOF9 = 0xc9,
SOF10 = 0xca,
SOF11 = 0xcb,
DAC = 0xcc,
SOF13 = 0xcd,
SOF14 = 0xce,
SOF15 = 0xcf,
RST0 = 0xd0,
RST1 = 0xd1,
RST2 = 0xd2,
RST3 = 0xd3,
RST4 = 0xd4,
RST5 = 0xd5,
RST6 = 0xd6,
RST7 = 0xd7,
SOI = 0xd8,
EOI = 0xd9,
SOS = 0xda,
DQT = 0xdb,
DNL = 0xdc,
DRI = 0xdd,
DHP = 0xde,
EXP = 0xdf,
APP0 = 0xe0,
APP1 = 0xe1,
APP2 = 0xe2,
APP3 = 0xe3,
APP4 = 0xe4,
APP5 = 0xe5,
APP6 = 0xe6,
APP7 = 0xe7,
APP8 = 0xe8,
APP9 = 0xe9,
APP10 = 0xea,
APP11 = 0xeb,
APP12 = 0xec,
APP13 = 0xed,
APP14 = 0xee,
APP15 = 0xef,
JPG0 = 0xf0,
JPG1 = 0xf1,
JPG2 = 0xf2,
JPG3 = 0xf3,
JPG4 = 0xf4,
JPG5 = 0xf5,
JPG6 = 0xf6,
JPG7 = 0xf7,
JPG8 = 0xf8,
JPG9 = 0xf9,
COM = 0xfe,
JPG10 = 0xfa,
JPG11 = 0xfb,
JPG12 = 0xfc,
JPG13 = 0xfd
};
const uint8_t framing; // 0xff
const jpg_section_flag flag;
const uint8_t length_msb;
const uint8_t length_lsb;
const uint8_t data[];
static bool is_valid_flag(const jpg_section_flag flag);
static const char *flag_name(const jpg_section_flag flag);
uint16_t data_length() const;
uint16_t section_length() const;
} jpg_section_t;
typedef struct __attribute__((packed)) // 0xffe0
{
char identifier[5] = {'J', 'F', 'I', 'F', 0}; // JFIF identifier, zero-terminated
uint8_t version_major = 1;
uint8_t version_minor = 1; // JFIF version 1.1
uint8_t density_units = 0; // no density units specified
uint16_t density_hor = 1;
uint16_t density_ver = 1; // density: 1 pixel "per pixel" horizontally and vertically
uint8_t thumbnail_hor = 0;
uint8_t thumbnail_ver = 0; // no thumbnail (size 0 x 0)
} jpg_section_app0_t;
typedef struct __attribute__((packed)) // 0xffdb
{
uint8_t id; // 0= quantLuminance, 1= quantChrominance
uint8_t data[jpeg_quantization_table_length];
} jpg_section_dqt_t;

View File

@@ -1,14 +0,0 @@
{
"name": "micro-jpg",
"version": "1.0.0",
"description": "JPEG library",
"keywords": "",
"repository": {
"type": "git",
"url": "https://github.com/rzeldent/"
},
"build": {
"srcDir": "src/",
"includeDir": "include/"
}
}

View File

@@ -1,111 +0,0 @@
#include <esp32-hal-log.h>
#include "jpg.h"
const jpg_section_t *jpg::find_jpg_section(const uint8_t **ptr, const uint8_t *end, jpg_section_t::jpg_section_flag flag)
{
log_d("find_jpeg_section 0x%02x (%s)", flag, jpg_section_t::flag_name(flag));
while (*ptr < end)
{
// flag, len MSB, len LSB
auto section = reinterpret_cast<const jpg_section_t *>((*ptr));
if (section->framing != 0xff)
{
log_e("Expected framing 0xff but found: 0x%02x", section->framing);
break;
}
if (!jpg_section_t::is_valid_flag(section->flag))
{
log_d("Unknown section 0x%02x", flag);
return nullptr;
}
// Advance pointer section has a length, so not SOI (0xd8) and EOI (0xd9)
*ptr += section->section_length();
if (section->flag == flag)
{
log_d("Found section 0x%02x (%s), %d bytes", flag, jpg_section_t::flag_name(section->flag), section->section_length());
return section;
}
log_d("Skipping section: 0x%02x (%s), %d bytes", section->flag, jpg_section_t::flag_name(section->flag), section->section_length());
}
// Not found
return nullptr;
}
// See https://create.stephan-brumme.com/toojpeg/
bool jpg::decode(const uint8_t *data, size_t size)
{
log_d("decode_jpeg");
// Look for start jpeg file (0xd8)
auto ptr = data;
auto end = ptr + size;
// Check for SOI (start of image) 0xff, 0xd8
if (!find_jpg_section(&ptr, end, jpg_section_t::jpg_section_flag::SOI))
{
log_e("No valid start of image marker found");
return false;
}
// First quantization table (Luminance - black & white images)
const jpg_section_t *quantization_table_section;
if (!(quantization_table_section = find_jpg_section(&ptr, end, jpg_section_t::jpg_section_flag::DQT)))
{
log_e("No quantization_table_luminance section found");
return false;
}
if (quantization_table_section->data_length() != sizeof(jpg_section_dqt_t))
{
log_w("Invalid length of quantization_table_luminance section. Expected %d but read %d", sizeof(jpg_section_dqt_t), quantization_table_section->data_length());
return false;
}
quantization_table_luminance_ = reinterpret_cast<const jpg_section_dqt_t *>(quantization_table_section->data);
// Second quantization table (Chrominance - color images)
if (!(quantization_table_section = find_jpg_section(&ptr, end, jpg_section_t::jpg_section_flag::DQT)))
{
log_w("No quantization_table_chrominance section found");
return false;
}
if (quantization_table_section->data_length() != sizeof(jpg_section_dqt_t))
{
log_w("Invalid length of quantization_table_chrominance section. Expected %d but read %d", sizeof(jpg_section_dqt_t), quantization_table_section->data_length());
return false;
}
quantization_table_chrominance_ = reinterpret_cast<const jpg_section_dqt_t *>(quantization_table_section->data);
// Start of scan
if (!find_jpg_section(&ptr, end, jpg_section_t::jpg_section_flag::SOS))
{
log_e("No start of scan section found");
return false;
}
// Start of the data sections
jpeg_data_start = ptr;
log_d("Skipping over data sections");
// Scan over all the sections. 0xff followed by not zero, is a new section
while (ptr < end - 1 && (ptr[0] != 0xff || ptr[1] == 0))
ptr++;
// Check if marker is an end of image marker
if (!find_jpg_section(&ptr, end, jpg_section_t::jpg_section_flag::EOI))
{
log_e("No end of image marker found");
return false;
}
jpeg_data_end = ptr;
log_d("Total jpeg data: %d bytes", jpeg_data_end - jpeg_data_start);
return true;
}

View File

@@ -1,154 +0,0 @@
#include "jpg_section.h"
uint16_t jpg_section_t::data_length() const
{
return (length_msb << 8) + length_lsb - sizeof(jpg_section_t::length_msb)- sizeof(jpg_section_t::length_lsb);
}
uint16_t jpg_section_t::section_length() const
{
return flag == SOI || flag == EOI ? sizeof(jpg_section_t::framing) + sizeof(jpg_section_t::flag) : sizeof(jpg_section_t::framing) + sizeof(jpg_section_t::flag) + (length_msb << 8) + length_lsb;
}
bool jpg_section_t::is_valid_flag(const jpg_section_flag flag)
{
return flag >= SOF0 && flag <= COM;
}
// from: https://www.disktuna.com/list-of-jpeg-markers/
const char *jpg_section_t::flag_name(const jpg_section_flag flag)
{
switch (flag)
{
case DATA:
return "DATA"; // DATA
case SOF0:
return "SOF0"; // Start of Frame 0 Baseline DCT
case SOF1:
return "SOF1"; // Start of Frame 1 Extended Sequential DCT
case SOF2:
return "SOF2"; // Start of Frame 2 Progressive DCT
case SOF3:
return "SOF3"; // Start of Frame 3 Lossless (sequential)
case DHT:
return "DHT"; // Define Huffman Table
case SOF5:
return "SOF5"; // Start of Frame 5 Differential sequential DCT
case SOF6:
return "SOF6"; // Start of Frame 6 Differential progressive DCT
case SOF7:
return "SOF7"; // Start of Frame 7 Differential lossless (sequential)
case JPG:
return "JPG"; // JPEG Extensions
case SOF9:
return "SOF9"; // Start of Frame 9 Extended sequential DCT, Arithmetic coding
case SOF10:
return "SOF10"; // Start of Frame 10 Progressive DCT, Arithmetic coding
case SOF11:
return "SOF11"; // Start of Frame 11 Lossless (sequential), Arithmetic coding
case DAC:
return "DAC"; // Define Arithmetic Coding
case SOF13:
return "SOF13"; // Start of Frame 13 Differential sequential DCT, Arithmetic coding
case SOF14:
return "SOF14"; // Start of Frame 14 Differential progressive DCT, Arithmetic coding
case SOF15:
return "SOF15"; // Start of Frame 15 Differential lossless (sequential), Arithmetic coding
case RST0:
return "RST0"; // Restart Marker 0
case RST1:
return "RST1"; // Restart Marker 1
case RST2:
return "RST2"; // Restart Marker 2
case RST3:
return "RST3"; // Restart Marker 3
case RST4:
return "RST4"; // Restart Marker 4
case RST5:
return "RST5"; // Restart Marker 5
case RST6:
return "RST6"; // Restart Marker 6
case RST7:
return "RST7"; // Restart Marker 7
case SOI:
return "SOI"; // Start of Image
case EOI:
return "EOI"; // End of Image
case SOS:
return "SOS"; // Start of Scan
case DQT:
return "DQT"; // Define Quantization Table
case DNL:
return "DNL"; // Define Number of Lines (Not common)
case DRI:
return "DRI"; // Define Restart Interval
case DHP:
return "DHP"; // Define Hierarchical Progression (Not common)
case EXP:
return "EXP"; // Expand Reference Component (Not common)
case APP0:
return "APP0"; // Application Segment 0 JFIF JFIF JPEG image, AVI1 Motion JPEG (MJPG)
case APP1:
return "APP1"; // Application Segment 1 EXIF Metadata, TIFF IFD format, JPEG Thumbnail (160×120) Adobe XMP
case APP2:
return "APP2"; // Application Segment 2 ICC color profile, FlashPix
case APP3:
return "APP3"; // Application Segment 3 (Not common) JPS Tag for Stereoscopic JPEG images
case APP4:
return "APP4"; // Application Segment 4 (Not common)
case APP5:
return "APP5"; // Application Segment 5 (Not common)
case APP6:
return "APP6"; // Application Segment 6 (Not common) NITF Lossles profile
case APP7:
return "APP7"; // Application Segment 7 (Not common)
case APP8:
return "APP8"; // Application Segment 8 (Not common)
case APP9:
return "APP9"; // Application Segment 9 (Not common)
case APP10:
return "APP10"; // Application Segment 10 PhoTags (Not common) ActiveObject (multimedia messages / captions)
case APP11:
return "APP11"; // Application Segment 11 (Not common) HELIOS JPEG Resources (OPI Postscript)
case APP12:
return "APP12"; // Application Segment 12 Picture Info (older digicams), Photoshop Save for Web: Ducky
case APP13:
return "APP13"; // Application Segment 13 Photoshop Save As: IRB, 8BIM, IPTC
case APP14:
return "APP14"; // Application Segment 14 (Not common)
case APP15:
return "APP15"; // Application Segment 15 (Not common)
case JPG0:
return "JPG0"; // JPEG Extension 0
case JPG1:
return "JPG1"; // JPEG Extension 1
case JPG2:
return "JPG2"; // JPEG Extension 2
case JPG3:
return "JPG3"; // JPEG Extension 3
case JPG4:
return "JPG4"; // JPEG Extension 4
case JPG5:
return "JPG5"; // JPEG Extension 5
case JPG6:
return "JPG6"; // JPEG Extension 6
case JPG7:
return "JPG7"; // SOF48 JPEG Extension 7 JPEG-LS Lossless JPEG
case JPG8:
return "JPG8"; // LSE JPEG Extension 8 JPEG-LS Extension Lossless JPEG Extension Parameters
case JPG9:
return "JPG9"; // JPEG Extension 9 (Not common)
case JPG10:
return "JPG10"; // JPEG Extension 10 (Not common)
case JPG11:
return "JPG11"; // JPEG Extension 11 (Not common)
case JPG12:
return "JPG12"; // JPEG Extension 12 (Not common)
case JPG13:
return "JPG13"; // JPEG Extension 13 (Not common)
case COM:
return "COM"; // Comment
}
return "Unknown";
}

View File

@@ -1,25 +0,0 @@
#pragma once
#include <micro_rtsp_source.h>
#include <esp_camera.h>
class micro_rtsp_camera : public micro_rtsp_source
{
public:
micro_rtsp_camera();
virtual ~micro_rtsp_camera();
esp_err_t initialize(camera_config_t *camera_config);
esp_err_t deinitialize();
virtual void update_frame();
virtual uint8_t *data() const { return fb_->buf; }
virtual size_t width() const { return fb_->width; }
virtual size_t height() const { return fb_->height; }
virtual size_t size() const { return fb_->len; }
private:
esp_err_t init_result_;
camera_fb_t *fb_;
};

View File

@@ -1,56 +0,0 @@
#pragma once
#include <map>
#include <string>
class micro_rtsp_requests
{
public:
std::string process_request(const std::string& request);
bool active() const { return stream_active_; }
private:
// enum rtsp_command
// {
// rtsp_command_unknown,
// rtsp_command_options, // OPTIONS
// rtsp_command_describe, // DESCRIBE
// rtsp_command_setup, // SETUP
// rtsp_command_play, // PLAY
// rtsp_command_teardown // TEARDOWN
// };
static const std::string available_stream_name_;
//rtsp_command parse_command(const std::string &request);
//static bool parse_cseq(const std::string &line, unsigned long &cseq);
bool parse_client_port(const std::string &request);
//bool parse_stream_url(const std::string &request);
//static std::string date_header();
static std::string handle_rtsp_error(unsigned long cseq, unsigned short code, const std::string &message);
static std::string handle_options(unsigned long cseq);
static std::string handle_describe(unsigned long cseq, const std::string &request);
std::string handle_setup(unsigned long cseq, const std::map<std::string, std::string> &request);
std::string handle_play(unsigned long cseq);
std::string handle_teardown(unsigned long cseq);
//unsigned long cseq_;
// std::string host_url_;
// unsigned short host_port_;
// std::string stream_name_;
bool tcp_transport_;
unsigned short start_client_port_;
unsigned short end_client_port_;
unsigned short rtp_streamer_port_;
unsigned short rtcp_streamer_port_;
unsigned long rtsp_session_id_;
bool stream_active_;
bool stream_stopped_;
};

View File

@@ -1,45 +0,0 @@
#pragma once
#include <Arduino.h>
#include <WiFiServer.h>
#include <string>
#include <list>
#include "micro_rtsp_camera.h"
#include "micro_rtsp_requests.h"
#include "micro_rtsp_streamer.h"
class micro_rtsp_server : WiFiServer
{
public:
micro_rtsp_server(micro_rtsp_source &source);
~micro_rtsp_server();
void begin(unsigned short port = 554);
void end();
unsigned get_frame_interval() const { return frame_interval_; }
unsigned set_frame_interval(unsigned value) { return frame_interval_ = value; }
void loop();
size_t clients() const { return clients_.size(); }
class rtsp_client : public WiFiClient, public micro_rtsp_requests
{
public:
rtsp_client(const WiFiClient &client);
~rtsp_client();
void handle_request();
};
private:
micro_rtsp_source &source_;
unsigned frame_interval_;
unsigned long next_frame_update_;
unsigned long next_check_client_;
micro_rtsp_streamer streamer_;
std::list<rtsp_client> clients_;
};

View File

@@ -1,16 +0,0 @@
#pragma once
#include <stddef.h>
#include <stdint.h>
// Interface for a video source
class micro_rtsp_source
{
public:
virtual void update_frame() = 0;
virtual uint8_t *data() const = 0;
virtual size_t width() const = 0;
virtual size_t height() const = 0;
virtual size_t size() const = 0;
};

View File

@@ -1,43 +0,0 @@
#pragma once
#include <jpg_section.h>
#include <micro_rtsp_camera.h> // Add this line to include the definition of micro_rtsp_camera
#include <micro_rtsp_structs.h>
// https://en.wikipedia.org/wiki/Maximum_transmission_unit
constexpr size_t max_wifi_mtu = 2304;
// Payload JPG - https://www.ietf.org/rfc/rfc1890.txt
constexpr uint8_t RTP_PAYLOAD_JPG = 26;
// One of the types below will be returned, the jpeg_packet_with_quantization_t for the first packet, then the jpeg_packet_t
typedef struct __attribute__((packed))
{
rtp_over_tcp_hdr_t rtp_over_tcp_hdr;
rtp_hdr_t rtp_hdr;
jpeg_hdr_t jpeg_hdr;
jpeg_hdr_qtable_t jpeg_hdr_qtable;
uint8_t quantization_table_luminance[jpeg_quantization_table_length];
uint8_t quantization_table_chrominance[jpeg_quantization_table_length];
uint8_t jpeg_data[];
} jpeg_packet_with_quantization_t;
typedef struct __attribute__((packed))
{
rtp_over_tcp_hdr_t rtp_over_tcp_hdr;
rtp_hdr_t rtp_hdr;
jpeg_hdr_t jpeg_hdr;
uint8_t jpeg_data[];
} jpeg_packet_t;
class micro_rtsp_streamer
{
public:
micro_rtsp_streamer(const micro_rtsp_source& source);
rtp_over_tcp_hdr_t *create_jpg_packet(const uint8_t *jpg_scan, const uint8_t *jpg_scan_end, uint8_t **jpg_offset, const uint32_t timestamp, const uint8_t *quantization_table_luminance, const uint8_t *quantization_table_chrominance);
private:
const micro_rtsp_source& source_;
uint32_t ssrc_;
uint16_t sequence_number_;
};

View File

@@ -1,51 +0,0 @@
#pragma once
#include <stdint.h>
// https://www.ietf.org/rfc/rfc2326#section-10.12
typedef struct __attribute__((packed))
{
char magic = '$'; // Magic encapsulation ASCII dollar sign (24 hexadecimal)
uint8_t channel; // Channel identifier
uint16_t length; // Network order
} rtp_over_tcp_hdr_t;
// RTP data header - http://www.ietf.org/rfc/rfc3550.txt
typedef struct __attribute__((packed))
{
uint16_t version : 2; // protocol version
uint16_t padding : 1; // padding flag
uint16_t extension : 1; // header extension flag
uint16_t cc : 4; // CSRC count
uint16_t marker : 1; // marker bit
uint16_t pt : 7; // payload type
uint16_t seq : 16; // sequence number
uint32_t ts; // timestamp
uint32_t ssrc; // synchronization source
} rtp_hdr_t;
// https://datatracker.ietf.org/doc/html/rfc2435
typedef struct __attribute__((packed))
{
uint32_t tspec : 8; // type-specific field
uint32_t off : 24; // fragment byte offset
uint8_t type; // id of jpeg decoder params
uint8_t q; // Q values 0-127 indicate the quantization tables. JPEG types 0 and 1 (and their corresponding types 64 and 65)
uint8_t width; // frame width in 8 pixel blocks
uint8_t height; // frame height in 8 pixel blocks
} jpeg_hdr_t;
typedef struct __attribute__((packed))
{
uint16_t dri;
uint16_t f : 1;
uint16_t l : 1;
uint16_t count : 14;
} jpeg_hdr_rst_t;
typedef struct __attribute__((packed))
{
uint8_t mbz;
uint8_t precision;
uint16_t length;
} jpeg_hdr_qtable_t;

View File

@@ -1,20 +0,0 @@
{
"name": "micro-rtsp-streamer",
"version": "1.0.0",
"description": "RTSP Server",
"keywords": "",
"repository": {
"type": "git",
"url": "https://github.com/rzeldent/micro-rtsp-streamer"
},
"build": {
"srcDir": "src/",
"includeDir": "include/"
},
"dependencies": {
"micro-jpg": "^1.0.0",
"espressif/esp32-camera": "^2.0.4"
}
}

View File

@@ -1,39 +0,0 @@
#include <esp32-hal-log.h>
#include "micro_rtsp_camera.h"
micro_rtsp_camera::micro_rtsp_camera()
{
init_result_ == ESP_FAIL;
}
micro_rtsp_camera::~micro_rtsp_camera()
{
deinitialize();
}
esp_err_t micro_rtsp_camera::initialize(camera_config_t *camera_config)
{
log_v("camera_config={.pin_pwdn:%u,.pin_reset:%u,.pin_xclk:%u,.pin_sccb_sda:%u,.pin_sccb_scl:%u,.pin_d7:%u,.pin_d6:%u,.pin_d5:%u,.pin_d4:%u,.pin_d3:%u,.pin_d2:%u,.pin_d1:%u,.pin_d0:%u,.pin_vsync:%u,.pin_href:%u,.pin_pclk:%u,.xclk_freq_hz:%d,.ledc_timer:%u,ledc_channel:%u,.pixel_format:%d,.frame_size:%d,.jpeg_quality:%d,.fb_count:%d,.fb_location%d,.grab_mode:%d,sccb_i2c_port:%d}", camera_config->pin_pwdn, camera_config->pin_reset, camera_config->pin_xclk, camera_config->pin_sccb_sda, camera_config->pin_sccb_scl, camera_config->pin_d7, camera_config->pin_d6, camera_config->pin_d5, camera_config->pin_d4, camera_config->pin_d3, camera_config->pin_d2, camera_config->pin_d1, camera_config->pin_d0, camera_config->pin_vsync, camera_config->pin_href, camera_config->pin_pclk, camera_config->xclk_freq_hz, camera_config->ledc_timer, camera_config->ledc_channel, camera_config->pixel_format, camera_config->frame_size, camera_config->jpeg_quality, camera_config->fb_count, camera_config->fb_location, camera_config->grab_mode, camera_config->sccb_i2c_port);
init_result_ = esp_camera_init(camera_config);
if (init_result_ == ESP_OK)
update_frame();
else
log_e("Camera initialization failed: 0x%02x", init_result_);
return init_result_;
}
esp_err_t micro_rtsp_camera::deinitialize()
{
return init_result_ == ESP_OK ? esp_camera_deinit() : ESP_OK;
}
void micro_rtsp_camera::update_frame()
{
if (fb_)
esp_camera_fb_return(fb_);
fb_ = esp_camera_fb_get();
}

View File

@@ -1,217 +0,0 @@
#include <esp32-hal-log.h>
#include <iomanip>
#include <unordered_map>
#include <regex>
#include "micro_rtsp_requests.h"
// https://datatracker.ietf.org/doc/html/rfc2326
const std::string micro_rtsp_requests::available_stream_name_ = "/mjpeg/1";
bool micro_rtsp_requests::parse_client_port(const std::string &request)
{
log_v("request: %s", request.c_str());
std::regex regex("client_port=([0-9]+)", std::regex_constants::icase);
std::smatch match;
if (!std::regex_match(request, match, regex))
{
log_e("client_port not found");
return false;
}
start_client_port_ = std::stoi(match[1].str());
return true;
}
std::string micro_rtsp_requests::handle_rtsp_error(unsigned long cseq, unsigned short code, const std::string &message)
{
log_e("code: %d, message: %s", code, message.c_str());
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 " << code << " " << message << "\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n";
return oss.str();
}
// OPTIONS rtsp://192.168.178.247:554/mjpeg/1 RTSP/1.0
// CSeq: 2
// User-Agent: LibVLC/3.0.20 (LIVE555 Streaming Media v2016.11.28)
std::string micro_rtsp_requests::handle_options(unsigned long cseq)
{
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 200 OK\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n"
<< "Content-Length: 0\r\n"
<< "Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE\r\n"
<< "\r\n";
return oss.str();
}
// DESCRIBE rtsp://192.168.178.247:554/mjpeg/1 RTSP/1.0
// CSeq: 3
// User-Agent: LibVLC/3.0.20 (LIVE555 Streaming Media v2016.11.28)
// Accept: application/sdp
std::string micro_rtsp_requests::handle_describe(unsigned long cseq, const std::string &request)
{
// Parse the url
static const std::regex regex_url("rtsp:\\/\\/([^\\/:]+)(?::(\\d+))?(\\/.*)?\\s+RTSP\\/1\\.0", std::regex_constants::icase);
std::smatch match;
if (!std::regex_search(request, match, regex_url))
return handle_rtsp_error(cseq, 400, "Invalid URL");
auto host = match[1].str();
auto port = match[2].str().length() > 0 ? std::stoi(match[2].str()) : 554;
auto path = match[3].str();
log_i("host: %s, port: %d, path: %s", host.c_str(), port, path.c_str());
if (path != available_stream_name_)
return handle_rtsp_error(cseq, 404, "Stream Not Found");
std::ostringstream osbody;
osbody << "v=0\r\n"
<< "o=- " << std::rand() << " 1 IN IP4 " << host << "\r\n"
<< "s=\r\n"
<< "t=0 0\r\n" // start / stop - 0 -> unbounded and permanent session
<< "m=video 0 RTP/AVP 26\r\n" // currently we just handle UDP sessions
<< "c=IN IP4 0.0.0.0\r\n";
auto body = osbody.str();
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 200 OK\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n"
<< "Content-Base: rtsp://" << host << ":" << port << path << "/" << "\r\n"
<< "Content-Type: application/sdp\r\n"
<< "Content-Length: " << body.size() << "\r\n"
<< "\r\n"
<< body;
return oss.str();
}
// SETUP rtsp://192.168.178.247:554/mjpeg/1 RTSP/1.0
// CSeq: 0
// Transport: RTP/AVP;unicast;client_port=9058-9059
std::string micro_rtsp_requests::handle_setup(unsigned long cseq, const std::map<std::string, std::string> &lines)
{
log_v("request: %s", request.c_str());
auto it = lines.find("Transport");
if (it == lines.end())
return handle_rtsp_error(cseq, 400, "No Transport Header Found");
static const std::regex regex_transport("\\s+RTP\\/AVP(\\/TCP)?;unicast;client_port=(\\d+)-(\\d+)", std::regex_constants::icase);
std::smatch match;
if (!std::regex_search(it->second, match, regex_transport))
return handle_rtsp_error(cseq, 400, "Could Not Parse Transport");
tcp_transport_ = match[1].str().length() > 0;
start_client_port_ = std::stoi(match[2].str());
end_client_port_ = std::stoi(match[3].str());
log_i("tcp_transport: %d, start_client_port: %d, end_client_port: %d", tcp_transport_, start_client_port_, end_client_port_);
std::ostringstream ostransport;
if (tcp_transport_)
ostransport << "RTP/AVP/TCP;unicast;interleaved=0-1";
else
ostransport << "RTP/AVP;unicast;destination=127.0.0.1;source=127.0.0.1;client_port=" << start_client_port_ << "-" << end_client_port_ + 1 << ";server_port=" << rtp_streamer_port_ << "-" << rtp_streamer_port_/*rtcp_streamer_port_*/;
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 200 OK\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n"
<< "Transport: " << ostransport.str() << "\r\n"
<< "Session: " << rtsp_session_id_<< "\r\n";
return oss.str();
}
std::string micro_rtsp_requests::handle_play(unsigned long cseq)
{
log_v("request: %s", request.c_str());
stream_active_ = true;
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 200 OK\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n"
<< "Range: npt=0.000-\r\n"
<< "Session: " << rtsp_session_id_ << "\r\n"
<< "RTP-Info: url=rtsp://127.0.0.1:8554" << available_stream_name_ << "/track1" << "\r\n"
<< "\r\n";
return oss.str();
}
std::string micro_rtsp_requests::handle_teardown(unsigned long cseq)
{
log_v("request: %s", request.c_str());
stream_stopped_ = true;
auto now = time(nullptr);
std::ostringstream oss;
oss << "RTSP/1.0 200 OK\r\n"
<< "CSeq: " << cseq << "\r\n"
<< std::put_time(std::gmtime(&now), "Date: %a, %b %d %Y %H:%M:%S GMT") << "\r\n"
<< "\r\n";
return oss.str();
}
// Parse a request e.g.
// Request: OPTIONS rtsp://192.168.178.247:554/mjpeg/1 RTSP/1.0
// CSeq: 2
// User-Agent: LibVLC/3.0.20 (LIVE555 Streaming Media v2016.11.28)
std::string micro_rtsp_requests::process_request(const std::string &request)
{
log_v("request: %s", request.c_str());
std::stringstream ss(request);
// Get the request line
std::string request_line;
if (!std::getline(ss, request_line))
return handle_rtsp_error(0, 400, "No Request Found");
// Create a map with headers
std::string line;
std::map<std::string, std::string> headers;
std::size_t pos;
while (std::getline(ss, line))
{
if ((pos = line.find(':')) != std::string::npos)
headers[line.substr(0, pos)] = line.substr(pos + 1);
// else
// log_e("No : found for header: %s", line.c_str());
}
log_i("request_line: %s", request_line.c_str());
for (const auto &header : headers)
log_i("header: %s: %s", header.first.c_str(), header.second.c_str());
// Check for CSeq
const auto cseq_it = headers.find("CSeq");
if (cseq_it == headers.end())
return handle_rtsp_error(0, 400, "No Sequence Found");
auto cseq = std::stoul(cseq_it->second);
if (request_line.find("OPTIONS") == 0)
return handle_options(cseq);
if (request_line.find("DESCRIBE") == 0)
return handle_describe(cseq, request_line);
if (request_line.find("SETUP") == 0)
return handle_setup(cseq, headers);
if (request_line.find("PLAY") == 0)
return handle_play(cseq);
if (request_line.find("TEARDOWN") == 0)
return handle_teardown(cseq);
return handle_rtsp_error(cseq, 400, "Unknown Command or malformed request");
}

View File

@@ -1,108 +0,0 @@
#include <micro_rtsp_server.h>
#include <jpg.h>
#include <vector>
#include <memory>
// Check client connections every 100 milliseconds
#define CHECK_CLIENT_INTERVAL 10
micro_rtsp_server::micro_rtsp_server(micro_rtsp_source &source)
: source_(source), streamer_(source)
{
}
micro_rtsp_server::~micro_rtsp_server()
{
end();
}
void micro_rtsp_server::begin(unsigned short port /*= 554*/)
{
WiFiServer::begin(port);
}
void micro_rtsp_server::end()
{
WiFiServer::end();
}
void micro_rtsp_server::loop()
{
auto now = millis();
if (next_check_client_ < now)
{
log_v("Check for new client");
next_check_client_ = now + CHECK_CLIENT_INTERVAL;
// Check if a client wants to connect
auto client = accept();
if (client)
clients_.push_back(rtsp_client(client));
// Check for idle clients
clients_.remove_if([](rtsp_client &c)
{ return !c.connected(); });
for (auto client : clients_)
client.handle_request();
}
if (next_frame_update_ < now)
{
log_v("Stream frame t=%d", next_frame_update_);
next_frame_update_ = now + frame_interval_;
auto ts = time(nullptr);
// Get next jpg frame
source_.update_frame();
// Decode to get quantitation- and scan data
jpg jpg;
auto jpg_data = source_.data();
auto jpg_size = source_.size();
assert(jpg.decode(jpg_data, jpg_size));
auto jpg_scan_current = (uint8_t *)jpg.jpeg_data_start;
while (jpg_scan_current < jpg.jpeg_data_end)
{
auto packet = streamer_.create_jpg_packet(jpg.jpeg_data_start, jpg.jpeg_data_end, &jpg_scan_current, ts, jpg.quantization_table_luminance_->data, jpg.quantization_table_chrominance_->data);
for (auto client : clients_)
{
log_i("Stream frame to client: 0x%08x", client);
// RTP over TCP encapsulates in a $
client.write((const uint8_t *)packet, packet->length + sizeof(rtp_over_tcp_hdr_t));
// TODO: UDP
}
free(packet);
}
}
}
micro_rtsp_server::rtsp_client::rtsp_client(const WiFiClient &wifi_client)
: WiFiClient(wifi_client)
{
}
micro_rtsp_server::rtsp_client::~rtsp_client()
{
stop();
}
void micro_rtsp_server::rtsp_client::handle_request()
{
// Read if data available
auto bytes_available = available();
if (bytes_available > 0)
{
std::string request(bytes_available, '\0');
if (read((uint8_t *)&request[0], bytes_available) == bytes_available)
{
request.resize(bytes_available);
log_i("Request: %s", request.c_str());
auto response = process_request(request);
log_i("Response: %s", response.c_str());
println(response.c_str());
println();
}
}
}

View File

@@ -1,82 +0,0 @@
#include <stddef.h>
#include <memory.h>
#include <esp32-hal-log.h>
#include "micro_rtsp_streamer.h"
#include "esp_random.h"
micro_rtsp_streamer::micro_rtsp_streamer(const micro_rtsp_source &source)
: source_(source)
{
// Random number
ssrc_ = esp_random();
sequence_number_ = 0;
}
rtp_over_tcp_hdr_t *micro_rtsp_streamer::create_jpg_packet(const uint8_t *jpg_scan, const uint8_t *jpg_scan_end, uint8_t **jpg_offset, const uint32_t timestamp, const uint8_t *quantization_table_luminance, const uint8_t *quantization_table_chrominance)
{
log_v("jpg_scan:0x%08x, jpg_scan_end:0x%08x, jpg_offset:0x%08x, timestamp:%d, quantization_table_luminance:0x%08x, quantization_table_chrominance:0x%08x", jpg_scan, jpg_scan_end, jpg_offset, timestamp, quantization_table_luminance, quantization_table_chrominance);
// The MTU of wireless networks is 2,312 bytes. This size includes the packet headers.
const auto isFirstFragment = jpg_scan == *jpg_offset;
const auto include_quantization_tables = isFirstFragment && quantization_table_luminance != nullptr && quantization_table_chrominance != nullptr;
// Quantization tables musty be included in the first packet
const auto headers_size = include_quantization_tables ? sizeof(jpeg_packet_with_quantization_t) : sizeof(jpeg_packet_t);
const auto payload_size = max_wifi_mtu - headers_size;
const auto jpg_bytes_left = jpg_scan_end - *jpg_offset;
const bool isLastFragment = jpg_bytes_left <= payload_size;
const auto jpg_bytes = isLastFragment ? jpg_bytes_left : payload_size;
const uint16_t packet_size = headers_size + jpg_bytes;
const auto packet = static_cast<jpeg_packet_t *>(calloc(1, packet_size));
// 4 bytes RTP over TCP header
packet->rtp_over_tcp_hdr.channel = 0;
packet->rtp_over_tcp_hdr.length = packet_size;
log_v("rtp_over_tcp_hdr_t={.magic=%c,.channel=%u,.length=%u}", packet->rtp_over_tcp_hdr.magic, packet->rtp_over_tcp_hdr.channel, packet->rtp_over_tcp_hdr.length);
// 12 bytes RTP header
packet->rtp_hdr.version = 2;
packet->rtp_hdr.marker = isLastFragment;
packet->rtp_hdr.pt = RTP_PAYLOAD_JPG;
packet->rtp_hdr.seq = sequence_number_;
packet->rtp_hdr.ts = timestamp;
packet->rtp_hdr.ssrc = ssrc_;
log_v("rtp_hdr={.version:%u,.padding:%u,.extension:%u,.cc:%u,.marker:%u,.pt:%u,.seq:%u,.ts:%u,.ssrc:%u}", packet->rtp_hdr.version, packet->rtp_hdr.padding, packet->rtp_hdr.extension, packet->rtp_hdr.cc, packet->rtp_hdr.marker, packet->rtp_hdr.pt, packet->rtp_hdr.seq, packet->rtp_hdr.ts, packet->rtp_hdr.ssrc);
// 8 bytes JPEG payload header
packet->jpeg_hdr.tspec = 0; // type-specific field
packet->jpeg_hdr.off = (uint32_t)(*jpg_offset - jpg_scan); // fragment byte offset (24 bits in jpg)
packet->jpeg_hdr.type = 0; // id of jpeg decoder params
packet->jpeg_hdr.q = (uint8_t)(include_quantization_tables ? 0x80 : 0x5e); // quantization factor (or table id) 5eh=94d
packet->jpeg_hdr.width = (uint8_t)(source_.width() >> 3); // frame width in 8 pixel blocks
packet->jpeg_hdr.height = (uint8_t)(source_.height() >> 3); // frame height in 8 pixel blocks
log_v("jpeg_hdr={.tspec:%u,.off:0x%6x,.type:0x2%x,.q:%u,.width:%u.height:%u}", packet->jpeg_hdr.tspec, packet->jpeg_hdr.off, packet->jpeg_hdr.type, packet->jpeg_hdr.q, packet->jpeg_hdr.width, packet->jpeg_hdr.height);
// Only in first packet of the frame
if (include_quantization_tables)
{
auto packet_with_quantization = reinterpret_cast<jpeg_packet_with_quantization_t *>(packet);
packet_with_quantization->jpeg_hdr_qtable.mbz = 0;
packet_with_quantization->jpeg_hdr_qtable.precision = 0; // 8 bit precision
packet_with_quantization->jpeg_hdr_qtable.length = jpeg_quantization_table_length + jpeg_quantization_table_length;
log_v("jpeg_hdr_qtable={.mbz:%u,.precision:%u,.length:%u}", packet_with_quantization->jpeg_hdr_qtable.mbz, packet_with_quantization->jpeg_hdr_qtable.precision, packet_with_quantization->jpeg_hdr_qtable.length);
memcpy(packet_with_quantization->quantization_table_luminance, quantization_table_luminance, jpeg_quantization_table_length);
memcpy(packet_with_quantization->quantization_table_chrominance, quantization_table_chrominance, jpeg_quantization_table_length);
// Copy JPG data
memcpy(packet_with_quantization->jpeg_data, *jpg_offset, jpg_bytes);
}
else
{
// Copy JPG data
memcpy(packet->jpeg_data, *jpg_offset, jpg_bytes);
}
// Update JPG offset
*jpg_offset += jpg_bytes;
// Update sequence number
sequence_number_++;
return (rtp_over_tcp_hdr_t *)packet;
}

View File

@@ -10,7 +10,8 @@
###############################################################################
[platformio]
default_envs = esp32cam_ai_thinker
#default_envs = esp32cam_s3_wroom_n16r8
#default_envs = esp32cam_ai_thinker
#default_envs = esp32cam_espressif_esp_eye
#default_envs = esp32cam_espressif_esp32s2_cam_board
#default_envs = esp32cam_espressif_esp32s2_cam_header
@@ -20,36 +21,35 @@ default_envs = esp32cam_ai_thinker
#default_envs = esp32cam_m5stack_camera_psram
#default_envs = esp32cam_m5stack_camera
#default_envs = esp32cam_m5stack_esp32cam
#default_envs = esp32cam_m5stack_unitcam
#default_envs = esp32cam_m5stack_unitcams3
#default_envs = esp32cam_m5stack_wide
#default_envs = esp32cam_m5stack_m5poecam_w
#default_envs = esp32cam_seeed_xiao_esp32s3_sense
#default_envs = esp32cam_ttgo_t_camera
#default_envs = esp32cam_ttgo_t_journal
[env]
platform = espressif32
platform = espressif32@6.9.0
framework = arduino
test_framework = unity
#upload_port = /dev/tty.usbmodem*
#upload_protocol = espota
#upload_port = 192.168.178.223
#upload_flags = --auth='ESP32CAM-RTSP'
# Partition scheme for OTA
#board_build.partitions = max_spiffs.csv
board_build.partitions = min_spiffs.csv
monitor_speed = 115200
monitor_rts = 0
monitor_dtr = 0
#monitor_filters = log2file, time, default, esp32_exception_decoder
monitor_filters = esp32_exception_decoder
monitor_filters = log2file, time, default, esp32_exception_decoder
build_flags =
-Ofast
-D 'BOARD_NAME="${this.board}"'
-D 'CORE_DEBUG_LEVEL=ARDUHAL_LOG_LEVEL_INFO'
-D 'CORE_DEBUG_LEVEL=ARDUHAL_LOG_LEVEL_VERBOSE'
-D 'IOTWEBCONF_PASSWORD_LEN=64'
board_build.embed_txtfiles =
@@ -58,8 +58,7 @@ board_build.embed_txtfiles =
lib_deps =
prampec/IotWebConf@^3.2.1
geeksville/Micro-RTSP@^0.1.6
rzeldent/micro-moustache
rzeldent/micro-timezonedb
rzeldent/micro-moustache@^1.0.1
[env:esp32cam_ai_thinker]
board = esp32cam_ai_thinker
@@ -104,6 +103,9 @@ board = esp32cam_m5stack_unitcams3
[env:esp32cam_m5stack_wide]
board = esp32cam_m5stack_wide
[env:esp32cam_m5stack_m5poecam_w]
board = esp32cam_m5stack_m5poecam_w
[env:esp32cam_seeed_xiao_esp32s3_sense]
board = esp32cam_seeed_xiao_esp32s3_sense
@@ -111,4 +113,10 @@ board = esp32cam_seeed_xiao_esp32s3_sense
board = esp32cam_ttgo_t_camera
[env:esp32cam_ttgo_t_journal]
board = esp32cam_ttgo_t_journal
board = esp32cam_ttgo_t_journal
[env:m5stack-timer-cam]
board = m5stack-timer-cam
[env:esp32cam_s3_wroom_n16r8]
board = esp32cam_s3_wroom_n16r8
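For reference (not part of the diff): besides editing `default_envs`, a specific environment can be selected directly from the PlatformIO CLI, e.g. `pio run -e esp32cam_s3_wroom_n16r8 -t upload`, assuming the `pio` command-line tool is installed.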

View File

@@ -16,9 +16,6 @@
#include <moustache.h>
#include <settings.h>
#include <micro_rtsp_camera.h>
#include <micro_rtsp_server.h>
// HTML files
extern const char index_html_min_start[] asm("_binary_html_index_min_html_start");
@@ -50,15 +47,11 @@ auto param_dcw = iotwebconf::Builder<iotwebconf::CheckboxTParameter>("dcw").labe
auto param_colorbar = iotwebconf::Builder<iotwebconf::CheckboxTParameter>("cb").label("Colorbar").defaultValue(DEFAULT_COLORBAR).build();
// Camera
// OV2640 cam;
OV2640 cam;
// DNS Server
DNSServer dnsServer;
// RTSP Server
// std::unique_ptr<rtsp_server> camera_server;
micro_rtsp_camera camera;
micro_rtsp_server server(camera);
std::unique_ptr<rtsp_server> camera_server;
// Web server
WebServer web_server(80);
@@ -107,7 +100,7 @@ void handle_root()
{"Uptime", String(format_duration(millis() / 1000))},
{"FreeHeap", format_memory(ESP.getFreeHeap())},
{"MaxAllocHeap", format_memory(ESP.getMaxAllocHeap())},
{"NumRTSPSessions", String(server.clients())},
{"NumRTSPSessions", camera_server != nullptr ? String(camera_server->num_connected()) : "RTSP server disabled"},
// Network
{"HostName", hostname},
{"MacAddress", WiFi.macAddress()},
@@ -157,6 +150,20 @@ void handle_root()
web_server.send(200, "text/html", html);
}
#ifdef FLASH_LED_GPIO
void handle_flash()
{
log_v("handle_flash");
// If no value present, use off, otherwise convert v to integer. Depends on analog resolution for max value
auto v = web_server.hasArg("v") ? web_server.arg("v").toInt() : 0;
// If conversion fails, v = 0
analogWrite(FLASH_LED_GPIO, v);
web_server.sendHeader("Cache-Control", "no-cache, no-store, must-revalidate");
web_server.send(200);
}
#endif
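For reference (not part of the diff): with this handler and the `/flash` route registered further down in `setup()`, the flash LED can be controlled over HTTP with a request like `GET /flash?v=128`. The `v` argument is used as the PWM duty cycle (presumably 0-255, given the 8-bit `analogWriteResolution(8)` set in `setup()`), and omitting `v` or passing a non-numeric value turns the flash off.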
void handle_snapshot()
{
log_v("handle_snapshot");
@@ -169,10 +176,10 @@ void handle_snapshot()
// Remove old images stored in the frame buffer
auto frame_buffers = CAMERA_CONFIG_FB_COUNT;
while (frame_buffers--)
camera.update_frame();
cam.run();
auto fb_len = camera.size();
auto fb = camera.data();
auto fb_len = cam.getSize();
auto fb = (const char *)cam.getfb();
if (fb == nullptr)
{
web_server.send(404, "text/plain", "Unable to obtain frame buffer from the camera");
@@ -182,7 +189,7 @@ void handle_snapshot()
web_server.sendHeader("Cache-Control", "no-cache, no-store, must-revalidate");
web_server.setContentLength(fb_len);
web_server.send(200, "image/jpeg", "");
web_server.sendContent((const char *)fb, fb_len);
web_server.sendContent(fb, fb_len);
}
#define STREAM_CONTENT_BOUNDARY "123456789000000000000987654321"
@@ -204,11 +211,11 @@ void handle_stream()
while (client.connected())
{
client.write("\r\n--" STREAM_CONTENT_BOUNDARY "\r\n");
camera.update_frame();
cam.run();
client.write("Content-Type: image/jpeg\r\nContent-Length: ");
sprintf(size_buf, "%d\r\n\r\n", camera.size());
sprintf(size_buf, "%d\r\n\r\n", cam.getSize());
client.write(size_buf);
client.write(camera.data(), camera.size());
client.write(cam.getfb(), cam.getSize());
}
log_v("client disconnected");
@@ -225,45 +232,39 @@ esp_err_t initialize_camera()
log_i("JPEG quality: %d", param_jpg_quality.value());
auto jpeg_quality = param_jpg_quality.value();
log_i("Frame duration: %d ms", param_frame_duration.value());
// Set frame duration
server.set_frame_interval(param_frame_duration.value());
camera_config_t camera_config = {
.pin_pwdn = CAMERA_CONFIG_PIN_PWDN, // GPIO pin for camera power down line
.pin_reset = CAMERA_CONFIG_PIN_RESET, // GPIO pin for camera reset line
.pin_xclk = CAMERA_CONFIG_PIN_XCLK, // GPIO pin for camera XCLK line
.pin_sccb_sda = CAMERA_CONFIG_PIN_SCCB_SDA, // GPIO pin for camera SDA line
.pin_sccb_scl = CAMERA_CONFIG_PIN_SCCB_SCL, // GPIO pin for camera SCL line
.pin_d7 = CAMERA_CONFIG_PIN_Y9, // GPIO pin for camera D7 line
.pin_d6 = CAMERA_CONFIG_PIN_Y8, // GPIO pin for camera D6 line
.pin_d5 = CAMERA_CONFIG_PIN_Y7, // GPIO pin for camera D5 line
.pin_d4 = CAMERA_CONFIG_PIN_Y6, // GPIO pin for camera D4 line
.pin_d3 = CAMERA_CONFIG_PIN_Y5, // GPIO pin for camera D3 line
.pin_d2 = CAMERA_CONFIG_PIN_Y4, // GPIO pin for camera D2 line
.pin_d1 = CAMERA_CONFIG_PIN_Y3, // GPIO pin for camera D1 line
.pin_d0 = CAMERA_CONFIG_PIN_Y2, // GPIO pin for camera D0 line
.pin_vsync = CAMERA_CONFIG_PIN_VSYNC, // GPIO pin for camera VSYNC line
.pin_href = CAMERA_CONFIG_PIN_HREF, // GPIO pin for camera HREF line
.pin_pclk = CAMERA_CONFIG_PIN_PCLK, // GPIO pin for camera PCLK line
.xclk_freq_hz = CAMERA_CONFIG_CLK_FREQ_HZ, // Frequency of XCLK signal, in Hz. EXPERIMENTAL: Set to 16MHz on ESP32-S2 or ESP32-S3 to enable EDMA mode
.ledc_timer = CAMERA_CONFIG_LEDC_TIMER, // LEDC timer to be used for generating XCLK
.ledc_channel = CAMERA_CONFIG_LEDC_CHANNEL, // LEDC channel to be used for generating XCLK
.pixel_format = PIXFORMAT_JPEG, // Format of the pixel data: PIXFORMAT_ + YUV422|GRAYSCALE|RGB565|JPEG
.frame_size = frame_size, // Size of the output image: FRAMESIZE_ + QVGA|CIF|VGA|SVGA|XGA|SXGA|UXGA
.jpeg_quality = jpeg_quality, // Quality of JPEG output. 0-63 lower means higher quality
.fb_count = CAMERA_CONFIG_FB_COUNT, // Number of frame buffers to be allocated. If more than one, then each frame will be acquired (double speed)
.fb_location = CAMERA_CONFIG_FB_LOCATION, // The location where the frame buffer will be allocated
.grab_mode = CAMERA_GRAB_LATEST, // When buffers should be filled
const camera_config_t camera_config = {
.pin_pwdn = CAMERA_CONFIG_PIN_PWDN, // GPIO pin for camera power down line
.pin_reset = CAMERA_CONFIG_PIN_RESET, // GPIO pin for camera reset line
.pin_xclk = CAMERA_CONFIG_PIN_XCLK, // GPIO pin for camera XCLK line
.pin_sccb_sda = CAMERA_CONFIG_PIN_SCCB_SDA, // GPIO pin for camera SDA line
.pin_sccb_scl = CAMERA_CONFIG_PIN_SCCB_SCL, // GPIO pin for camera SCL line
.pin_d7 = CAMERA_CONFIG_PIN_Y9, // GPIO pin for camera D7 line
.pin_d6 = CAMERA_CONFIG_PIN_Y8, // GPIO pin for camera D6 line
.pin_d5 = CAMERA_CONFIG_PIN_Y7, // GPIO pin for camera D5 line
.pin_d4 = CAMERA_CONFIG_PIN_Y6, // GPIO pin for camera D4 line
.pin_d3 = CAMERA_CONFIG_PIN_Y5, // GPIO pin for camera D3 line
.pin_d2 = CAMERA_CONFIG_PIN_Y4, // GPIO pin for camera D2 line
.pin_d1 = CAMERA_CONFIG_PIN_Y3, // GPIO pin for camera D1 line
.pin_d0 = CAMERA_CONFIG_PIN_Y2, // GPIO pin for camera D0 line
.pin_vsync = CAMERA_CONFIG_PIN_VSYNC, // GPIO pin for camera VSYNC line
.pin_href = CAMERA_CONFIG_PIN_HREF, // GPIO pin for camera HREF line
.pin_pclk = CAMERA_CONFIG_PIN_PCLK, // GPIO pin for camera PCLK line
.xclk_freq_hz = CAMERA_CONFIG_CLK_FREQ_HZ, // Frequency of XCLK signal, in Hz. EXPERIMENTAL: Set to 16MHz on ESP32-S2 or ESP32-S3 to enable EDMA mode
.ledc_timer = CAMERA_CONFIG_LEDC_TIMER, // LEDC timer to be used for generating XCLK
.ledc_channel = CAMERA_CONFIG_LEDC_CHANNEL, // LEDC channel to be used for generating XCLK
.pixel_format = PIXFORMAT_JPEG, // Format of the pixel data: PIXFORMAT_ + YUV422|GRAYSCALE|RGB565|JPEG
.frame_size = frame_size, // Size of the output image: FRAMESIZE_ + QVGA|CIF|VGA|SVGA|XGA|SXGA|UXGA
.jpeg_quality = jpeg_quality, // Quality of JPEG output. 0-63 lower means higher quality
.fb_count = CAMERA_CONFIG_FB_COUNT, // Number of frame buffers to be allocated. If more than one, then each frame will be acquired (double speed)
.fb_location = CAMERA_CONFIG_FB_LOCATION, // The location where the frame buffer will be allocated
.grab_mode = CAMERA_GRAB_LATEST, // When buffers should be filled
#if CONFIG_CAMERA_CONVERTER_ENABLED
conv_mode = CONV_DISABLE, // RGB<->YUV Conversion mode
conv_mode = CONV_DISABLE, // RGB<->YUV Conversion mode
#endif
.sccb_i2c_port = CAMERA_CONFIG_SCCB_I2C_PORT // If pin_sccb_sda is -1, use the already configured I2C bus by number
.sccb_i2c_port = SCCB_I2C_PORT // If pin_sccb_sda is -1, use the already configured I2C bus by number
};
return camera.initialize(&camera_config);
// return cam.init(camera_config);
return cam.init(camera_config);
}
void update_camera_settings()
@@ -302,8 +303,7 @@ void update_camera_settings()
void start_rtsp_server()
{
log_v("start_rtsp_server");
server.begin(RTSP_PORT);
camera_server = std::unique_ptr<rtsp_server>(new rtsp_server(cam, param_frame_duration.value(), RTSP_PORT));
// Add RTSP service to mDNS
// HTTP is already set by iotWebConf
MDNS.addService("rtsp", "tcp", RTSP_PORT);
@@ -330,13 +330,21 @@ void setup()
// Disable brownout
WRITE_PERI_REG(RTC_CNTL_BROWN_OUT_REG, 0);
Serial.begin(115200);
Serial.setDebugOutput(true);
#ifdef USER_LED_GPIO
pinMode(USER_LED_GPIO, OUTPUT);
digitalWrite(USER_LED_GPIO, !USER_LED_ON_LEVEL);
#endif
Serial.begin(115200);
Serial.setDebugOutput(true);
#ifdef FLASH_LED_GPIO
pinMode(FLASH_LED_GPIO, OUTPUT);
// Set resolution to 8 bits
analogWriteResolution(8);
// Turn flash led off
analogWrite(FLASH_LED_GPIO, 0);
#endif
#ifdef ARDUINO_USB_CDC_ON_BOOT
// Delay for USB to connect/settle
@@ -388,24 +396,21 @@ void setup()
#endif
iotWebConf.init();
// Set the time servers
configTime(0, 0, NTP_SERVERS);
// Try to initialize 3 times
for (auto i = 0; i < 3; i++)
{
log_i("Initializing camera...");
camera_init_result = initialize_camera();
if (camera_init_result == ESP_OK)
{
update_camera_settings();
break;
}
esp_camera_deinit();
log_e("Failed to initialize camera. Error: 0x%04x. Frame size: %s, frame rate: %d ms, jpeg quality: %d", camera_init_result, param_frame_size.value(), param_frame_duration.value(), param_jpg_quality.value());
log_e("Failed to initialize camera. Error: 0x%0x. Frame size: %s, frame rate: %d ms, jpeg quality: %d", camera_init_result, param_frame_size.value(), param_frame_duration.value(), param_jpg_quality.value());
delay(500);
}
update_camera_settings();
// Set up required URL handlers on the web server
web_server.on("/", HTTP_GET, handle_root);
web_server.on("/config", []
@@ -414,7 +419,10 @@ void setup()
web_server.on("/snapshot", HTTP_GET, handle_snapshot);
// Camera stream
web_server.on("/stream", HTTP_GET, handle_stream);
#ifdef FLASH_LED_GPIO
// Flash led
web_server.on("/flash", HTTP_GET, handle_flash);
#endif
web_server.onNotFound([]()
{ iotWebConf.handleNotFound(); });
}
@@ -423,5 +431,6 @@ void loop()
{
iotWebConf.doLoop();
server.loop();
if (camera_server)
camera_server->doLoop();
}

File diff suppressed because one or more lines are too long