B4R Tutorial ESP32 Camera Picture Capture and Video Streaming! (Updated with code!)

Hello!

Last December I made a request for ESP32 Camera support. Well, I finally found the time to work on it myself and here's my initial attempt at implementing this with B4R. I'm using an ESP32CAM camera board with 4MB of PSRAM (like extended memory for the camera). I'm using one similar to this one you can find on Amazon. The board has an OV2640 2MP camera and an SD card slot (it's worth mentioning this model does not have a USB interface, so you'll need an FTDI programmer).

I started out knowing nothing about image capture or video streaming and, after about a week, ended up with a working prototype - I'm happy to say it works :).

The B4R app allows picture capture via a /pic URL and video streaming via /live, both over HTTP.

You can see a small demo of the picture capture and video streaming on . The video shows my attempt to capture a picture of my daughter's toy duck (I had a problem lining up the camera) using the /pic URL. Then I show video streaming from the camera in a browser and in VLC using the /live URL. You can see the debug info as connections come in, in my B4R log window on the right of the screen.

The picture below shows my ESP32 Camera board, my FTDI programmer on the left, and my wiring.

devboard.jpg


Basically I'm using inline C to access the camera driver (which is now part of the Arduino library) and the B4R WiFiServerSocket for the web server. It actually works quite well for a $10 camera and microcontroller. I plan on doing a full-blown tutorial here on how it works and posting the source code (in the next day or so - I want to do a proper tutorial because I learned a lot during this fun project).

I want to contribute back to the community since I get so much from here. I hope you enjoy! :)

See the below lengthy tutorial and code :)
 

miker2069

Hello! This is my first attempt at a tutorial. I'll try to provide everything that I've learned about working with the ESP32 Camera boards. Hopefully you can hit the ground running with what I'll provide. Like I said in the initial demo - I'm quite impressed with what you can do with a $10 camera and microcontroller!

PART 1 - Getting a Board and Setting it up

Step 1 - Get the right ESP32 Camera Board

ESP32-Camera-Board (1).jpg


The first (obvious) step is to get an ESP32 Camera board. I bought one earlier this year (M5 Stack) and did not realize that some camera boards have additional RAM on them called PSRAM. I also did not realize that the M5 Stack board (at the time) did not have working examples for the Arduino IDE (you needed to use the Espressif IDF programming environment). This made my initial exploration very frustrating and pretty much ruled out using this board with B4R (however, it turns out it is usable in the Arduino IDE and B4R with some code modifications, which I'll describe shortly).

The OV2640 camera module on the ESP32 Camera boards can support a max resolution of 1600 x 1200 pixels; however, without PSRAM the camera frame buffer can only support a max resolution of 800x600 pixels. Also, boards without PSRAM can only support 1 frame buffer. Boards with PSRAM (typically 4MB) can support the max resolution and also support 2 concurrent frame buffers. This makes for a better streaming experience: while one frame is being sent (say, to the network), the camera is working on capturing the next frame. So you want to get a board with PSRAM. It's worth mentioning that you may see boards with 8MB of PSRAM, however at the moment only 4MB is addressable by the camera. Accessing the other 4MB requires "bank switching" - I know the dev community is working on making the full 8MB work with cameras.

In addition to getting a board with PSRAM (and it's now rare that you'll find boards for sale without it), you want to get a board based on the AI-THINKER model. Why? Coding is that much easier, as just about all the examples out there use the pin mappings for the AI-THINKER model. You could get another board, however you will most likely have to manually map the pins for your board in your code. The AI-THINKER pin mapping is already defined in the ESP32 Arduino camera libraries; in your inline C code you simply need to add the #define:

B4X:
#define CAMERA_MODEL_AI_THINKER

And that's it, it's all mapped for you. Here's an example of a typical ESP32CAM board on Amazon. What's nice about this board is that it has the camera and a built-in SD card slot (more on the SD card slot later). What's "not so good" is that it doesn't have a USB connector, so programming is a bit of a challenge.
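
For reference (and for anyone who ends up with a non-AI-THINKER board and has to map pins manually), the AI-THINKER mapping that the #define pulls in looks roughly like this. These values are taken from the camera_pins.h that ships with the Arduino camera example, so treat them as a convenience listing and double-check them against your own copy:

B4X:
//AI-THINKER pin map (as found in the Arduino example's camera_pins.h - verify against your board)
#define PWDN_GPIO_NUM     32
#define RESET_GPIO_NUM    -1
#define XCLK_GPIO_NUM      0
#define SIOD_GPIO_NUM     26
#define SIOC_GPIO_NUM     27
#define Y9_GPIO_NUM       35
#define Y8_GPIO_NUM       34
#define Y7_GPIO_NUM       39
#define Y6_GPIO_NUM       36
#define Y5_GPIO_NUM       21
#define Y4_GPIO_NUM       19
#define Y3_GPIO_NUM       18
#define Y2_GPIO_NUM        5
#define VSYNC_GPIO_NUM    25
#define HREF_GPIO_NUM     23
#define PCLK_GPIO_NUM     22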

All this is to say: make sure the board you purchase is well supported by the Arduino IDE and has PSRAM - life will be a lot easier.

Step 2 - Programming

As mentioned, there is no USB interface on the ESP32CAM AI-THINKER model. If you're like me, you typically buy dev boards with integrated USB so that flashing programs to them is a breeze. With the ESP32CAM board there are two ways of flashing your program:

  1. Using an UNO as a make-shift flasher (works but flaky). See this article for more information
  2. Using a proper FTDI programmer (recommended). See this article for more information

I first tried the UNO as I did not have an FTDI programmer and couldn't wait. So if you have a camera board and an UNO lying around, it will work - kinda. The article I referenced mentions that if flashing fails, just unplug it and try again. I had to do that a LOT. I would say that at least 1 out of 3 attempts to flash would fail, and I'd have to unplug the UNO from the USB port, plug it back in, and try again. You can see that gets very frustrating fast. There were other issues, but when it worked, it worked great and I was able to test my code.

I recommend getting a good quality FTDI programmer. I picked this one up on Amazon and absolutely love it. I rarely have any issues with it (maybe 1 out of 40 times I need to close B4R or Arduino because the comm port locks up), which is very much acceptable to me. You will want to wire your camera board to your FTDI programmer like so:
ESP32-CAM-wiring-FTDI1.png



A note on the voltage levels. You'll see a lot of examples that have you powering your board with 3.3V. At 3.3V the board is very flaky and will "brown out" a lot (similar to a watchdog timer reset). Powering the board with 5V increases stability a LOT. So if you are going to use 5V, make sure your FTDI programmer is set for 5V (most FTDI programmers have a 3.3V/5V switch on them). The following is IMPORTANT!

MAKE SURE YOU CONNECT VCC from the FTDI programmer to 5Vin on your board. If you mistakenly connect it to 3.3V (because you followed the defaults in the examples), your board will light up like holiday lights for 5 glorious seconds before dying. Yes, I did that - fortunately I noticed it after about a second and my board seems fine. Hopefully you don't make the same mistake as me :) - see the image below:

warning.jpg




Once your board is wired up properly, follow the instructions in this article (the FTDI programmer version) exactly to configure your board and run the sample ESP32 Camera Web Server sketch. I recommend starting here first, before jumping into B4R. The article will get your Arduino environment ready for working with the ESP32 camera. Briefly, you will need to download the latest ESP32 Arduino libraries (which have camera support). It's a good idea to update the ESP8266 libraries as well (as I'll describe in the next section). Make sure to update Arduino properly, and ensure that your Arduino settings look like the following (with the exception of the comm port):

b4rboardsettings2.png



It's important, as we'll set up the B4R settings to match these. Once you've completed the steps in the article and you have a working camera (and played with it sufficiently), you can move on to the next step: setting up B4R.
 

miker2069

Part 2 - Setting up B4R

This next part describes setting up B4R. Once you've completed part 1 and you have your FTDI programmer and camera board working, you can now set up B4R.

First, make sure you're running the latest version of B4R. At the time I am writing this, the latest version of B4R is v2.8; make sure you have at least that. Next, it's important that you have the latest and greatest ESP32 *and* ESP8266 libraries configured in Arduino. If you do not, nothing will work (and it's important to have *both*), because there are WiFi updates in the ESP8266 core that avoid compile errors when the target is ESP32. You'll hit those same compile-time errors in B4R as well. See this B4R forum post for more info on the error to avoid.

  1. The latest version of B4R brings in the latest version of rESP8266WiFi (v1.55 at the time of this writing).
  2. Next, and this one is also crucial: the latest version of rRandomAccessFile (v1.91 at the time of this writing). You can find it here.
Once the above updates are in place and the updated libraries are installed, you're just about ready to go. Next you need to configure the B4R board settings. Just like you set up Arduino to get the example Camera Web Server sketch working, we're going to mimic those settings in B4R. It's quite simple, just set up your board configuration to look like the following (with the exception of the comm port, of course):

b4rboardsettings.png


You can adjust the UploadSpeed if you want, but all the other options must match the picture above.


Once you've done that, with your camera board wired to your FTDI programmer, connect it to your comm port (if you haven't already done so, make sure you have the correct comm port selected) and then just compile a default program that prints to the log, i.e.:

B4X:
Private Sub AppStart
    Serial1.Initialize(115200)
    Log("AppStart")
End Sub

You'll want to ensure B4R is communicating with the board and can flash to it. If all that works, you can move on to the next section and try the code :)
 

miker2069

Part 3 - The B4R ESP32CAM App!

The source code for the ESP32CAM Demo app is attached to this post. This is my initial version and attempt at interfacing with the camera board, so I am sure there's lots of room for improvement. I've tried to self-document the code as much as possible so you can follow along. I'll describe a bit about how it works.

The app will:

  1. Attempt to initialize the camera module (this calls an inline C function "init", which sets up the camera with defaults such as frame size, image type (JPEG, RGB, etc.), and other sensor options - see the sketch after this list)
  2. Connect to your access point (using the hard-coded ssid and password)
  3. Start a web server on port 80. The web server will look for /pic (which takes a still picture) and /live (which starts a live stream)
The web server (at the moment) only allows one connection at a time, to avoid contention for the camera.
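
To give you an idea of what the inline C "init" function has to do, here is a minimal sketch of the standard esp32-camera init pattern (this is not a copy of the code in the attached ESP32CAM module; the function name is made up and it assumes the AI-THINKER pin #defines shown in Part 1 are in scope):

B4X:
#include "esp_camera.h"

//Sketch: configure the driver and initialize the camera once at startup
static bool init_camera_sketch() {
  camera_config_t config;
  config.ledc_channel = LEDC_CHANNEL_0;
  config.ledc_timer   = LEDC_TIMER_0;
  config.pin_d0 = Y2_GPIO_NUM;    config.pin_d1 = Y3_GPIO_NUM;
  config.pin_d2 = Y4_GPIO_NUM;    config.pin_d3 = Y5_GPIO_NUM;
  config.pin_d4 = Y6_GPIO_NUM;    config.pin_d5 = Y7_GPIO_NUM;
  config.pin_d6 = Y8_GPIO_NUM;    config.pin_d7 = Y9_GPIO_NUM;
  config.pin_xclk  = XCLK_GPIO_NUM;
  config.pin_pclk  = PCLK_GPIO_NUM;
  config.pin_vsync = VSYNC_GPIO_NUM;
  config.pin_href  = HREF_GPIO_NUM;
  config.pin_sscb_sda = SIOD_GPIO_NUM;
  config.pin_sscb_scl = SIOC_GPIO_NUM;
  config.pin_pwdn  = PWDN_GPIO_NUM;
  config.pin_reset = RESET_GPIO_NUM;
  config.xclk_freq_hz = 20000000;
  config.pixel_format = PIXFORMAT_JPEG;   //JPEG frames, as used throughout this tutorial
  config.frame_size   = FRAMESIZE_SVGA;   //800x600 default (see the frame size note below)
  if (psramFound()) {
    config.jpeg_quality = 10;
    config.fb_count = 2;                  //two frame buffers in PSRAM for smoother streaming
  } else {
    config.jpeg_quality = 12;
    config.fb_count = 1;                  //no PSRAM: single frame buffer, 800x600 max
  }
  return esp_camera_init(&config) == ESP_OK;  //init can only be called once
}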


Sending an Image

When the web server sees the /pic URL it calls SendPic (shown below).

The /pic URL fetches a JPEG image from the camera. The default image size that will be returned is set to SVGA (800x600) resolution. You can change this in code (the inline C); just look in the ESP32CAM module, which contains the inline C, for:

B4X:
config.frame_size = FRAMESIZE_SVGA; // FRAMESIZE_ + QVGA|CIF|VGA|SVGA|XGA|SXGA|UXGA
B4X:
Private Sub SendPic
 
    ESP32CAM.TakePic
    If ESP32CAM.PictureAvailable Then
        Log("SendPic: Picture Available")
        If StreamGood Then Astream.Write("HTTP/1.1 200 OK").Write(CRLF)
        If StreamGood Then SendPicToStream
        'CallSubPlus("CloseConnection", 100, 0)
        ESP32CAM.Release
    Else
        'handle picture not available message, write now nothing displays at client side if error
    End If
 
End Sub

This is the easiest one to follow, as it grabs a picture from the camera and sends it to our client. Now if you look at the code, SendPic calls another B4R sub called TakePic, which in turn calls the inline C function take_pic_only. This is also a good place to start, as you will basically understand how image capture works with the camera board (it's actually not too difficult). Basically:
  • The esp_camera_fb_get() function is called; this uses the configured default values for the camera that were set up in the inline C function "init" (as I described above) and stores the result in a global frame buffer pointer fb:
B4X:
//get image from camera and put it in a frame buffer (the global pointer fb) - this is all managed by the camera driver and needs to be paired up with esp_camera_fb_return(fb)
//when done with the frame buffer. Call the B4R Release sub to execute that function when done with the image, i.e. after sending it to the client
fb = esp_camera_fb_get();
if(!fb) {
    Serial.println("Camera capture failed");
    b4r_esp32cam::_pictureavailable = false;
    return;
}

The returned data is actually a single JPEG-encoded image (you could save the data "as-is" to a file and you would have a valid image).

On camera boards that have PSRAM, the frame buffer will be mapped into that memory space; if not, it will be mapped into internal RAM (the camera driver takes care of all of that). You'll see in the code that I use B4R variables quite a bit, such as PictureAvailable (in C it's b4r_esp32cam::_pictureavailable); this lets me know whether the image was captured successfully or not.
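
As an aside, that is the general pattern for sharing state between B4R and inline C: a Public global declared in a B4R module shows up in the generated C++ under the module's name, as the b4r_esp32cam::_pictureavailable usage above shows. An illustrative pair:

B4X:
'In the ESP32CAM module (B4R side):
Public PictureAvailable As Boolean
'In the inline C, the same variable is then reachable as:
'    b4r_esp32cam::_pictureavailable = true;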

Obviously capturing the image is useless if we can't access the data from B4R! Since the frame buffer is totally managed by the camera driver and all we have is a pointer (fb), I used an empty B4R byte array (which effectively is just a pointer to nothing). This is called buffer() and is defined like so in the ESP32CAM module:

B4X:
Public buffer() As Byte

We don't initialize it in B4R; instead we manually set buffer() to *initially* point to the beginning of the frame buffer. This is done at the end of the take_pic_only inline C function:


B4X:
      //update the B4R buffer() variable.  In B4R this is defined as a byte array.  This will be sent to the requestor of http using ASTREAM.Write/Write2 calls
      //Every time we update picture, we need to update the buffer pointer and length.
      //On initial capture, the B4R buffer points to the beginning of the image, as we send chunks of data, the buffer pointer is updated by the following function
      //move_frame_buffer_ptr. This was necessary as B4R arrays (i.e. our buffer byte array) can only be indexed up to 32768. If our image is larger than this
      //then the image will be cutoff on the receiving end.
   
      b4r_esp32cam::_buffer->data = fb->buf;
      b4r_esp32cam::_buffer->length = frame_byte_length;

So now we have our B4R buffer() pointer pointing to the raw image captured by the camera - great, kinda :). "frame_byte_length" is the size of the image in bytes. The problem here is that the length member of our B4R array object is defined as an "int", which can only hold 32768 max. So if our image is larger than that, we're out of luck. The way around this is to "chunk" the frame buffer up as we need it (i.e. when sending it over the network). In the Process Globals section of ESP32CAM I define a global variable:

B4X:
Public  G_CHUNK_SIZE As Int = 1024 'used for splitting image into chunks so can be sent over network

This variable defines the maximum amount of the frame buffer we'll use (i.e. queue to send over the network) at any one time. So after we've captured our image and returned from the inline function take_pic_only, the next logical thing to consider is how we move where the B4R variable buffer() points to. This is where the sub:

B4X:
If StreamGood Then SendPicToStream

comes in. This sub does all the heavy lifting of "chunking" the frame buffer and sending each chunk to the client. I won't post the entire sub here (you can see it in the code for yourself), but I'll summarize what it does. It computes the number of "chunks" to send and moves the B4R buffer() pointer accordingly, so that it points to the position in the frame buffer that we want to send. The heart of SendPicToStream is this:

B4X:
    For i = 0 To (num_chunks - 1)
        ESP32CAM.byte_offset = i * chunk_size
        ESP32CAM.MoveFrameBufferPtr
        'Log("Idx: ")
        'Log(i)
        'Log("Offset:")
        'Log(ESP32CAM.byte_offset)
        'Log("Buffer Len:")
        'Log(ESP32CAM.buffer.Length)
        If StreamGood Then
            'note: not worried about stack leakage here since we're using an already allocated memory buffer
            Astream.Write2(ESP32CAM.buffer,0,ESP32CAM.buffer.Length) 'explicitly using write2 even though write could probably work since the length variable is updated properly
        Else
            Log("Stream disconnected")
            Return
        End If
    Next

The default chunk size is 1024 bytes (set by G_CHUNK_SIZE as mentioned earlier). So if my image is 64K, for instance, you'll have 64 chunks to send. You can play around with the chunk size (presumably you can set your chunk size to 32768 bytes if you like, which is the max). The sub:

B4X:
 ESP32CAM.MoveFrameBufferPtr

Calls an inline C function called "move_frame_buffer_ptr". This function handles adjusting the B4R buffer() variable to point to the start of the chunk and updates the length member of buffer() to reflect the current chunk. One thing to note is that the last chunk may be less than G_CHUNK_SIZE, so we need to handle this and set the length properly for the last chunk - this is all handled in "move_frame_buffer_ptr". Now when we call:

B4X:
Astream.Write2(ESP32CAM.buffer,0,ESP32CAM.buffer.Length)

Astream.Write2 is pointing at the correct chunk of data to queue to the network *and* the length of buffer is set properly (so boundary checking won't bite us). As a side note, I toyed with turning off #CheckArrayBounds: True, but I actually like the feature everywhere else in the program, so chunking the data gives us the best of both worlds.
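
To make the pointer arithmetic concrete, here is a rough sketch of what an inline C function like move_frame_buffer_ptr has to do. This is a hypothetical body with simplified types, and the generated names (b4r_esp32cam::_byte_offset, _g_chunk_size) are my guesses; the real implementation is in the attached module:

B4X:
//Sketch only: point the B4R buffer() at the current chunk and fix up its length
void move_frame_buffer_ptr_sketch() {
    uint32_t offset    = b4r_esp32cam::_byte_offset;      //set from B4R before each call
    uint32_t remaining = frame_byte_length - offset;
    uint32_t chunk     = b4r_esp32cam::_g_chunk_size;
    //start of the current chunk inside the camera's frame buffer
    b4r_esp32cam::_buffer->data = fb->buf + offset;
    //the last chunk may be shorter than G_CHUNK_SIZE
    b4r_esp32cam::_buffer->length = (remaining < chunk) ? remaining : chunk;
}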

Once we've looped through all the chunks, the image is sent, and the web server can go back and serve another request.


Sending Streaming Video

Sending streaming video is really just sending a bunch of discrete pictures, one after the other, over HTTP. They are wrapped in a multipart message that is sent to the client. You can see further discussion on what the client sees here. From the server's perspective, the code is just a loop, fetching images from the camera and sending them to the client (taken from Sub StreamVideo):

B4X:
    'now capture each frame from camera and send it to stream, note SendPicToStream is used here as well
    Do While StreamGood
        'take pic
        'Log("Take pic...")
        ESP32CAM.TakePic
        If ESP32CAM.PictureAvailable Then
            'Log("Have Picture")
            'send the mime frame boundary
            If StreamGood Then aws("--esp32camframe") Else Return
            'now send the pic
            If StreamGood Then SendPicToStream
            If StreamGood Then
                awscrlf
                awscrlf
                'aws(CRLF)'.Write(CRLF)
            Else
                Log("Connection broken...")
                Exit
            End If
            ESP32CAM.Release
            'Log("++++looping in video...")
            Delay(10) '1 frame per sec
        Else
            Log("---NO PICTURE!")
        End If
        If G_FRAME_RATE_DELAY_MSEC > 0 Then 'if 0 then send as fast as possible
            'might be better off using a timer vs delay for efficiency
            'with SVGA and a frame rate of 0, and close to the access point the camera is connected to, the stream is pretty good
            Delay(G_FRAME_RATE_DELAY_MSEC )
        End If
    Loop
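
For completeness: before the loop above starts, the /live handler first sends the one-time HTTP response and multipart header. A minimal sketch of that part, using the aws/awscrlf helpers (described in Part 4) and the boundary name from the loop - the exact header code is in the attached project:

B4X:
'Sketch: one-time response header for the /live stream, sent before the Do While loop
If StreamGood Then
    aws("HTTP/1.1 200 OK")
    aws("Content-Type: multipart/x-mixed-replace; boundary=esp32camframe")
    awscrlf 'blank line ends the HTTP header block
End If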


That's an overview of how it all works!
 

Attachments

  • ESP32CAMDemo.zip (57.1 KB)

miker2069

Part 4 - Caveats

Looking at the code, you might wonder (among other things) why I wrapped calls to Astream.Write in the following subs:


B4X:
'Helper Functions
'the aws (astream write stream) - is used to get around leaking stack buffer in a loop when sending strings to network, see description above
Private Sub aws(s As String)
  
    Astream.Write(s.GetBytes)
    Astream.Write(CRLF)
  
End Sub

'the awscrlf - same as above but just sends CRLF
Private Sub awscrlf()
  
    Astream.Write(CRLF)
  
End Sub

The short answer is that strings are created on the stack. So something like sending:

B4X:
Astream.Write("Content-type:image/jpeg")

in a loop (i.e. as a result of streaming) would cause a stack leak, and the camera board would eventually panic and die. Wrapping Astream.Write in a couple of helper functions that I can use in loops solves that, as the memory for the strings is allocated and deallocated appropriately when the sub completes.

Another thing to mention (and most likely also a result of trying to figure out why my program was crashing because of the above): I make use of another helper routine:

B4X:
private Sub content_length_to_stream()
  
    Dim l As String
    l = NumberFormat(ESP32CAM.Length,1,0)
    Astream.write("Content-Length:").write(l).Write(CRLF) 'no need to call aws since this astream.write is already wrapped in a helper function
  
End Sub

For the same reason I described above - string memory leaks. Calling the NumberFormat routine creates the string on the stack, and this gets called repeatedly while streaming video. Wrapping it in a sub solves that.

Finally, I use a sub called "StreamGood()" all over the place in the web server code. Honestly, it's probably not necessary to check *all the time* whether the stream is good, since if the stream is "bad" astream_error should be called. Again, I was hunting down why my program was crashing, and it was ultimately a result of the string leaks. I ended up leaving it in, as it lets me take appropriate action inside the streaming or chunking loops.

Well, that's it - I will clean this up over the next day or so. I know it's a lot, but I wanted to give you enough info and background so you can get started and avoid some of the mistakes I made (albeit a fun experience). Please let me know what you think :)
 

Hypnos

Thank you miker2069! I will purchase this module tomorrow because of your post : )

To make it more secure and useful, would you be able to add some protection to your B4R code to prevent unauthorized access to the camera (e.g. a password)?
 

Hypnos

Waiting for the ESP32-Cam module to be delivered, but same as you I have the M5 Cam32 (without PSRAM) which I purchased a while ago but never used..... I can make the M5 work by modifying the pin settings in the inline C code; everything works fine except I only get a gray image. Do you know how I can fix it? Thanks!

m5.jpg
M6.png
 

miker2069

Hi Hypnos! Sorry for the late reply, I have been on holiday/vacation for the last week and have one more week to go :)

I am glad you like my tutorial about the ESP32 Camera. It's lots of fun, and there are so many possibilities with B4R and B4X in general (IMO). I'm also happy to hear that you got the M5Stack camera working in B4R as well - I knew it was a matter of finding the correct pin assignment, however I got so frustrated with M5's lack of support for the Arduino IDE (and the fact it has no PSRAM) that I left it alone. Now that you have it working, I will probably use it for something simple - it should still be good for still images and low-frame-rate video.

Anyway, in regards to the grayscale - that's my fault, my apologies. I noticed I may have left this inline C line in ESP32CAM uncommented:

B4X:
s->set_special_effect(s, 2); //gray scale

If you comment that out, it should return to color mode.

I added some printf calls to the ESP32 camera Arduino example sketch to see how the sketch was setting options like color, brightness, etc. - ultimately I wanted to make a B4R version of that example to dynamically control those features.

I hope that helps!
 

Hypnos

Hi miker2069,

I want to check whether "save_pic" works. I tried it but got a "Failed to open file in writing mode" error.
 

miker2069

Hi miker2069,

I want to check whether "save_pic" works. I tried it but got a "Failed to open file in writing mode" error.

Hi Hypnos - save_pic was a crude attempt at using the onboard SD card on the ESP32CAM. If you search for:

B4X:
Serial.println("Starting SD Card");

In the ESP32CAM module you'll find the (currently disabled) code block that you can re-enable. It's just some inline C I lifted from an Arduino example. I disabled it for a couple of reasons. When originally developing this I was having quite a few brownout issues, which I attributed to current draw over the USB cable (powered from the USB port on my computer). I figured disabling the SD card would help stabilize the camera (which is what I was really interested in anyway). I now think my brownout issues had more to do with a hacked-together FTDI programmer I made (using an old Arduino Uno) than anything else, so you might not experience that if you re-enable it. Also, the ESP32CAM seems to work with the SD_MMC.h library, whereas the B4R library uses the MMC.h library, so you'd have to do everything with the card in inline C. Not a big deal, as I was planning to write a small wrapper for it anyway and suggest it as a library to include in a future version of B4R. I never got around to it.

The code works (and if you take a look at it, it's fairly simple); it's just that for my purposes reading/writing the SD card was a bit slow, and it was far faster sending over the network - again, this was just my use case. I have another project where I will enable it, to create a simple time-lapse camera that takes a picture of a scene/subject several times a day and saves it to the SD card.

Like I said, you can re-enable it and play with it. It will save each successive image as picture0.jpg, picture1.jpg, etc., cycling through 0-255, as it only uses a single byte of EEPROM to track the picture number. Like I said, crude - it was just a proof of concept :)
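
If you do want to experiment with it, the general Arduino pattern for writing a frame to the on-board card with SD_MMC looks roughly like this. This is a generic sketch built on the standard SD_MMC and EEPROM Arduino libraries, not the exact code in the module:

B4X:
#include "SD_MMC.h"
#include <EEPROM.h>

//Sketch: save the current camera frame (fb) to the SD card, numbering files with one EEPROM byte
void save_pic_sketch() {
  if (!SD_MMC.begin()) {
    Serial.println("Starting SD Card failed");
    return;
  }
  EEPROM.begin(1);                        //one byte is enough for a 0-255 picture counter
  uint8_t n = EEPROM.read(0);
  String path = "/picture" + String(n) + ".jpg";
  File file = SD_MMC.open(path.c_str(), FILE_WRITE);
  if (!file) {
    Serial.println("Failed to open file in writing mode");
  } else {
    file.write(fb->buf, fb->len);         //fb is the camera frame buffer pointer used earlier
    file.close();
    EEPROM.write(0, n + 1);               //wraps back to 0 after 255
    EEPROM.commit();
  }
}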
 

derez

Thanks for sharing this, miker2069 !

I have a working camera with the Arduino sketch.
I get the following ton of errors when running the B4R IDE (ver 3.0), even without the board, just compiling the libraries:
Loading configuration...
Initializing packages...
Preparing boards...
Verifying...
C:\Program Files (x86)\Arduino\arduino-builder -dump-prefs -logger=machine -hardware C:\Program Files (x86)\Arduino\hardware -hardware C:\Users\dudu\AppData\Local\Arduino15\packages -hardware D:\Arduino\hardware -tools C:\Program Files (x86)\Arduino\tools-builder -tools C:\Program Files .....
Build options changed, rebuilding all
.....
Detecting libraries used...
"C:\\Users\\dudu\\AppData\\Local\\Arduino15\\packages\\esp32\\tools\\xtensa-esp32-elf-gcc\\1.22.0-80-g6c4433a-5.2.0/bin/xtensa-esp32-elf-g++" -DESP_PLATFORM "-DMBEDTLS_CONFIG_FILE=\"mbedtls/esp_config.h\"" -DHAVE_CONFIG_H "-s\\EEPROM\\src" "D:\\B4ESP\\ESP32CAMDemo\\Objects\\bin\\sketch\\B4RSerializator.cpp" -o "D:\\B4ESP\\ESP32CAMDemo\\Objects\\bin\\sketch\\B4RSerializator.cpp.o"
In file included from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RDefines.h:26:0,
from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RCore.cpp:1:
rESP8266WiFi.h:90:23: error: cannot declare field 'B4R::WiFiSocket::client' to be of abstract type 'B4R::BufferedWiFiClient'
BufferedWiFiClient client;
^
D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\rESP8266WiFi.h:56:8: note: because the following virtual functions are pure within 'B4R::BufferedWiFiClient':
class BufferedWiFiClient : public Client {
^
In file included from C:\Users\dudu\AppData\Local\Arduino15\packages\esp32\hardware\esp32\1.0.2\cores\esp32/Arduino.h:157:0,
from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RDefines.h:8,
from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RCore.cpp:1:
C:\Users\dudu\AppData\Local\Arduino15\packages\esp32\hardware\esp32\1.0.2\cores\esp32/Client.h:31:17: note: virtual int Client::connect(IPAddress, uint16_t, int)
virtual int connect(IPAddress ip, uint16_t port, int timeout) =0;
^
C:\Users\dudu\AppData\Local\Arduino15\packages\esp32\hardware\esp32\1.0.2\cores\esp32/Client.h:32:17: note: virtual int Client::connect(const char*, uint16_t, int)
virtual int connect(const char *host, uint16_t port, int timeout) =0;
^
In file included from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RDefines.h:26:0,
from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RCore.cpp:1:
rESP8266WiFi.h:126:23: error: cannot declare field 'B4R::WiFiSSLSocket::client' to be of abstract type 'B4R::BufferedWiFiClient'
BufferedWiFiClient client;
^
In file included from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\B4RDefines.h:26:0,
from D:\B4ESP\ESP32CAMDemo\Objects\bin\sketch\b4r_esp32cam.cpp:1:
rESP8266WiFi.h:90:23: error: cannot declare field 'B4R::WiFiSocket::client' to be of abstract type 'B4R::BufferedWiFiClient'
BufferedWiFiClient client;
^
[... the same errors repeat for b4r_esp32cam.cpp several more times ...]
exit status 1

I'm not sure about the libraries. I use rESP8266WiFi ver 1.55 - what else is needed?

The most popular error is
cannot declare field 'B4R::WiFiSocket::client' to be of abstract type 'B4R::BufferedWiFiClient'
which appears here https://www.b4x.com/android/forum/t...266-v2-50-and-esp32-v1-02.105388/#post-660785 but it should have been corrected by version 1.55 :(
 

miker2069

Thanks for sharing this, miker2069 !

I have a working camera with the Arduino sketch.
I get the following ton of errors when running the B4R IDE (ver 3.0), even without the board, just compiling the libraries:


I'm not sure about the libraries. I use rESP8266WiFi ver 1.55 - what else is needed?

The most popular error is the one which appears here https://www.b4x.com/android/forum/t...266-v2-50-and-esp32-v1-02.105388/#post-660785 but it should have been corrected by version 1.55 :(


Hi derez - I just compiled it under B4R 3.0 with no issues. Make sure you have the libraries updated as described in Part 2 above. Specifically, make sure you have rRandomAccessFile as well as rESP8266WiFi in B4R and, just as important, the ESP32 *and* ESP8266 libraries updated in Arduino - it's not going to work and will throw all sorts of errors if everything isn't updated.
 

derez

I had esp32 version 1.0.2 and esp8266 version 2.5.2; I updated esp32 to version 1.0.4 and esp8266 to version 2.6.2, and now it compiles with no errors!
Thanks
 

derez

Great news - unlike the picture and video in the Arduino sketch, the B4R program's picture and video show nicely in a WebView!
Now I can combine the display and controls in one B4A program.
I also changed the server port to my own definition instead of 80 - this requires a change in esp_http_server.h and in line 72 of the B4R code, and also setting the router to forward that port to the board by stating its MAC address. With these changes I can get the data on my phone anywhere, not only on my home WiFi.
I'd love to have realtime control of the frame size - is it possible? Maybe by sending a new command through the server and adding an inline C function.
 


miker2069

That's awesome! Yes - frame size, brightness, saturation, and many other properties of the image can be changed; it's all controlled by API calls into the camera. Later today I'll dig up the inline C and post it here.
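
In the meantime, for anyone curious, the relevant calls are on the sensor_t handle that the esp32-camera driver exposes; a quick sketch of the kind of inline C I mean (standard driver calls, not code lifted from the attached project):

B4X:
#include "esp_camera.h"

//Sketch: adjust image properties at runtime through the camera driver's sensor handle
void tweak_sensor_sketch() {
  sensor_t *s = esp_camera_sensor_get();
  if (s == NULL) return;
  s->set_framesize(s, FRAMESIZE_VGA);   //frame size (same FRAMESIZE_ constants as earlier)
  s->set_brightness(s, 1);              //-2 to 2
  s->set_saturation(s, 0);              //-2 to 2
  s->set_contrast(s, 0);                //-2 to 2
  s->set_special_effect(s, 0);          //0 = none (2 is the grayscale effect mentioned above)
}

This is the same mechanism the stock CameraWebServer example uses when you move its sliders.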
 

derez

That's awesome! Yes - frame size, brightness, saturation, and many other properties of the image can be changed; it's all controlled by API calls into the camera. Later today I'll dig up the inline C and post it here.
My knowledge of the insides of the inline C is poor; however, I thought of a (complicated...) way to make it work.
It goes like this:
1. Added a global to ESP32CAM module
B4X:
Public FS As Int = 3
2. Added this before the config definitions in the inline code:
B4X:
 framesize_t n[9];
  n[0] =   FRAMESIZE_UXGA ;
  n[1] =   FRAMESIZE_SXGA ;
  n[2] =   FRAMESIZE_XGA ;
  n[3] =   FRAMESIZE_SVGA ;
  n[4] =   FRAMESIZE_VGA ;
  n[5] =   FRAMESIZE_CIF ;
  n[6] =   FRAMESIZE_QVGA ;
  n[7] =   FRAMESIZE_HQVGA ;
  n[8] =   FRAMESIZE_QQVGA ;
....
 config.frame_size = n[b4r_esp32cam::_fs]; //FRAMESIZE_SVGA;     // FRAMESIZE_ + QVGA|CIF|VGA|SVGA|XGA|SXGA|UXGA
With this, the frame size parameter is read during init from the value of the FS variable.
Now all that's needed is "only" to re-init the camera with a different value for the FS variable...
I noticed your comment
//Init Camera - can only be called once ever
I did try it and failed.
Restarting the device will do the job, so I added the EEPROM and ESP8266 libraries.
I added two new commands to the Astream_NewData sub:
B4X:
If bc.IndexOf(Buffer, "restart") <> -1 Then
        esp8266.Restart
 
    else If bc.IndexOf(Buffer, "size") <> -1 Then

        Dim k As Int = bc.IndexOf(Buffer,"size")
        ESP32CAM.FS = Buffer(k+4) - 48
        eeprom.WriteBytes(bc.IntsToBytes(Array As Int(ESP32CAM.FS)), 0)
 
 
    else If bc.IndexOf(Buffer, "GET") <> -1 Then
...

The controlling application (b4a) uses a spinner to select the required resolution and sends the command with /framesize & position:
B4X:
Sub Framespin_ItemClick (Position As Int, Value As Object)
    Log("position= "& Position)
    wv.StopLoading
    wv.LoadUrl("http://[my ip and port]/framesize" & Position)
    wv.Invalidate
End Sub
The position in the spinner matches the index into the array in the B4R inline code described above.
The commands can be sent from a browser in the same way.
The B4R app, receiving that command, extracts the number after "framesize" and saves it to the EEPROM.
When B4A sends the restart command, the B4R app restarts, reads the frame size number from the EEPROM, and initializes the camera with it:
B4X:
    Dim b() As Byte = eeprom.ReadBytes(0, 1)
    Dim k As Int = b(0)
    If k >=0 And k < 9 Then ESP32CAM.FS = k
    Log("k = " , k , " FS = " , ESP32CAM.FS)

I hope you can find a simpler solution, but in the meantime this will do. I believe a similar approach would work for other parameters that are only read at init. For my needs, I don't think I need realtime control of the other parameters.
 

WriteView

Hi,
I'd really never used B4R until now. I just compiled this example and got it to work. šŸ‘
Any chance to write on the stream? E.g. a room number, temperature, or any other information?
Or is there an easy possibility of pixel manipulation, so I can create the text on my own?

Cheers
 

miker2069

Hi,
I'd really never used B4R until now. I just compiled this example and got it to work. šŸ‘
Any chance to write on the stream? E.g. a room number, temperature, or any other information?
Or is there an easy possibility of pixel manipulation, so I can create the text on my own?

Cheers

The MJPEG stream is just a succession of JPEG images, so you could possibly manipulate each image in the stream before sending it over the HTTP connection. If the other side is a B4X app you could look at something like this to add a watermark.
 