New epaper component for ESP-IDF

An e-paper display driver component for the ESP-IDF framework, compatible with ESP32 / ESP32-S2, can be found in this repository:

https://github.com/martinberlin/CalEPD

Codenamed Cal e-pe-de after the initials of E-Paper Display, this is basically all I’ve been doing in the last weeks: learning more about object-oriented C++ and preparing a class that can be used to send graphics buffers to different e-paper displays. While doing that I’m documenting as I go and at the same time building CALE-IDF, which is our firmware version of CALE.es, but this time built on top of the Espressif IoT Development Framework (IDF).

The mission of this new component is to have a similar library for ESP-IDF that is easier to understand, if possible strictly meeting these requirements:

  • Extendable
  • Maintainable
  • Object-Oriented Programming with human-readable code and function names
  • Easy to add new e-paper display drivers
  • Easy to implement and send stuff to your Epaper displays
  • Human-understandable
  • Well documented
  • ESP-IDF only (No Arduino classes)

Please find here the Wiki with the models that are already supported in the component:

https://github.com/martinberlin/cale-idf/wiki

Leave us a note if you want to try this. We can guarantee that for big e-paper displays (>=400 px wide) it runs faster than GxEPD, while also supporting Adafruit GFX fonts and geometric functions. If you tried it and it worked out for you, please don’t forget to add a ✰ Star and spread the idea.
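For a taste of what this looks like in practice, here is a minimal usage sketch. The class and method names here (EpdSpi, Gdew075T7, init, update) are assumptions for illustration; check the Wiki above for the exact API of each supported model:

// Hypothetical CalEPD usage sketch; names are illustrative, see the Wiki.
#include "gdew075T7.h"   // assumed driver header for a 7.5" panel

EpdSpi io;               // SPI abstraction wired to the epaper pins
Gdew075T7 display(io);   // the display class receives the IO object

extern "C" void app_main()
{
  display.init();                          // power up and configure the panel
  display.setCursor(10, 10);
  display.println("Hello from CALE-IDF");  // Adafruit-GFX-style text API
  display.update();                        // flush the buffer to the display
}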

Testing the new ESP32-S2

I got an ESP32-S2 Saola board and I will be making some tests in this repository:

https://github.com/martinberlin/ESP32-S2

ESP32-S2 integrates a rich set of peripherals, with 43 programmable GPIOs which can be flexibly configured to provide USB OTG, LCD interface, camera interface, SPI, I2S, UART, ADC, DAC and other common functionality. ESP32-S2 provides the optimal HMI solution for touchscreen and touchpad-based devices.
The arduino-espressif32 framework is getting its final touches before being available, and I will just make a small repository, try to test as much as I can, and also make some benchmark comparisons.

In the first tests the impression is that most of the Arduino framework is already working, with a few exceptions. Not all the features are supported yet, but they will be soon.
Sadly, deep sleep is still not optimized correctly in this first revision of the Saola board: at 3.4 V I measured 0.79 mA. On a good ESP32 board like the TinyPICO it’s 0.08 mA (10 times less); now that’s something you can plug into a battery and make it wake up every hour to do something.
Please note this is not a fair comparison at this point. It’s only a test with the same firmware and same conditions, bypassing the power regulator to see how much the board consumes when powered directly through the 3.3 V pin.
On the other hand, in processor-intensive tests like receiving UDP packets, or receiving and decompressing at the same time, it showed similar speed with much less consumption than the ESP32.

But other than this small annoyance in the first board revision, it’s a really promising product and a very powerful System on a Chip. The feature list is impressive:

  • Xtensa® single-core 32-bit LX7 microcontroller
  • 43 programmable GPIOs.
  • Standard peripherals including SPI, I2C, I2S, UART, ADC/DAC and PWM.
  • LCD (8-bit parallel RGB/8080/6800) interface and also support for 16/24-bit parallel.
  • Camera interface supports 8 or 16-bit DVP image sensor, with clock frequency of up to 40 MHz.

Please follow the repository and add issues, or just leave a comment here, if you want me to test any of the features and I will publish the results. The only condition is that the results are shared with the community on the internet. It’s all open source, so we will keep the same spirit and share the results with everyone else.
Another example of the ESP32-S2 in action, controlling 144 RGB Neopixels using the Makuna library: https://twitter.com/martinfasani/status/1267015931689144320

What GPIOs not to use in ESP32-S2

GPIO    Description / default configuration
0       Pull-up
26      PSRAM pin
45      Pull-down
46      Pull-down – INPUT only
43 *    Default UART0 pin on S2
44 *    Default UART0 pin on S2
NOTE: GPIO26 applies primarily to WROVER modules, as those come with PSRAM.
* Not sure that you must avoid 43 and 44, but if you will use UART0 output in your project it’s best not to use them.

Important documentation
ESP32-S2 Technical Reference Manual (pages 39-40 for the GPIOs to avoid)

Trying out Waveshare E-Paper ESP32 driver board

In my pursuit to make the CALE displays as small as possible, but also to learn how different boards interact with e-paper, last week I got a Waveshare E-Ink ESP32 driver board.

ESP32 Waveshare Epaper driver

Not all boards use the default SPI GPIOs of the ESP32, and this was one of those cases. Opening the documentation, the first check showed that Waveshare uses these pins:

#define PIN_SPI_SCK  13  // CLOCK ESP32 default 18
#define PIN_SPI_DIN  14  // MOSI ESP32 default: 23
#define PIN_SPI_CS   15
#define PIN_SPI_BUSY 25
#define PIN_SPI_RST  26
#define PIN_SPI_DC   27

So my first thought was: OK, nice, but this won’t work as-is with the GxEPD library, because this great library calls SPI.begin() without parameters, hence using the ESP32 / ESP8266 default GPIOs for SPI communication. So what I did to test it is very simple, I forked GxEPD:
https://github.com/martinberlin/GxEPD-config-spi.git

And updated this part of the library to use defines that can be injected using Platformio build_flags:

SPI.begin(CLK_GPIO, MISO, MOSI_GPIO, CS_GPIO);
Note that MISO is already defined in espressif32 Arduino framework.
So now we can inject the other 3 in platformio.ini and get the Waveshare example working:
build_flags =
  -DCLK_GPIO=13
  -DMOSI_GPIO=14
  -DCS_GPIO=15

If we want to test it with CALE we can also use the same defines to reference the EInk wiring in lib/Config/Config.h

// Waveshare ESP32 SPI
int8_t EINK_CS = 15;
int8_t EINK_BUSY = 25;
int8_t EINK_RST = 26;
int8_t EINK_DC = 27;
So it did not take long to get it working with this simple modification. I find it great to inject defines using build_flags. It’s really straightforward and, in fact, if only simple defines are needed, all configuration for a project could be done in platformio.ini alone.
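As a side note, a minimal sketch of how the injected defines can be given fallbacks so the code still compiles without build_flags (the guard pattern here is illustrative, not GxEPD’s actual source; the ESP32 defaults are the ones mentioned above):

// Fall back to the ESP32 default SPI pins when no build_flags are injected
#ifndef CLK_GPIO
  #define CLK_GPIO 18   // ESP32 default SCK
#endif
#ifndef MOSI_GPIO
  #define MOSI_GPIO 23  // ESP32 default MOSI
#endif
#ifndef CS_GPIO
  #define CS_GPIO 5     // assumed chip-select, adjust to your wiring
#endif

SPI.begin(CLK_GPIO, MISO, MOSI_GPIO, CS_GPIO);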

The party breaker

So I was excited to get this working and avoid the SPI adapters that go between the ESP32 and the e-paper display… until I got the idea to connect it to the small 3.7 V LiPo that I use to test ESP32 boards, with a diode in between to drop 0.5 V, and found out how much it consumes in deep sleep:
That’s 13 mA. Unless I’m missing something, and there is something big that needs to be switched off before deep sleep, this EPD driver board will eat your battery and your kids for dinner.
So that was the low point of this test. If someone has a clue about this high consumption when sleeping, please write something in the comments.
UPDATE: In the measurement below I forgot to put the diode in series, so I was feeding 3.8 V to the 3.3 V pin. Don’t try that, or you may kill the ESP32. Adding the diode it went down to 3.4 V, and consumption of course also went down, to 10 mA. But that is still a lot for deep sleep. On good ESP32 boards like the TinyPICO and others, deep-sleep consumption is as low as 0.08 mA.
Video proof:

Hackaday projects update: CALE Eink Calendar and udpx

I must say that although I’m not proud of all the projects I’ve tried to document on Hackaday, I do like udpx and Remora a lot. They were initially made to control addressable LEDs, but they could be expanded and put to other uses.

CALE

This is actually an old project for the agency I work for that is finally seeing the light. I started using the Waveshare V1, and this is the new 7.5″ e-Paper V2.

CALE Eink calendar ESP32

This will be powered by a TinyPICO ESP32, which claims to consume as little as 20 µA in deep sleep. Let’s see what happens when it’s connected to the Waveshare, also in deep sleep ;)
The idea is to connect, refresh the meeting-room labels at each door, and then go to sleep for at least 2 hours. On wakeup it checks the day of the week: if it’s Saturday or Sunday, when we are not in the office, it just keeps on sleeping. There is no need to update anything when no one will watch.
So it’s designed with low consumption in mind and will actually be my first real battery project, one that I hope lasts at least 30 days without charging. The firmware is actually not ultra complicated, except for the BMP image-reading part, which I took mostly from the GxEPD e-ink library example, with the add-on that I implemented Zlib compression so the download takes one second instead of five. Other than that, the only thing it does is wake up and query a backend, passing a URL that renders a screenshot of that webpage. The image is sent over SPI and the board goes back to sleep.
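As an illustration only, not the actual CALE firmware, the wake/refresh/sleep cycle just described could look like this (downloadAndRenderScreenshot() is a placeholder name):

#include <ctime>
#include "esp_sleep.h"

void downloadAndRenderScreenshot(); // placeholder: fetch the BMP, push it over SPI

const uint64_t TWO_HOURS_US = 2ULL * 60 * 60 * 1000000ULL;

void wakeCycle()
{
  time_t now = time(nullptr);
  tm *t = localtime(&now);
  // tm_wday: 0 = Sunday, 6 = Saturday -> skip the refresh on weekends
  bool weekend = (t->tm_wday == 0 || t->tm_wday == 6);
  if (!weekend) {
    downloadAndRenderScreenshot();
  }
  esp_sleep_enable_timer_wakeup(TWO_HOURS_US); // sleep for at least 2 hours
  esp_deep_sleep_start();                      // never returns
}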
EVOLUTION:
Firmware: 90% done and open source
Case: Work in progress
CLIENT: http://www.skygate.de

udpx

udpx sending frames to a 44*11 RGB Led matrix
One of my personal favorites, since we have been working on it as a team for a long time already, is udpx, a transmission protocol with decompression support (Zlib and Brotli).
We are using it at the moment to receive animation frames, but I can see much more potential in it, for example sending small JSON beacons and having real-time charts for anything; there are endless possibilities for a streaming technology that is proving to work stably. The Maximum Transmission Unit (MTU) of the ESP32 is about 1470 bytes, but you can fit much more in that payload if you use Brotli or Zlib, so I can see it being applied in many other use cases.
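As a sketch of why compression stretches that limit: a datagram that fits the ~1470-byte MTU can inflate into a much larger frame buffer. This is not udpx’s actual framing code, just the principle, using miniz’s zlib-style one-shot API:

#include "miniz.h"
#include <cstdint>
#include <cstddef>

// Inflate a compressed UDP payload into a pixel buffer.
// Returns true and sets pixelsLen on success.
bool inflateFrame(const uint8_t *payload, size_t payloadLen,
                  uint8_t *pixels, size_t pixelsCap, size_t &pixelsLen)
{
  mz_ulong outLen = pixelsCap;
  if (mz_uncompress(pixels, &outLen, payload, payloadLen) != MZ_OK)
    return false;               // corrupt or truncated payload
  pixelsLen = outLen;           // e.g. several KB from a <=1470-byte datagram
  return true;
}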

udpx is the first project using Espressif boards where we can use the full connectivity possibilities that the ESP32 offers:

  1. Bluetooth WiFi configuration. That’s one of the main reasons the ESP32 offers BLE and Bluetooth serial, so here we put it to use.
    udpx-app is an open-source Android application that offers BLE/Bluetooth serial configuration to send WiFi credentials, plus mDNS discovery using zeroconf.
  2. WiFi UDP and decompression. The ESP32 has enough memory to pack Brotli, miniz (Zlib) and Bluetooth into one sketch. I did not believe this initially, but udpx is the proof of concept that it can be done.

Pixels is the library that does the UDP interpretation: it reads the headers and sends the RGB(W) brightness values of each LED to the Neopixels. It was initially developed by Samuel, our colleague in the USA, and we also work as a team with Hendrik in Frankfurt am Main to develop a backend controller that will be able to chain animations and route them to multiple ESP32 controllers every millisecond. So this is a very exciting project; it is going slowly because everyone is busy with many things at once, but it’s one I personally steal hours of sleep for whenever I can, just because it’s interesting and beautiful to work on.
The last addition was the PIX565 protocol, inspired by Spectre’s own take on the OctoWiFi LED Controller, but also one that has been used in old game consoles. I didn’t know about this, but Neopixels developer Michael Miller put it in words today:

Games also use alternate bit depth images like 565 and even others; to compress in memory textures. Lots of writeups on that and how shaders can use them to render. The graphics stuff you learn for LEDs often overlaps with gaming techniques.
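For context, 565 packs a pixel into 16 bits (5 bits red, 6 green, 5 blue) instead of 24, a third less data per pixel. A minimal conversion sketch:

#include <cstdint>

// Pack 24-bit RGB into a 16-bit 565 value: 5 bits red, 6 green, 5 blue.
// Green gets the extra bit because the eye is most sensitive to it.
uint16_t rgbTo565(uint8_t r, uint8_t g, uint8_t b)
{
  return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3);
}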

Remora

This streaming little fish has been swimming for a while now, and I’m proud that some music hackers are using it in professional Eurorack setups.

It was initially built for the ORCΛ sequencer, and receives short UDP commands of 4 to 6 characters to launch fast animations at a very high framerate. In the last months it gained the addition that if it receives more than 9 bytes, it renders a pixel animation frame instead, so you can send short animations, but also a small animated video.

Remora was also the first product that I could imagine as a Tindie product, but I must confess that I never sold one, even though the price I set mostly just covers the hardware. I’m good at building, but not at selling. Everyone has to find their strong point.

 

Cordova cross platform mobile applications

After the small Chrome App experience, which resulted in a very compact app that reads video and sends a UDP binary stream to ESP32 LED controllers, I decided to start learning how to program a real mobile app.
Real because I find that the result must end up in the app store, where the user can install it with a few clicks.
This starter guide is only valid for Linux, but the approach should be similar on Mac.

1.- Install Java 8 SDK
https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
2.- Cordova
https://cordova.apache.org/#getstarted

It’s very straightforward to start if you have previous HTML/JavaScript experience building a web page.

In this github repository:
https://github.com/martinberlin/udpx-app

I’m building my very first Android app. Follow the commits and watch the repository to see my new steps and also my failures in this new world.

This is a preview of how the app looks in its first version:

udpx Android application is available in the Play store.
udpx app

Soon more posts will follow explaining how to interface with the Android OS (the camera, for example). There will soon be more issues to expand and implement in this repository. And my new philosophy for building firmware for Espressif chips will simply be:

“One firmware, one mobile application”

Meaning that every firmware will have its own app, where you can configure and test it, with a one-click install from the Google Play Store and some very easy setup: starting with WiFi credentials over BLE (Bluetooth Low Energy) and setting up the app’s IP address as a last step.

My ultimate goal is that the Internet of Things becomes easy to use, not something that needs a complicated 20-minute setup before you can start using it. So the mission is that in the first weeks of 2020, all master branches of our main ESP32 firmwares, like udpx and Remora, should be Bluetooth-configurable.

Meet Remora, a simple firmware and language to launch addressable LED animations from ORCΛ

Remora listens for short UDP commands to trigger LED animations. It receives ; commands from ORCΛ.

As a fun side idea while learning Node.js, and while working on a bigger and more complex project with my partner Hendrik, I programmed Remora just to have an easy way to launch animations from ORCΛ.

If you want to try it out please check the requirements and components on the repository or on the hackaday project page.
After getting addressable LEDs and any ESP32 board, it’s easy to compile this with Arduino or PlatformIO by reading the provided instructions. It needs a simple configuration: your WiFi credentials and the length of your LED strip.
After initializing and getting online, the firmware will listen by default on port 49161 and launch short animations as fast as the UDP messages hit the chip. To test it, send a UDP message using netcat or any other way. In Linux that can be done like this:

echo '82O'|nc -w1 -u ESP32_IP 49161

IMPORTANT: If you copy the text above, make sure the quotes are plain single quotes; otherwise the firmware will not recognize which animation to start.
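On the firmware side, the principle is simple. This is not Remora’s actual code (startAnimation() is a placeholder dispatcher), just a sketch of receiving such a command over UDP with the Arduino WiFiUDP API:

#include <WiFiUdp.h>

WiFiUDP udp;
char packet[255];

void startAnimation(const char *cmd); // placeholder dispatcher

void setupUdp() { udp.begin(49161); } // Remora's default port

void pollUdp()
{
  int len = udp.parsePacket();
  if (len == 0) return;                         // nothing received yet
  int n = udp.read(packet, sizeof(packet) - 1);
  packet[n] = '\0';                             // commands like "82O" are short ASCII
  startAnimation(packet);                       // map the command to an animation
}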

That is enough if you want to send animations to one controller. If you need more, we created a simple UDP redirector script that lives in the extras folder. It can be run using:

nodejs udproxy.js

After configuring a simple ip-config.json file that has an id to IP lookup table:

{
    "1":"192.168.0.74",
    "2":"192.168.0.73"
}

When this is running in the background, it will process any incoming message, extract the first character and look that ID up in the table. Then it will redirect the rest of the message to that IP, on the same port 49161. To explain it more graphically, sending:

echo '282O'|nc -w1 -u localhost 49161

will look up ID 2 and send 82O to 192.168.0.73, enabling you to send animations to different LED strips from the Orca sequencer software.
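The routing logic boils down to a one-character table lookup. Rendered as a C++ sketch for clarity (the real implementation is the Node.js script in the extras folder):

#include <map>
#include <string>

// Mirror of ip-config.json: first message character -> destination IP
std::map<char, std::string> ipTable = {
  {'1', "192.168.0.74"},
  {'2', "192.168.0.73"},
};

// Strips the routing prefix from msg and returns the destination IP;
// the caller then forwards the remaining bytes to that IP, port 49161.
std::string route(std::string &msg)
{
  char id = msg.front();
  msg.erase(0, 1);
  return ipTable.at(id); // throws if the ID is not configured
}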

Remora Logo

Schematics

Electronic schema

Additional info

Public repository
https://github.com/martinberlin/Remora Stable branch: master (only ESP32)
Hackaday project page for Remora – Find component list there.
Approximate cost of hardware not including 3DPrint: 19€ or about 22 dollars

Starting to do smart-home appliances with ESP-Mesh

“ESP-Mesh reduces the loading of smart light devices on the router by forming a mesh with the smart light devices.”

Months ago I started getting some smart lights for home, and I chose Osram since it was about half the price of Philips Hue. But the thing with these systems, though they work nicely and with very little configuration, is that you always need a “Gateway”: a central point that receives signals and then sends, via radio frequency, the commands the lights have to execute.

Espressif’s ESP-Mesh takes a different approach: every device is a network member, forming a mesh of interconnected devices. There is no gateway because there is no need for one.

There is also a very important point that is very interesting if you are a maker like me: it’s open source. That means you don’t need to buy a 60€ light to test it; you can just go to the ESP-Mesh GitHub repository, download the ESP-Mesh-Light example and compile it on one of your existing ESP32 boards. Then you can easily get something like this working:

So what I’m working on in my free time, to take a rest from other pending projects, is to take this PWM output and amplify it using a 74HC125 quad bus buffer to power more than one LED.

For those interested in reproducing this example:

https://github.com/espressif/esp-mdf/tree/master/examples/development_kit/light

Only 2 command lines are needed to compile it:

make menuconfig
make erase_flash flash

Flash this onto your ESP32 and then download the official Espressif Android app, or if you are using an iPhone / iPad just search for “ESP-Mesh”.
The mesh devices are configured using Bluetooth, so keep in mind to have it enabled on your device. Instead of taking the “WiFi Manager” approach, using an app has the benefit that you can send the WiFi credentials plus configuration directly to the ESP32.

Postman self-documenting tests and runner import

Over the last days I’ve been doing extensive API testing and needed an easy way to document my tests. Postman already offers something very useful that uploads your test documentation to their website. But sometimes we just need a simple HTML file that can be delivered privately to the client.

That’s where this project was born:

https://github.com/martinberlin/postman-reporter

The intention of this simple PHP script is to generate a standalone HTML file for your Postman tests that you can send to the client without needing to upload all the tests to the open internet.

It serves to achieve two things:

  1. Make a standalone HTML document from your Postman Collections
  2. Import the test run-results into a MySQL database

With the second one, only the importing is done; it’s then up to you how to present this data. It populates two tables, resume and detail: the first one with the performance result and the detailed one with a line per test. Much more information can be extracted from the runner JSON; this is just a kickstart idea. Have fun with it!

If it catches your interest, please read more details in the GitHub repository.

3D Printer upgrade

Last year, when I started 3D printing, I bought my first printer just to see if I would find my way into it.

Anycubic Delta “Kossel Plus”

Since then I found out that I could remember my university days studying graphic design, and that I can actually design pretty well. But from that to product design there are a million light years. Anyway, I like Blender a lot as 3D modeling software, and I’m starting to get quite advanced at making my own models.

That, combined with my passion for soldering electronic stuff and creating new devices, has found its way. So it’s time for a more professional upgrade.

Last month I purchased a new Prusa MK3.

Photo by the real Tiggy, Flickr: https://www.flickr.com/photos/mjtmail/

I will be really happy to post my results with the new machine and share with all of you the experience of working and creating new stuff with it.

Reading an image bitmap file from the web using ESP8266 and C++

There are a couple of different ways to do it, but I wanted to work it out following a simple image example, to understand a bit better how to read a stream from the web, get down to the pixel information, and send it to a display. As a reference I started with the GxEPD library:

There are a couple of basic things to understand when dealing with streams of information. One is that the information comes in chunks, especially with a large file, so buffering whatever is coming is essential if you want to read from it. The first examples I did without buffering just filled the 7.5″ e-ink display with separated lines that only resembled part of the web screenshot I was sending it.

So how does an image you request from the web actually arrive?

First of all, like any other web content, a request is made to an endpoint, to whatever script or API delivers the image. This part of the code is well reflected here:

String request;
  request  = "GET " + image + " HTTP/1.1\r\n";
  request += "Accept: */*\r\n";
  request += "Host: " + host + "\r\n";
  request += "Connection: close\r\n";
  request += "\r\n";
  Serial.println(request);

  if (! client.connect(host, 80)) {
    Serial.println("connection failed");
    client.stop();
    return;
  }
  client.print(request); //send the http request to the server
  client.flush();

In this case it is a GET request. Then, for a Windows BMP image, one of the easiest formats to read, the first thing is to check the starting bytes, which for a .bmp file are the 2 bytes 0x4D42 (“BM” in ASCII, read little-endian).

But before that, when you send a request and the server replies with a response, the response comes with headers. For example, it looks something like this:

HTTP/1.1 200 OK
Host: display.local
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:61.0)
Accept: text/html

(And some more that I will spare here, ending with an empty line containing only “\r”, known as carriage return.)

Then after this the image should start. So there are two choices:

1 To make something that loops reading the first lines, discarding the headers, and then attempts to read the 2 starting bytes of the image (a minimal sketch of this follows below)

2 To read from the start, including the headers, and scan 2 bytes at a time until we find 4D42, which marks the start of the image
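Here is the sketch of the first approach, assuming client is the already-connected WiFiClient from the request above and read16() is the helper defined further below:

// Read and discard the HTTP response headers line by line.
// The headers end with an empty line; readStringUntil('\n')
// leaves the trailing "\r" in place, so that is what we test for.
while (client.connected())
{
  String line = client.readStringUntil('\n');
  if (line == "\r") break;     // blank line: headers are over
}
uint16_t signature = read16(); // should now be 0x4D42 ("BM")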

Between the two I prefer the first, since it looks cleaner. If we were to take the second one, for this image it would look like this (note it is a 4-bit BMP):

5448 5054 312F 312E 3220 3030 4F20 D4B 440A 7461 3A65 5720 6465 202C 3130 4120
6775 3220 3130 2038 3131 343A 3A38 3334 4720 544D A0D 6553 7672 7265 203A 7041
6361 6568 322F 342E 312E 2036 4128 616D 6F7A 296E 4F20 6570 536E 4C53 312F 302E
312E 2D65 6966 7370 5020 5048 372F 302E 332E D30 580A 502D 776F 7265 6465 422D
3A79 5020 5048 372F 302E 332E D30 430A 6E6F 656E 7463 6F69 3A6E 6320 6F6C 6573
A0D 7254 6E61 6673 7265 452D 636E 646F 6E69 3A67 6320 7568 6B6E 6465 A0D 6F43
746E 6E65 2D74 7954 6570 203A 6D69 6761 2F65 6D62 D70 D0A 310A 3065 3637 A0D
4D42 ->BMP starts here. File size: 122998
Image Offset: 118
Header size: 40
Width * Height: 640 x 384 / Bit Depth: 4
Planes: 1
Format: 0
Bytes read:122912

Then, as we can see in this example, right after the signature come the image headers themselves, which are read with this part of the code:

uint16_t bmp = read16(); // read the 2-byte BMP signature
if (bmp == 0x4D42)
{
    uint32_t fileSize = read32();
    uint32_t creatorBytes = read32();
    uint32_t imageOffset = read32(); // Start of image data
    uint32_t headerSize = read32();
    uint32_t width  = read32();
    uint32_t height = read32();
    uint16_t planes = read16();
    uint16_t depth = read16(); // bits per pixel
    uint32_t format = read32();
}
uint16_t read16()
{
  // Reads 2 bytes and returns them
  uint16_t result;
  ((uint8_t *)&result)[0] = client.read(); // LSB
  ((uint8_t *)&result)[1] = client.read(); // MSB
  return result;
}

uint32_t read32()
{
  // Reads 4 bytes and returns them
  uint32_t result;
  ((uint8_t *)&result)[0] = client.read(); // LSB
  ((uint8_t *)&result)[1] = client.read();
  ((uint8_t *)&result)[2] = client.read();
  ((uint8_t *)&result)[3] = client.read(); // MSB
  return result;
}

In there come 2 very important pieces of information, without which it is impossible (or at least I couldn’t find out how) to read the pixels. One is Image Offset: 118, which means the image data starts at byte 118. The other is the depth, which says how many bits represent one single pixel: with 1 bit we can store a black-and-white image, and if we want full RGB we need 24 bits per pixel, i.e. 1 byte for each color (red, green and blue).
Our dear Wikipedia says about this:

For an uncompressed, packed within rows, bitmap, such as is stored in Microsoft BMP file format, a lower bound on storage size for an n-bit-per-pixel (2^n colors) bitmap, in bytes, can be calculated as:

size = width • height • n/8, where height and width are given in pixels.
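We can check this against our 4-bit example: 640 • 384 • 4/8 = 122880 bytes, which is exactly the reported file size (122998) minus the 118-byte image offset.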

So there we have the Image Offset: 118, but while reading these headers we already consumed a number of bytes from the client (tracked in the bytesRead counter). So we need to make up the difference and start reading the image:

// Attempt to move pointer where image starts
client.readBytes(buffer, imageOffset-bytesRead);

That should be it: we then need to read every row up to the reported width (in our example 640) inside a height loop of 384 pixels, and read each pixel taking the pixel depth into account. In the code example this looks a bit rough around the edges:

    if ((planes == 1) && (format == 0 || format == 3)) { // uncompressed is handled
      // Attempt to move pointer where image starts
      client.readBytes(buffer, imageOffset - bytesRead);
      size_t buffidx = sizeof(buffer); // force buffer load

      for (uint16_t row = 0; row < height; row++) // for each line
      {
        uint8_t bits = 0;
        for (uint16_t col = 0; col < width; col++) // for each pixel
        {
          // Refill the buffer when it has been fully consumed
          if (buffidx >= sizeof(buffer))
          {
            client.readBytes(buffer, sizeof(buffer));
            buffidx = 0; // Set index to beginning
          }
          uint16_t bw_color;
          switch (depth)
          {
            case 1: // one bit per pixel b/w format
              {
                if (0 == col % 8) // load a new byte every 8 pixels
                {
                  bits = buffer[buffidx++];
                  bytesRead++;
                }
                bw_color = bits & 0x80 ? GxEPD_BLACK : GxEPD_WHITE;
                display.drawPixel(col, displayHeight - row, bw_color);
                bits <<= 1;
              }
              break;

            case 4: // was a hard one to get here: two pixels per byte
              {
                if (0 == col % 2) // load a new byte every 2 pixels
                {
                  bits = buffer[buffidx++];
                  bytesRead++;
                }
                // High nibble first: threshold the 0..15 gray level
                bw_color = (bits & 0xF0) < 0x80 ? GxEPD_BLACK : GxEPD_WHITE;
                display.drawPixel(col, displayHeight - row, bw_color);
                bits <<= 4; // shift the low nibble up for the next pixel
              }
              break;

            case 24: // true color: one byte each for blue, green and red
              {
                uint8_t b = buffer[buffidx++];
                uint8_t g = buffer[buffidx++];
                uint8_t r = buffer[buffidx++];
                bw_color = ((r + g + b) / 3 > 0xFF / 2) ? GxEPD_WHITE : GxEPD_BLACK;
                display.drawPixel(col, displayHeight - row, bw_color);
                bytesRead = bytesRead + 3;
              }
              break;
          }
        } // end pixel
      } // end line

And I still have an issue where I haven't found why it does not work. This code works well and I can see 4-bit and 24-bit images, but it hangs on 1-bit images.
It's something about the headers: using approach 1 described before, i.e. discarding the headers, the 1-bit image works, but then the other depths (4/24) do not.
It's maybe some basic thing about how the byte stream comes that I'm not getting, or I'm simply missing something stupid enough not to get around it.
There are other, better examples in ZinggJM's repositories that deal much better with the buffering and other aspects, where the BMP reading truly works. But sometimes I like to understand the stuff and fight with it before implementing something, since it's the only way to learn how things work.
Have you ever thought about how far along we are, that every OS and every browser has the resources to read almost any existing image or video format? How many megabytes of software is that? ;)
That's what I love about coding simple examples in C++ on the Espressif chips: you need to go deep. There is no such thing as a ready-made json_decode or do-whatever library as in PHP; you need to read it from the bits. But the cool thing is that if you get around it, then you have a grasp of what needs to be done to read, in this case, a very simple bitmap format image. I cannot imagine how to read a compressed JPG or PNG; for that, yes, I will put my head down and use some library.
UPDATE: After about 4 hours of fighting I found out why. It's the fact that I was reading the bytes in chunks of 2. Reading them one by one and adding lastByte to the comparison makes it work for both 1- and 4-bit images. I can post the solution here if someone is interested, but if not, I will keep it as is to avoid making this a boring long read.