There are two affordable models that I would like to cover in this post. The first one is the 400×300 model from Good Display with an FT6336 touch controller; you can see it in action in the second video below (the first one still had the wrong waveform).
Better but not perfect. Maybe someone will find a faster mode since my bed seems much more interesting at the moment than making float ink particles pic.twitter.com/njz1j950XE
1 GND
2 VDD -> 3.3V only, not 5V!
3 RST (Not used in this component)
4 INT
5 SDA
6 SCL
It’s important to know that each model uses a different pinout on the 6-pin FPC cable, so if you had the great idea of building a universal adapter, you will fail (guess who tried?). Please note also that I do not use the Good Display touch adapter for this, since I do not like that it can be used either for touch or for SPI but not for both, so I chose a simple 6-pin FPC breakout that you can find cheaply on AliExpress.
Note there is a small gotcha with these! Sometimes the FPC is bottom-contact only, so make sure to plug it in with the contacts facing down (disregard this if it has double-sided contacts). Also, since we are wiring this very raw touch interface directly to the MCU, there are some technical things to keep in mind (a minimal wiring sketch follows below):
1. I2C is a bidirectional bus and needs pull-up resistors to 3.3V on both the SDA and SCL lines (4.7 kΩ or even 6 kΩ will work).
2. The INT pin goes low when there is an event to read via I2C. As with the bus lines, you should not leave this pin floating: add a pull-up to 3.3V here as well.
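To illustrate both points, here is a minimal ESP-IDF sketch that sets up the I2C master and the INT GPIO with the internal pull-ups enabled (external resistors are still the safer option). The GPIO numbers and the FT6X36 I2C address are assumptions for this sketch, so adjust them to your wiring:

#include "driver/i2c.h"
#include "driver/gpio.h"

// Assumed wiring for this sketch: adjust to your board
#define TOUCH_SDA GPIO_NUM_21
#define TOUCH_SCL GPIO_NUM_22
#define TOUCH_INT GPIO_NUM_4
#define FT6X36_ADDR 0x38   // assumed FT6X36 7-bit I2C address

void touch_bus_init(void)
{
    // I2C master with internal pull-ups enabled (external 4.7K is still recommended)
    i2c_config_t conf = {};
    conf.mode = I2C_MODE_MASTER;
    conf.sda_io_num = TOUCH_SDA;
    conf.scl_io_num = TOUCH_SCL;
    conf.sda_pullup_en = GPIO_PULLUP_ENABLE;
    conf.scl_pullup_en = GPIO_PULLUP_ENABLE;
    conf.master.clk_speed = 400000;
    i2c_param_config(I2C_NUM_0, &conf);
    i2c_driver_install(I2C_NUM_0, I2C_MODE_MASTER, 0, 0, 0);

    // INT is active low: keep it pulled up and watch for a falling edge
    gpio_config_t io = {};
    io.pin_bit_mask = 1ULL << TOUCH_INT;
    io.mode = GPIO_MODE_INPUT;
    io.pull_up_en = GPIO_PULLUP_ENABLE;
    io.intr_type = GPIO_INTR_NEGEDGE;
    gpio_config(&io);
}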
The 2.7″ smaller brother
The smaller 2.7″ model also has an optional touch screen. This model, called Gdey027T91T, with a 264×176 pixel resolution, seems to be out of stock at the moment, but you can get the touch panel separately and carefully stick it on top of the epaper display.
In the video you can see how fast the partial update works on this model compared with my 4.2″ version. The touch FPC pinout for this model is:
1 GND
2 INT
3 RST (Not used in this component)
4 VDD
5 SCL
6 SDA
Also, if you want to use the Espressif IDF framework to handle touch events plus an epaper component with GFX, we highly recommend trying Cale-idf. Cale is a component that has already been in development for two years and supports the most common epapers from Good Display / Waveshare, including these two touch models. It also has the interesting, and sometimes important, feature that if you design something able to rotate the screen, the touch coordinates adapt to that rotation. This only works on certain classes, like the 2.7″, since no one has requested this feature for the 4.2″ display class yet. In our example demo-keyboard.cpp in the Cale component you can clearly see how this is implemented:
// Include touch plus the right class for your display
#include "FT6X36.h"
#include "goodisplay/touch/gdey027T91T.h"

// INTGPIO is the touch interrupt: it goes low when a touch is detected, and the coordinates are then read via I2C
// Use an RTC IO if you need to wake up with touch!
FT6X36 ts(CONFIG_TOUCH_INT);
EpdSpi io;
// At this point we inject the touch instance into the display class
Gdey027T91T display(io, ts);

uint8_t display_rotation = 3; // 1 or 3: Landscape mode
int t_counter = 0;

// This callback function will be fired on each touch event
void touchEvent(TPoint p, TEvent e)
{
  ++t_counter;
  printf("X: %d Y: %d count:%d Ev:%d\n", p.x, p.y, t_counter, int(e));
}

// Entry point of our firmware
void app_main()
{
  // Initialize display and SPI
  display.init(false);
  // When using the touch-integrated class (T suffix), rotating the display
  // rotates the touch X,Y coordinates too.
  display.setRotation(display_rotation);
  display.registerTouchHandler(touchEvent);

  // You could also launch this in a FreeRTOS task
  for (;;) {
    display.touchLoop();
  }
}
lv_port_esp32-epaper is the latest successful attempt to design UX in C using an Espressif ESP32. If you like the idea, please hit the ★ in my repository fork.
What is LVGL?
LVGL stands for “Light and Versatile Graphics Library” and allows you to design an object-oriented user interface on supported devices. So far it mostly supports TFT screens, and only a few slow SPI epapers were supported. My idea is to add driver support so it also works on fast parallel epapers.
It took me some work but at the end the light controller is working. So far added only Hue, Bright and White channel. LVGL with @lilygo9 EPD47 pic.twitter.com/6ZudrEb1OW
That is the Lilygo EPD47: a parallel epaper with an ESP32 WROVER and an I2C touch interface that can be found on AliExpress.
The main idea is to use a bridge driver that pushes the pixels to the EPDiy component using the set_px_cb and flush callbacks in order to render the layouts on the supported epapers. This has a performance hit, but it also allows us to draw UX interfaces on parallel epapers, which are quite fast at flushing a partial refresh. The development took about one month of research and many iterations until it became usable. I started with an easy choice, since Lilygo once sent me a parallel epaper as a gift, and I bought the rest in their official store. The idea is that this acts as a proof of concept to demonstrate that it is possible and that it works as expected. It’s possible to design a UX directly in C and then, using a controller like the ESP32, directly interact with home appliances such as lights or other devices, to control them or to read information from sensors that respond with short JSON messages to inform your epaper control board about temperature or whatever else you choose.
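To give a rough idea of what that bridge looks like, here is a minimal sketch of the two callbacks, assuming LVGL v8 and EPDiy's epd_draw_pixel(); the framebuffer handling and the omitted update call are simplified assumptions, not the exact code from lv_port_esp32-epaper.

#include "lvgl.h"
#include "epd_driver.h" // EPDiy (assumed include name)

static uint8_t *epd_fb; // EPDiy framebuffer, obtained elsewhere in the port

// set_px_cb: LVGL asks us to write one pixel; convert its color to grayscale
// and push it straight into the EPDiy framebuffer.
static void epd_set_px_cb(lv_disp_drv_t *drv, uint8_t *buf, lv_coord_t buf_w,
                          lv_coord_t x, lv_coord_t y, lv_color_t color, lv_opa_t opa)
{
    (void)drv; (void)buf; (void)buf_w; (void)opa;
    // Both LVGL brightness and EPDiy grayscale go from 0 (black) to 255 (white);
    // invert here if your panel renders the opposite way.
    uint8_t gray = lv_color_brightness(color);
    epd_draw_pixel(x, y, gray, epd_fb);
}

// flush_cb: the area is already in the EPDiy framebuffer, so we only need to
// trigger the epaper update here (EPDiy update call omitted) and tell LVGL we are done.
static void epd_flush_cb(lv_disp_drv_t *drv, const lv_area_t *area, lv_color_t *color_p)
{
    (void)area; (void)color_p;
    // ...EPDiy partial/full update of the dirty area would go here...
    lv_disp_flush_ready(drv);
}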
After almost one year up and running, my service to deliver images to epapers and TFT displays is finally starting to get some adoption. The idea started at the beginning of 2020, when epapers and many great projects like EPDiy on Hackaday were getting their first adopters.
Our ESP32 Firmware does 3 things at the moment and is very easy to set up:
It connects to cale.es and downloads a screen bitmap.
In “Streaming mode” it pushes the pixels to the Adafruit GFX buffer and at the end renders it on your epaper.
It goes to sleep for the number of minutes you define (a minimal sketch of this step follows below).
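As an illustration of that last step, the sleep call on ESP-IDF looks roughly like this; SLEEP_MINUTES is a placeholder for whatever value you configure:

#include <stdint.h>
#include "esp_sleep.h"

#define SLEEP_MINUTES 60 // placeholder: use your configured refresh interval

void deep_sleep_until_next_refresh(void)
{
    // Wake up again after N minutes, then power down everything else
    esp_sleep_enable_timer_wakeup((uint64_t)SLEEP_MINUTES * 60ULL * 1000000ULL);
    esp_deep_sleep_start();
}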
But then I needed to research more and a bigger idea was triggered: it was not enough to make an Arduino-esp32 firmware using GxEPD as a library. I wanted to learn more about how epapers work, and also to get out of Arduino-esp32 and more into the Espressif IDF framework. It was hard, and I had some weeks where I achieved nothing, but after about one entire month of coding I finally saw the first small epaper refresh. Soon there were 5 more models.
It was a long and time-consuming journey. But I think it was worth it, and I see that at least 20% of the users keep their screens connected, enjoying very low-consumption calendars and photo frames at home. I'm very happy to make this possible and to offer an alternative to the usual Arduino-esp32 firmware. Something that you can hack, that is more understandable, and that uses Espressif's own framework. It might not be very well known among makers, but it is undoubtedly used in the professional industry and it's a very good alternative, with lots of examples and very good documentation.
The next missions are to start making developer tools and examples that introduce uGFX interface design on the ESP32 using epapers. There is a long journey ahead and we are very thankful for all the good feedback received so far.
DATA FLOW: Midi out > Sparkfun midi HAT > ESP32 > RX Uart > C++ Processing > FastLED RMT output
Meet Remora-matrix, a project that started with the idea of sniffing MIDI messages and making simple visuals on a LED matrix. Our previous firmware, Remora, was intended to receive short commands from ORCΛ and make very simple Neopixel animations on addressable LED strips (WS2812B and similar). With this one I decided to go a step further and make two versions: one that uses Node.js as a middleware (requires WiFi and UDP) and another that receives serial MIDI directly via the Serial2 (TX, RX) UART and hence requires no WiFi connection.
1. NODEJS VERSION
This version uses a middleware script that sniffs a MIDI port (e.g. USB) and converts the messages into UDP messages that fly over WiFi to a destination IP. This script lives at the moment in the middleware directory. An npm install needs to be run in order to install the required JS libraries.
cd middleware/midi-to-udp
nodejs midi.js
// Will list available midi entries. Requires port and udp destination IP
-p, --port_id   ID: Port name                                    [number] [required]
                0: Midi Through:Midi Through Port-0 14:0
                1: USB MIDI Interface:USB MIDI Interface MIDI 1 20:0
-u, --udp_ip                                                     [string] [required]
// EX. listen to port 1 USB Midi and forward them to UDP x.x.x.x:49161
// Port is fixed to the ORCA default port, feel free to update it ^
nodejs midi.js -p 1 -u 192.168.12.109
This script will simply run in the background and redirect the MIDI messages using a simple short message format that we designed to be compatible with both the UDP and Serial versions.
2. MIDI SERIAL VERSION
This version uses the Sparkfun Arduino MIDI HAT, which costs around 15 US$ and can be found both on eBay and AliExpress. The easy task is to build a connecting PCB below it that hosts the ESP32 at the side of the HAT, with the RX and TX cables from the HAT outputs connected to the ESP32. This HAT has an opto-isolator (also called an optocoupler) that converts the MIDI signals into readable UART messages. My prototype construction looks like this:
MIDI HAT alternative: you could also build the MIDI-to-Serial circuit yourself, but in my case it was easier to get it ready-built.
The MIDI HAT was designed for Arduino and requires 5 volts to run, so the 4-cable wiring is pretty straightforward:
HAT MIDI > ESP32
5V > 5V
GND > GND
RX > 26
TX > 27
I’m quite sure the TX goes to the TX on the ESP32, but it may also be the opposite; I don't think there is a standard for this. In case it does not work, just swap them: these are only signals, so you won't break anything by trying. The advantage of Serial is that it has less latency than WiFi. Depending on how clean your connection is, WiFi UDP packets can sometimes get clogged and then arrive all at once, which is quite an undesirable effect if you are working with live music. Also, UDP is by nature designed to be very fast, but it has no ordering like TCP, so it's possible that a played note arrives in a different order than expected. Or, if the WiFi is shared, the packets may accumulate while your router is busy and then be sent all together, causing a burst of shapes on the matrix at a moment that does not correlate with the music. This does not happen with the Serial version, since there is no middleware redirecting packets and it can run without any PC in the middle. As there are no WiFi messages flying around, it has less latency and is much more reactive and fun to work with. The only pitfall is that you need a MIDI cable from your computer or synthesizer to the MIDI HAT + ESP32 controller. Most live music lighting equipment does not rely on WiFi, and there is a good reason for it: reliability.
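For reference, opening that UART from the firmware could look roughly like this, assuming the Arduino framework on the ESP32; the pins match the wiring above and 31250 baud is the standard MIDI rate. Swap RX and TX if nothing arrives:

#include <Arduino.h>

// Assumed pins from the wiring above; swap them if no data arrives
static const int MIDI_RX_PIN = 26;
static const int MIDI_TX_PIN = 27;

void setup() {
  Serial.begin(115200);                                        // USB log
  Serial2.begin(31250, SERIAL_8N1, MIDI_RX_PIN, MIDI_TX_PIN);  // standard MIDI baud rate
}

void loop() {
  // Each incoming MIDI byte is read here and can be fed to the message parser
  while (Serial2.available()) {
    uint8_t b = Serial2.read();
    Serial.printf("MIDI byte: 0x%02X\n", b);
  }
}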
Building our own internal MIDI messaging system
Since these two versions aim to achieve the same goal, which is converting the MIDI being played into shapes, I thought of creating a shared internal message format. Maybe this can become a C++ class or component in the future, so both versions speak the same language no matter which one you use. The result is very simple: we keep the Channel internal and use only Note + Status + Velocity. Status and Channel come in the first byte, then comes the Note, and at the end the Velocity. Once we get the last byte, we assemble the message with the following syntax:
2 hex chars representing the Note played
1 boolean representing the Status (1 = note on, 0 = note off)
2 hex chars representing the Velocity
Format: NNSVV (Note, Status, Velocity)
Example: playing DO in octave 3, which is 36 in decimal (0x24), with velocity 60 (0x3C), the Note ON message would be:
2413C
When the same note is released, it could be:
24000
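A minimal sketch of how such a message could be assembled from the raw MIDI bytes (the function name and the exact field handling are illustrative, not the actual Remora-matrix code):

#include <cstdio>
#include <cstdint>

// Builds the 5-char NNSVV message: Note (2 hex), Status (1/0), Velocity (2 hex).
// `status` is the raw MIDI status byte (0x9n = note on, 0x8n = note off).
void buildMessage(uint8_t status, uint8_t note, uint8_t velocity, char out[6])
{
    bool noteOn = ((status & 0xF0) == 0x90) && velocity > 0;
    std::snprintf(out, 6, "%02X%d%02X", note, noteOn ? 1 : 0, noteOn ? velocity : 0);
}

// Example: buildMessage(0x90, 36, 60, msg) -> "2413C"; releasing it -> "24000"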
After building the message, the channel is analyzed. The firmware can either listen on all channels, by leaving the constants in platformio.ini at 0, or listen on 3 different channels (so 3 instruments). This can of course be modified, but 3 is a good balance to see something that correlates with the music. The configuration for this lives in the platformio.ini file, using build_flags.
There you can see that this will only forward packets for channels 1, 2 and 15; all the rest will not be sent to the matrix. There is also an option to ignore the Velocity and use a fixed number (MIDI_FIXED_VELOCITY). And depending on the song, it could be played in a high octave or in a lower one. Because our matrix is limited, we need to define BASE_OCTAVE and TOP_OCTAVE so we have a drawing range. That is the most important MIDI configuration. It would be desirable to have a “learning phase” where the firmware simply listens to the first 10 seconds of a song and calculates these BASE and TOP margins automatically. That is a future idea that might be implemented.
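As a rough illustration of how such a configuration could look in platformio.ini (the channel flag names and the numeric values here are placeholders for this sketch; only MIDI_FIXED_VELOCITY, BASE_OCTAVE and TOP_OCTAVE are names taken from the text above):

build_flags =
  -D MIDI_CHANNEL_1=1
  -D MIDI_CHANNEL_2=2
  -D MIDI_CHANNEL_3=15
  -D MIDI_FIXED_VELOCITY=0
  -D BASE_OCTAVE=2
  -D TOP_OCTAVE=6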
Interpreting the messages
I left just a demo of how to interpret this in C++. As we have 2 different firmware versions, one that listens for UDP messages and another one that gets MIDI via UART, you have to select what to compile by editing the platformio.ini file:
[platformio]
default_envs = esp32
# Uncomment only one of the folders to select what example to run:
#src_dir = firmware/udp-midi-matrix
src_dir = firmware/midi-in-matrix
Every message at the end triggers a function that draws a shape, and that part is open to each different implementation. For example, you can draw a different shape per channel, like:
Ch1 – Usually piano or main instrument – Rectangles
Ch2 – Triangles
Ch3 – Circles
Ch4 – Lines
and so on.
As said, this is just an example, but it's open to drawing anything you want, since we are using GFX over the RGB LED matrix. You also have the Velocity, which is the pressure applied to the key, so you can use this factor to make the shape bigger or change colors. The possibilities are unlimited. There is only one important thing to keep in mind: a note with status 1 should be drawn, but the same note with status 0 signals that the key was released, hence we should delete the shape. At the moment this is just an experiment that may never see the light outside of my studio, but nevertheless I wanted to leave this post as a declaration of intentions, in case someone wants to fork this and make their own take.
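As a sketch of that note-on / note-off handling on top of a GFX-capable matrix, assuming FastLED_NeoMatrix, a 16x16 panel on data pin 5, and an illustrative note-to-column mapping (none of these values come from the project):

#include <FastLED.h>
#include <FastLED_NeoMatrix.h>

#define MATRIX_W 16
#define MATRIX_H 16
CRGB leds[MATRIX_W * MATRIX_H];
// Layout flags depend on how your matrix is wired; these are just an example
FastLED_NeoMatrix matrix(leds, MATRIX_W, MATRIX_H,
                         NEO_MATRIX_TOP + NEO_MATRIX_LEFT +
                         NEO_MATRIX_ROWS + NEO_MATRIX_PROGRESSIVE);

void setupMatrix()
{
  FastLED.addLeds<WS2812B, 5, GRB>(leds, MATRIX_W * MATRIX_H); // data pin 5 (example)
  matrix.begin();
  matrix.setBrightness(40);
}

// Draw a bar on note-on, draw it in black on note-off to erase it.
// The note-to-column mapping assumes a range starting at BASE_NOTE (illustrative).
void drawNote(uint8_t note, bool on, uint8_t velocity)
{
  const uint8_t BASE_NOTE = 36;                              // lowest note we map
  int16_t x = constrain((int)note - BASE_NOTE, 0, MATRIX_W - 1); // column for this note
  int16_t size = 1 + (velocity * (MATRIX_H - 1)) / 127;      // velocity scales the shape
  uint16_t color = on ? 0x07E0 /* green in RGB565 */ : 0x0000 /* black erases */;
  matrix.fillRect(x, MATRIX_H - size, 1, size, color);       // a simple 1-px wide bar
  matrix.show();
}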
NeoMatrix lets you map the RGB LED matrix to use GFX
I wrote to Marc Merlin, who did the amazing job of adding GFX support to FastLED, and here I wanted to quote his answer:
About FrameBuffer GFX: The good news is that your code will now run mostly unmodified on other displays like LCDs, or RGBPanels, or even display on linux. Like this you can write all your code, run it, and debug it on linux, and then upload it to ESP32 or rPI when it’s done. After that, you can go big!
Marc
Remora is starting to be a bit more responsive now that midi is not flying via UDP but directly connected to the ESP32 via @sparkfun midi Cc @hputzek pic.twitter.com/hCA59eY6FT