Description
I am trying to understand how to let an Arduino sketch use the touch events generated
by the GIGA Display Shield.
If I look at the overlay file that is defined for the GIGA in zephyr:
D:\github\zephyr\boards\shields\giga_display_shield\boards\arduino_giga_r1_m7.overlay
It has:
```dts
&i2c4 {
	pinctrl-0 = <&i2c4_scl_pb6 &i2c4_sda_ph12>;
	pinctrl-names = "default";
	clock-frequency = <I2C_BITRATE_FAST>;
	status = "okay";

	gt911: gt911@5d {
		status = "okay";
		compatible = "goodix,gt911";
		reg = <0x5d>;
		alt-addr = <0x14>;
		reset-gpios = <&gpioi 2 GPIO_ACTIVE_LOW>;
		irq-gpios = <&gpioi 1 GPIO_ACTIVE_HIGH>;
	};
};
```
So it is using the GT911 touch input. It has a status of "okay", so it starts up, with its IRQ on GPIO I1,
and it is probably using the driver code in zephyr\drivers\input\input_gt911.c.
I found the zephyr example samples\subsys\input\draw_touch_events and was curious whether it would
build and run... So far it does not build:
```
D:/zephyrproject/zephyr/drivers/display/display_stm32_ltdc.c:422:25: error: 'CONFIG_VIDEO_BUFFER_SMH_ATTRIBUTE' undeclared (first use in this function)
  422 |     CONFIG_VIDEO_BUFFER_SMH_ATTRIBUTE,
      |     ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
D:/zephyrproject/zephyr/drivers/display/display_stm32_ltdc.c:422:25: note: each undeclared identifier is reported only once for each function it appears in
[41/48] Linking C static library zephyr\kernel\libkernel.a
```
I thought I would try doing the same thing in a sketch; however, I don't think this will work:

```c
#include <zephyr/kernel.h>
#include <zephyr/input/input.h>

/* Supporting declarations the snippet needs: the touch_point struct, the
 * semaphore, and the device handle (DT_NODELABEL(gt911) matches the overlay
 * above). */
static struct {
	int x;
	int y;
	bool pressed;
} touch_point;

K_SEM_DEFINE(sync, 0, 1);

static const struct device *const touch_dev =
	DEVICE_DT_GET(DT_NODELABEL(gt911));

static void touch_event_callback(struct input_event *evt, void *user_data)
{
	if (evt->code == INPUT_ABS_X) {
		touch_point.x = evt->value;
	}
	if (evt->code == INPUT_ABS_Y) {
		touch_point.y = evt->value;
	}
	if (evt->code == INPUT_BTN_TOUCH) {
		touch_point.pressed = evt->value;
	}
	if (evt->sync) {
		k_sem_give(&sync);
	}
}
INPUT_CALLBACK_DEFINE(touch_dev, touch_event_callback, NULL);
```
I don't believe INPUT_CALLBACK_DEFINE will work in a sketch, since it registers the callback statically at build time.
Wondering what the best way to handle this is?
Potentially define some callback within the ArduinoCore-zephyr code space,
maybe in loader\fixups.c?
Or turn off the interrupt and try to poll it?
Suggestions?
KurtE commented on Apr 28, 2025
Quick note:
The example sketch: ~/zephyrproject/zephyr/samples/subsys/input/input_dump
does build and run.
From the monitor window:
Note: if you hold your finger down at all, you will see a lot of messages like:
KurtE commented on May 3, 2025
Edit: meant to add it to this issue instead of display library issue.
Quick notes: It was unclear to me how I might adapt the Arduino_GigaDisplayTouch library to zephyr. There is a GT911 object
defined in the overlay for the GIGA board under the shield (GIGA Display Shield), which I know is defined and running.
As soon as I touch the display I get lots of these messages in the monitor window:
I know that the GT911 driver is looking for me to hook up a callback function, like some of the input samples show:

```c
INPUT_CALLBACK_DEFINE(touch_dev, touch_event_callback, NULL);
```

But I know that adding this define to a library that gets loaded as LLEXT code won't work. I was not sure whether there is any
dynamic way to set up the callback, so I asked on the Zephyr discussions:
zephyrproject-rtos/zephyr#89415
The response I received, from fabiobaltieri:
zephyrproject-rtos/zephyr#89415 (comment)
My read of this is that I would need to add the callback code to some place that is built into the loader, like maybe fixups.c.
That code would receive the callbacks, save the touch information, and maybe set a semaphore or the like;
the library code could then detect the semaphore and retrieve the last callback information.
Probably beyond my pay grade 😆, especially if your plans are to always use LVGL.
KurtE commented on May 4, 2025
Experiments update:
Trying directly on zephyr:
I generated a slightly modified version of one of the zephyr samples, sort of a cross between the two in
samples/subsys/input
I put a copy of it up at: https://github.com/KurtE/zephyr_test_sketches/tree/master/print_touch_events
Nothing much to it:
Which when I touch the display I get messages like:
With a reasonable response, although there is a delay for the release...
Built on Ubuntu and Windows using west build...
Note: west flash fails on Windows; I created an issue on this:
zephyrproject-rtos/zephyr#89434
I can flash on Windows using the Arduino dfu-util command:

```shell
"C:\Users\kurte\AppData\Local\Arduino15\packages\arduino\tools\dfu-util\0.10.0-arduino1/dfu-util" --device 0x2341:0x0366 -D "build\zephyr\zephyr.bin" -a0 --dfuse-address=0x08040000:leave
```
ArduinoCore-zephyr:
I added code similar to the above into fixups.c and added an entry to llext_exports.c.
In fixups, I extended the code under the GIGA/VIDEO #if.
Note: currently this is just an experiment; if it actually works out, it maybe should be conditional on whether LVGL is defined for the build.
Currently:
I added the callback, added a semaphore, and had it initialized in camera_ext_clock_enable.
In the exports I added the callback in the section:
So far, really really bad performance/latency!!!
The first touch sort of goes through, then I get lots of "syswq full" messages, then a delay, then
some events are processed, then "full" again, then more messages... And these happen long after my
finger was removed...
Edit: test sketch with the touch code in it:
https://github.com/KurtE/Arduino_GIGA-stuff/tree/main/sketches/zephyr/zephyr_GIGA_shield_touchpaint
Not sure if something is blocking? Threads? Priorities? ...
KurtE commented on May 5, 2025
@iabdalkader @pillo79 @facchinm @mjs513 (and all)
Quick update:
Current potential theory: the display output code interferes with the input thread (zephyr/subsys/input/input.c).
We get a lot of "Event dropped" messages from that same file (input_report).
One issue I had with my code was having printk calls in the callback function mentioned previously, so I removed them.
It was still not working right.
So I created another test sketch, up on my github: zephyr_GIGA_shield_touch_only, where I started from the
touchpaint sketch but removed everything associated with the display (did not start it, ...).
It appears to be properly getting the callbacks from the GT911 input object...
So I am wondering if the display update code runs for long periods and that input thread is being starved?
Wondered if increasing the size of the queue might help, so tried:

```
CONFIG_INPUT_QUEUE_MAX_MSGS=10
```

It did not help.
Tried a few different priorities (2, 4, -1) for this thread:

```
CONFIG_INPUT_THREAD_PRIORITY=-1
```

That did not help either.
Wondering what else to try? Maybe have the GT911 not use its interrupt? In the config for the GIGA under the shield it has:

```
CONFIG_INPUT_GT911_INTERRUPT=y
```

Maybe try =n; I believe the driver then falls back to polling the controller from a timer.
Side note: I was curious about threads in sketches, so I hacked up our test sketch GIGA_Display_first:
I added a thread to the sketch that blinks a pin, and every n
times blinks LED_BUILTIN... The interesting parts:
And the sketch runs, but when it reaches the end of main, the thread never runs again.
However, if I change loop to:
The issue is in cores\arduino\main.cpp:
there is nothing in the for loop calling loop() that yields or delays, so this logical thread never
yields back to the scheduler.
KurtE commented on Jun 1, 2025
@ALL....
I am maybe having some better luck right now with touch.
I pushed my current WIP branch up:
https://github.com/KurtE/ArduinoCore-zephyr/commits/pr130/
It is based on Pull Request #130,
and I edited in the changes from #117,
as the zephyr changes for it were merged and the zephyr fork/branch referred to by #130 has them in it.
I added some comments on 117 as some things were not working, like nothing showed up on the screen.
Found that the backlight was not turned on... more on that PR...
I added a callback and the like for the touch controller into loader/fixups.c
I added an export:
I made sure that loop() had a delay(1) in it, so as to let other threads have a chance to run.
I now have the touch screen sketch running... Up on Arduino-GIGA-stuff github...
On my Windows machine:

On my Ubuntu machine:

But I am finding that the orientation of the touch screen versus the display appears to differ between
my two giga/gigaShield setups.
That is, on my Windows machine, if I click near the bottom-left area (red square),
the coordinates returned are something like 59,440, but on Ubuntu they are more like 16,10.
And in the opposite corner (upper right): Windows 722,60, Ubuntu 459,783.
So I am wondering where the difference is. And is there a way to programmatically know which orientation the
screen was installed in?
mjs513 commented on Jun 2, 2025
@KurtE
After a torturous couple of days, I was able to sync up with all the changes for ble-display-touch again, so am no longer getting the weird bus faults.
Ran your latest set of changes for touch and am now seeing what you previously showed
for coordinates.
0,0 for touch is the upper-left corner, but for the boxes it seems that it is in the lower-left corner?
KurtE commented on Jun 3, 2025
Thanks @mjs513 - as I mentioned, I found the coordinates of touch versus the pixels differ between my two boards...
I need to look at it more to see whether this also happens with the MBED version.
mjs513 commented on Jun 3, 2025
@KurtE
Was able to get it running on my 2nd GIGA and am seeing the same touch-versus-graphics coordinates as on my 1st GIGA. Will look at the graphics code to see the deltas.
Not sure I mentioned this, but I changed the way you defined the shared buffer from:
to
mjs513 commented on Jun 3, 2025
Simple solution: delete the fillScreen and fillRect functions in Arduino_GigaDisplay_GFX and let the Adafruit GFX lib handle them. The current fillScreen and fillRect functions are set up for rotation, as you already know. I tried it, and it looks like with that change the coordinates should align with the touch coordinates. Fingers crossed.... :)
Actually, we can probably just delete the fillRect function.
KurtE commented on Jun 3, 2025
Thanks @mjs513 - made the same changes as you mentioned in the touch paint sketch... Will push it up soon...
@iabdalkader @facchinm @pillo79
Keep wondering about usage patterns, and how or whether to use the two buffers:

```
CONFIG_STM32_LTDC_FB_SMH_ATTRIBUTE=2
```
With the current code, I believe that if I pass a full-size buffer to the display write function, it will be set as the new (pending)
buffer. If, for example, I compute the address of the 2nd buffer and then alternate passing in the first and second one,
each time I do this I will be starting off with a framebuffer containing the contents from two updates before, i.e. the contents of
what was in it the previous time this buffer was used. So either I need to copy the contents of the other buffer into it,
and/or the code needs to regenerate the whole frame.
Note: if you just use the one buffer as mentioned in:
And do something like:
You will get a completely garbled screen...
Note: if I have the code simply alternate between the two, the above code will probably also give you some garbled output
and/or completely missed updates.
We could add a function to wait until the new pending buffer has become the current buffer before continuing,
i.e. waiting until getFrameBuffer() returns the address of the last buffer I passed in...
But again, I am wondering what you all believe the proper usage pattern is, and/or whether there should be some way for a sketch
to dictate this as an option...
iabdalkader commented on Jun 4, 2025
I don't follow, but this is the shared memory attribute, not the number of buffers.
KurtE commented on Jun 8, 2025
Sorry, I was misreading... My eyes saw it as the attribute giving the count of buffers.
KurtE commented on Jun 8, 2025
@iabdalkader @facchinm @pillo79 @mjs513 - and everyone else.
I don't know how much, if any, of this stuff is in the shorter-term goals for the Zephyr Arduino setup, but I know that at
least @mjs513 and I have been playing around with setting up versions of the different GIGA display graphics libraries and the like
that at least have some of the functionality enabled.
These include the libraries we mentioned in issue #92.
As mentioned in that thread, this relies on PR #117.
I now have some Touch support in my fork/branch of the Arduino_GigaDisplayTouch library:
https://github.com/KurtE/Arduino_GigaDisplayTouch/tree/zephyr
Note: I updated the hook I added into the ArduinoCore-zephyr code base in the loader fixups. Before, I was saving the
data in the hook and allowing the sketch to call in to get the data. In this version, the code simply forwards
the callbacks from zephyr to whoever registers for them. I added an initVariant call into the GIGA variant that sets the callback to NULL
at startup. This code is in Pull Request #134.
Note: with this current code I have some support for multiple touches. But in order to enable it, we need to update zephyr:
https://github.com/zephyrproject-rtos/zephyr/blob/main/boards/shields/arduino_giga_display_shield/boards/arduino_giga_r1_m7.conf
and add a line like:

```
CONFIG_INPUT_GT911_MAX_TOUCH_POINTS=3
```

It defaults to 1 and has a range of 1-5...
Note: the zephyr code so far has no support for gestures; AFAICT, the zephyr input code does not support them at all.
To test all of this, I first added a test sketch that saves all of the callbacks into memory, waits for me to input something
on the serial port, and then prints them all out.
That is up at: https://github.com/KurtE/Arduino_GIGA-stuff/tree/main/sketches/zephyr_giga_display_examples/touch_events_capture_print
I also updated my version of the TouchPaint sketch to handle multiple touches, where it uses the next color up in the
color list for the 2nd finger...
Again, this is up at: https://github.com/KurtE/Arduino_GIGA-stuff/tree/main/sketches/zephyr/zephyr_GIGA_shield_touchpaint
Note: I have several questions on the zephyr implementation of the input device code. I posted a question about some of
this at: zephyrproject-rtos/zephyr#91229
Now, back to playing.