lv_image_set_scale() Does Not Work -- Image Becomes Scattered And Unrecognizable
LVGL version
v9.2.0
What happened?
When an image created from the online image converter (LVGL v9, RGB565) is scaled using lv_image_set_scale(), the rendered image is unrecognizable. Only a value of 256 works, leaving the image unscaled.
Unscaled render:
Scaled renders:
How to reproduce?
img_data.c
#ifndef LV_ATTRIBUTE_MEM_ALIGN
#define LV_ATTRIBUTE_MEM_ALIGN
#endif
#ifndef LV_ATTRIBUTE_IMAGE_MYIMG
#define LV_ATTRIBUTE_IMAGE_MYIMG
#endif
const LV_ATTRIBUTE_MEM_ALIGN LV_ATTRIBUTE_LARGE_CONST LV_ATTRIBUTE_IMAGE_MYIMG uint8_t myimg_map[] =
{
....data....
};
const lv_image_dsc_t myimg = {
.header.cf = LV_COLOR_FORMAT_RGB565,
.header.magic = LV_IMAGE_HEADER_MAGIC,
.header.w = 500,
.header.h = 500,
.data_size = 250000 * 2,
.data = myimg_map,
};
main.cpp
extern "C" {
extern const lv_image_dsc_t myimg;
}
void myfunc() {
lv_obj_t* screen = lv_screen_active();
lv_obj_t* img = lv_image_create(screen);
lv_image_set_src(img, &myimg);
lv_image_set_scale(img, 200); // offending function
}
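For clarity (my understanding of the API, not verified against the docs): the scale value is a fixed-point factor where 256 corresponds to 1:1, so for example:
lv_image_set_scale(img, 256); // 1:1 (LV_SCALE_NONE) -- renders as expected
lv_image_set_scale(img, 200); // slightly shrunk -- produces the scattered output
lv_image_set_scale(img, 512); // 2x -- any value other than 256 gives the same broken result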
If you want me to include the image data, just let me know.
Please share the source data as well, for 1:1 reproduction. Thank you in advance.
image_source_data.zip (attached). The content is too long to post within a comment or in my original post, so I've zipped the C file here. Thank you.
I cannot reproduce your issue, as everything works perfectly on my end. Have you modified the makefile or the sources to compile the image file itself?
No, the source code remains untouched. I am running this using the Windows simulator. Maybe it has something to do with the configurations I set?
#define LV_COLOR_DEPTH 16
#define LV_MEM_SIZE (32U * 1024U * 1024U)
#define LV_USE_DRAW_SW 1
#define LV_DRAW_SW_SUPPORT_RGB565 1
#define LV_DRAW_SW_SUPPORT_RGB565A8 0
#define LV_DRAW_SW_SUPPORT_RGB888 0
#define LV_DRAW_SW_SUPPORT_XRGB8888 0
#define LV_DRAW_SW_SUPPORT_ARGB8888 1
#define LV_DRAW_SW_SUPPORT_L8 0
#define LV_DRAW_SW_SUPPORT_AL88 0
#define LV_DRAW_SW_SUPPORT_A8 0
#define LV_DRAW_SW_SUPPORT_I1 0
#define LV_USE_OBJ_ID_BUILTIN 0
You could disable the ARGB8888 part, as the color depth is 16 anyway. I doubt you need 32 MB of memory for rendering, though the image is quite large for sure. Have you compiled the .c image itself as well?
@kissa96 Yes, the .c image is compiled. When no scaling is applied (or the scale is 256), the image renders as expected. I will try with a smaller image and ARGB8888 disabled and report back.
Could you try the Eclipse or VSCode simulators? Recently we have heard about some rendering issues in the Windows simulator which are not tracked/resolved yet.
It is instead easier for me to compile for our device (no simulation), as I do not have a VSCode environment set up. Even so, all images I have tried result in these artifacts.
Have you checked the memory contents? Does it at least partially represent the image you are trying to upscale?
The memory appears unrelated to the image. Rather than inspecting the memory directly, I scaled a pure-green image. Here's what animating the scale looks like; you can see regular patterns appear. My best finger-in-the-air guess is that they are artifacts of the decoding algorithm. However, I cannot figure out why the image data is lost/unused/corrupted.
https://github.com/user-attachments/assets/a4688798-f025-4725-b16b-ed4a00f2caec
Hi, can you provide the code for this animation? Maybe I'll be able to reproduce the issue, as the image itself was working on my end last time.
What if you just create a simple button + label and press it? Is it rendered correctly?
There is nothing special/surprising about this code (I believe) -- it follows the examples very closely. Unfortunately my project is not trivial, so I did my best to extract the relevant code. Let me know if I missed anything. If your testing environment does not use C++, just remove the extern "C" wrapper around the extern lv_image_dsc_t declaration.
extern "C" {
extern const lv_image_dsc_t img_green; // see zip file attached
}
lv_obj_t* m_image;
static void set_scale_anim_cb(void* img, int32_t v)
{
lv_image_set_scale((lv_obj_t*)img, v);
}
lv_obj_t* make_image(lv_obj_t* parent, const void* data)
{
lv_obj_t* img = lv_image_create(parent);
lv_image_set_src(img, data);
return img;
}
lv_obj_t* make_image_green(lv_obj_t* parent)
{
return make_image(parent, &img_green);
}
// this is run to create the image object
void func(lv_obj_t* parent)
{
m_image = make_image_green(parent);
lv_obj_center(m_image);
}
// this is run on a button click
void animate_image_scale(void)
{
lv_anim_t a;
lv_anim_init (&a);
lv_anim_set_var (&a, m_image);
lv_anim_set_exec_cb (&a, set_scale_anim_cb);
lv_anim_set_values (&a, 256, 256*4);
lv_anim_set_duration (&a, 3000);
lv_anim_set_playback_duration(&a, 3000);
lv_anim_set_repeat_count (&a, 1);
lv_anim_start (&a);
}
@kisvegabor My video does not capture it, but I am using a button with a label to trigger this animation. My project is a page-based application that uses fonts, styles, fragments, layouts, etc., and image scaling is the only graphical element that is not working for me.
Maybe this can shed some light... This is logged repeatedly when scaling the image during the animation:
image_decoder_get_info: Image decoder didn't set stride. Calculate it from width. lv_image_decoder.c:349
Apologies for not finding this sooner, I did not realize I had a different logging level set.
Edit: Perhaps not the issue. The stride seems to be calculated correctly (120 bytes for an RGB565 image with a width of 60, i.e. 60 px × 2 bytes per pixel).
UPDATE
I managed to find a solution.
With a color depth of LV_COLOR_DEPTH 16 for an RGB565 image (as a C file, at least), both LV_DRAW_SW_SUPPORT_RGB565 and LV_DRAW_SW_SUPPORT_RGB565A8 must be enabled. Only enabling the former is not sufficient and will result in the images/video seen above.
With a color depth of LV_COLOR_DEPTH 24 for an RGB888 image (as a C file, at least), both LV_DRAW_SW_SUPPORT_RGB888 and LV_DRAW_SW_SUPPORT_ARGB8888 must be enabled. Again, only enabling the former is not sufficient and will result in the images/video seen above.
Generalizing, it appears that image scaling fails to render if the alpha version of the enabled LV_DRAW_SW_SUPPORT_* is not also enabled. I don't know if this applies in all cases, but it does in the two mentioned above.
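Concretely, the combination that renders scaled RGB565 images correctly for me is (only the relevant lv_conf.h lines shown, the rest as posted earlier):
#define LV_COLOR_DEPTH 16
#define LV_DRAW_SW_SUPPORT_RGB565 1
#define LV_DRAW_SW_SUPPORT_RGB565A8 1 /* must also be enabled, otherwise scaled renders are corrupted */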
Is this a known prerequisite that I missed in the documentation? If not, this is definitely a bug. And if it is a bug, it seems there are even more bugs related to the LV_DRAW_SW_SUPPORT_* family (see here).
Ah, it's great that you have found the problem. It's not a bug; it works like this by design. I wonder if adding an LV_LOG_WARN could be a good enough solution for now.
If it is intentionally designed this way, there should absolutely be warnings and/or assertions in place. I think adding an explanation to the docs for this design choice would be good. In my case, it would have prevented me from going down this rabbit hole.
What is the reason for designing it this way? Is there an opacity-related requirement in the drawing software when scaling images?
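In the meantime, a guard along these lines (a hypothetical app-side check, not something LVGL provides) would have caught my misconfiguration at build time:
/* Hypothetical guard: transformed RGB565 sources also need the A8 variant enabled */
#if LV_DRAW_SW_SUPPORT_RGB565 && !LV_DRAW_SW_SUPPORT_RGB565A8
#warning "Scaling/rotating RGB565 images requires LV_DRAW_SW_SUPPORT_RGB565A8"
#endif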
I've just opened https://github.com/lvgl/lvgl/pull/7315 to add a warning.
What is the reason for designing it this way? Is there an opacity-related requirement in the drawing software when scaling images?
Imagine that a simple RGB565 image is rotated. The resulting image will have transparent areas along the diagonal edges. However, you are right that in the case of scaling only, the alpha version of the color format is not strictly required. Unfortunately, there is no pure RGB565 transformation implemented (only RGB565A8). It can be added later, but I won't have time for it in the upcoming weeks. 🙁
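For reference, and as far as I know: an RGB565A8 source is the plain RGB565 pixel plane followed by a separate 8-bit alpha plane, so a descriptor for the 500x500 image above would look roughly like this (myimg_a8_map is a hypothetical buffer containing both planes):
const lv_image_dsc_t myimg_a8 = {
.header.cf = LV_COLOR_FORMAT_RGB565A8,
.header.magic = LV_IMAGE_HEADER_MAGIC,
.header.w = 500,
.header.h = 500,
.data_size = 500 * 500 * 2 + 500 * 500, /* RGB565 plane + A8 plane */
.data = myimg_a8_map,
};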