
Add amg88xx driver

Open tinnedkarma opened this issue 1 year ago • 8 comments

Summary

New driver for amg88xx sensor with basic capabilities.

Currently supports the following ioctl commands:

* SNIOC_SET_OPERATIONAL_MODE : toggles sleep mode
* SNIOC_SET_FRAMERATE : toggles between 1 and 10 fps
* SNIOC_SET_MOVING_AVG : unique to this sensor, sets the moving average; for details, see the sensor datasheet

For the moment the sensor interrupt functionality is missing; this driver supports getting data from the sensor in polling mode.
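For context, here is a rough sketch of how an application might consume such a frame in polling mode. The frame layout (64 pixels, 2 bytes each, little-endian) and the 12-bit two's-complement, 0.25 °C/LSB pixel encoding follow the AMG88xx datasheet; the function names are hypothetical and not taken from the driver.

```c
/* Sketch only: decoding one AMG88xx frame obtained from the char driver.
 * Assumptions (hypothetical, not taken from the driver code):
 *   - read() returns 128 bytes: 64 pixels, 2 bytes each, little-endian
 *   - each pixel is a 12-bit two's-complement value, 0.25 degC per LSB
 *     (per the AMG88xx datasheet)
 */

#include <assert.h>
#include <math.h>
#include <stddef.h>
#include <stdint.h>

#define AMG88XX_PIXELS 64

/* Convert one raw little-endian pixel word to degrees Celsius. */
float amg88xx_pixel_to_celsius(uint16_t raw)
{
  /* Shift the 12 significant bits into the sign position, then
   * arithmetic-shift back to sign-extend bit 11. */
  int16_t value = (int16_t)(uint16_t)(raw << 4) >> 4;
  return value * 0.25f;
}

/* Decode a full 128-byte frame into a temperature grid. */
void amg88xx_decode_frame(const uint8_t buf[AMG88XX_PIXELS * 2],
                          float temps[AMG88XX_PIXELS])
{
  for (size_t i = 0; i < AMG88XX_PIXELS; i++)
    {
      uint16_t raw = (uint16_t)buf[2 * i] | ((uint16_t)buf[2 * i + 1] << 8);
      temps[i] = amg88xx_pixel_to_celsius(raw);
    }
}
```

An application would open() the device node, optionally issue the ioctls above to set the frame rate, then loop on read() and decode each 128-byte buffer as shown.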

Impact

Small changes for nucleo-f429zi board config to enable the sensor driver.

Testing

Locally, with an example app which is part of nuttx-apps and logic analyzer.

tinnedkarma avatar Aug 02 '24 18:08 tinnedkarma

@LuchianMihai I don't know if you already thought about it, but this device can be exported as a "camera", this way it could work easily with existing applications.

Please take a look at drivers/video/v4l2_cap.c and drivers/video/goldfish_camera.c to see how a synthetic/simulated camera could be implemented.

acassis avatar Aug 02 '24 19:08 acassis

> @LuchianMihai I don't know if you already thought about it, but this device can be exported as a "camera", this way it could work easily with existing applications.
>
> Please take a look at drivers/video/v4l2_cap.c and drivers/video/goldfish_camera.c to see how a synthetic/simulated camera could be implemented.

@acassis I had no idea; I will check your suggestion in my free time. For my use case I had to bridge data from I2C to CAN bus, so I needed raw data; that is why I chose a simple driver approach.

tinnedkarma avatar Aug 02 '24 19:08 tinnedkarma

Hi @acassis, I've taken a look at the video drivers. I cannot say I fully understand how a camera should work; not the driver implementation, but how to use it from a user-space perspective. I do expect differences from a char driver, though I cannot say what the differences should be. Could you help me out with an explanation or a guide, or point me in the direction I should look? I saw the camera example from nuttx-apps, but I do not understand whether it is a general example (over any driver that implements camera ops) or a specific implementation over some camera drivers.

tinnedkarma avatar Aug 03 '24 20:08 tinnedkarma

Hi @LuchianMihai, the example at apps/examples/camera/ is generic; it will display the camera image on an LCD using the NX graphics system. Creating your driver as a char driver will make it simple for raw usage, as you are planning to do. Converting it to a camera interface will make it more generic, but it will require more effort to implement.

If you want to keep it as a char driver, that is fine too. Later we can convert it to a camera interface, or just add an option to export it as a camera interface.

acassis avatar Aug 03 '24 20:08 acassis

> Hi @acassis, I've taken a look at the video drivers. I cannot say I fully understand how a camera should work; not the driver implementation, but how to use it from a user-space perspective. I do expect differences from a char driver, though I cannot say what the differences should be.

Actually, the video driver framework is still a char device; the difference is that the ioctl interface is standardized, which means:

  1. All drivers implement the same ioctls
  2. We can write generic apps that control different camera sensors/ISPs without changes
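To make the "same ioctls for every driver" point concrete, a generic V4L2 capture app always performs the same request sequence: VIDIOC_QUERYCAP, VIDIOC_S_FMT, VIDIOC_REQBUFS/VIDIOC_QBUF, then VIDIOC_STREAMON and a VIDIOC_DQBUF loop. The sketch below computes only the pieces that do not need a device; the fourcc helper mirrors the packing rule of the standard v4l2_fourcc() macro, and choosing GREY for an 8x8 thermal array is just one plausible assumption.

```c
/* Sketch of the device-independent parts of a V4L2-style capture setup.
 * The same ioctl sequence works against any driver implementing it:
 *   1. VIDIOC_QUERYCAP            - discover capabilities
 *   2. VIDIOC_S_FMT               - pick resolution and pixel format
 *   3. VIDIOC_REQBUFS/VIDIOC_QBUF - set up buffers
 *   4. VIDIOC_STREAMON + VIDIOC_DQBUF loop - stream frames
 */

#include <assert.h>
#include <stdint.h>

/* Same packing rule as the standard v4l2_fourcc() macro. */
uint32_t fourcc(char a, char b, char c, char d)
{
  return (uint32_t)a | ((uint32_t)b << 8) |
         ((uint32_t)c << 16) | ((uint32_t)d << 24);
}

/* Bytes per frame for a single-plane 8-bit greyscale format
 * (V4L2_PIX_FMT_GREY: one byte per pixel). */
uint32_t grey_frame_size(uint32_t width, uint32_t height)
{
  return width * height;
}
```

For the 8x8 AMG88xx grid this would mean an 8x8 GREY format and 64-byte frames, assuming the app converts temperatures to grey levels itself.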

> Could you help me out with an explanation or a guide, or point me in the direction I should look?

The NuttX camera/video driver follows the Linux V4L2 driver model from the userspace perspective (e.g. the ffmpeg v4l2 driver could work with a NuttX camera/codec driver without any changes). So you can learn how NuttX camera/video drivers work from V4L2-related documentation.

> I saw the camera example from nuttx-apps, but I do not understand whether it is a general example (over any driver that implements camera ops) or a specific implementation over some camera drivers.

https://github.com/apache/nuttx-apps/tree/master/system/nxcamera is another example.

xiaoxiang781216 avatar Aug 04 '24 03:08 xiaoxiang781216

@acassis @xiaoxiang781216 I need your help again. My understanding is that:

  • fb.c holds the framebuffer character driver, which can be used to interact with displays or with an image.
  • I would need to make my sensor "compatible" with v4l2 (interaction with my sensor should be done through v4l2).

Now the things that I do not understand:

  • What is goldfish_*.c? My guess is that it is a driver used with qemu (a pipe or bridge or something).
  • I see that the isx012 and isx019 camera drivers both use imgdata.h and imgsensor.h.

My main issue is that I'm not sure about the flow of data from sensor -> driver code -> v4l2 -> userspace, and what the link between them is. I've read everything I could about these topics, but there is not much documentation on the NuttX side. Could any of you help me with a short description of how the data should get from sensor to userspace?

tinnedkarma avatar Aug 06 '24 14:08 tinnedkarma

> @acassis @xiaoxiang781216 I need your help again. My understanding is that:
>
> * **fb.c** holds the framebuffer character driver, which can be used to interact with displays or with an image.
> * I would need to make my sensor "compatible" with v4l2 (interaction with my sensor should be done through v4l2).
>
> Now the things that I do not understand:
>
> * What is goldfish_*.c? My guess is that it is a driver used with qemu (a pipe or bridge or something).
> * I see that the isx012 and isx019 camera drivers both use imgdata.h and imgsensor.h.
>
> My main issue is that I'm not sure about the flow of data from sensor -> driver code -> v4l2 -> userspace, and what the link between them is. Could any of you help me with a short description of how the data should get from sensor to userspace?

Hi @LuchianMihai, I'm not an expert on that subject, but I think you just need to implement an imgsensor like those existing at drivers/video/, and v4l2_cap.c will use it.

You are completely right: although NuttX uses V4L2 and the API documentation from Linux can help, we still need some documentation explaining how things connect on the NuttX side. This is something I try to enforce during PR reviews: we need Documentation/, otherwise people will spend a lot of time trying to figure things out.

Maybe @xiaoxiang781216 or someone from Xiaomi or Sony could help here, because they contributed the code.

acassis avatar Aug 06 '24 14:08 acassis

> @acassis @xiaoxiang781216 I need your help again. My understanding is that:
>
> • fb.c holds the framebuffer character driver, which can be used to interact with displays or with an image.

Yes, fb is the output side: you can write an fb driver and it will work with the high-level graphics stacks without changes, e.g.:

  • https://github.com/apache/nuttx-apps/tree/master/graphics/nxwm
  • https://github.com/apache/nuttx-apps/tree/master/graphics/lvgl
  • https://github.com/apache/nuttx-apps/tree/master/graphics/twm4nx

v4l2 is the input side (e.g. a camera) or both sides (e.g. a hardware encoder/decoder). So fb and v4l2 are different driver frameworks, but both are compatible with their Linux counterparts from the userspace perspective.
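As an aside on the output side: to show the decoded 8x8 temperatures through an fb driver, the app has to map each value to the framebuffer's pixel format. A minimal sketch, assuming an RGB565 framebuffer and a simple linear grey mapping (both assumptions; the real format would be reported by the fb driver's ioctls):

```c
/* Sketch: mapping a temperature to an RGB565 grey level for display
 * via a framebuffer. The temperature window (t_min..t_max) and the
 * RGB565 target format are assumptions, not read from any driver.
 */

#include <assert.h>
#include <stdint.h>

/* Pack an 8-bit grey level into RGB565 (5-6-5 bit channels). */
uint16_t grey_to_rgb565(uint8_t g)
{
  return (uint16_t)(((g >> 3) << 11) | ((g >> 2) << 5) | (g >> 3));
}

/* Linearly map a temperature into 0..255, clamping at the window edges. */
uint8_t temp_to_grey(float t, float t_min, float t_max)
{
  if (t <= t_min)
    {
      return 0;
    }

  if (t >= t_max)
    {
      return 255;
    }

  return (uint8_t)((t - t_min) * 255.0f / (t_max - t_min));
}
```

The app would then write the 64 resulting RGB565 words into the framebuffer memory at the positions corresponding to the 8x8 grid (typically scaled up for visibility).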

> • I would need to make my sensor "compatible" with v4l2 (interaction with my sensor should be done through v4l2). Now the things that I do not understand:
> • What is goldfish_*.c? My guess is that it is a driver used with qemu (a pipe or bridge or something).

The goldfish camera driver (https://github.com/apache/nuttx/blob/master/drivers/video/goldfish_camera.c) is a pseudo device from the goldfish (qemu) emulator (used by Android), which acquires the image from the host camera through a pipe.

> • I see that the isx012 and isx019 camera drivers both use imgdata.h and imgsensor.h.

These two are real image sensors with I2C for control and CSI (I guess) for data.

> My main issue is that I'm not sure about the flow of data from sensor -> driver code -> v4l2 -> userspace, and what the link between them is. Could any of you help me with a short description of how the data should get from sensor to userspace?

The imgsensor driver (provided by the sensor vendor) is for the real sensor with I2C and CSI interfaces; the imgdata driver (provided by the SoC vendor) is for the SoC side, which has a CSI master to drive the sensor. In this case:

  1. Most settings (e.g. setting exposure) are done by the imgsensor driver through the I2C interface
  2. Data transfer is handled by the imgdata driver, which enables DMA to fetch the image through CSI
  3. Some controls (e.g. start/stop) are handled by both the imgsensor and imgdata drivers

Together, imgsensor and imgdata form a complete video device (/dev/videoX).

If your sensor uses one bus (e.g. SPI) for both control and data, you would normally implement imgsensor and imgdata in one file, just like the goldfish camera driver, so all control and data are handled by one piece of code.

xiaoxiang781216 avatar Aug 06 '24 17:08 xiaoxiang781216

Hi @acassis, @xiaoxiang781216, as it stands right now I cannot continue development on this sensor: I had to deliver the boards I was working on. I've designed a replacement daughter board, but it is still on its way. I suggest merging this driver as is, and I'll restart work as soon as I have access to the board again. What do you say?

tinnedkarma avatar Sep 02 '24 05:09 tinnedkarma

Hello @acassis, @xiaoxiang781216, So should I continue implementing this sensor as a video driver? Or can we merge it as is?

tinnedkarma avatar Sep 06 '24 06:09 tinnedkarma

> Hello @acassis, @xiaoxiang781216, so should I continue implementing this sensor as a video driver? Or can we merge it as is?

Let's merge this version first; you can refine the implementation later.

xiaoxiang781216 avatar Sep 06 '24 07:09 xiaoxiang781216