drivers: video: sw_generator: Hue Saturation Brightness controls #79458

Conversation

@josuah josuah (Collaborator) commented Oct 5, 2024

Dependencies:

This is a proof of concept for a modified test pattern generator.

  • Uses fixed-point arithmetic to convert colors from HSV to RGB and from HSV to YUV.
  • Adds controls for hue, saturation and brightness.
  • Adds a YUV output format.
  • Allows the sw_generator to be instantiated from the devicetree.

To do:

  • adjust the samples that were using the legacy device_get_binding("SW_GENERATOR") call to a devicetree-based approach (see the sketch below).
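
For that last point, a minimal sketch of what the devicetree-based lookup could look like in a sample, assuming the generator node gets a nodelabel such as sw_generator (the label name is an assumption for illustration, not something defined by this PR):

	#include <zephyr/device.h>
	#include <zephyr/devicetree.h>

	/* Hypothetical: "sw_generator" is an assumed nodelabel. */
	static const struct device *get_video_device(void)
	{
		/* Legacy lookup, as currently used by the samples:
		 * const struct device *dev = device_get_binding("SW_GENERATOR");
		 */

		/* Devicetree-based lookup: */
		const struct device *dev = DEVICE_DT_GET(DT_NODELABEL(sw_generator));

		return device_is_ready(dev) ? dev : NULL;
	}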

By switching to SMPTE, this also allows building end-to-end tests that have no hardware dependency, leveraging this:

josuah added 3 commits October 5, 2024 23:25
This makes it possible to perform a computation in a larger range to gain precision:

	int32_t tmp_in = SCALE(in, -100, 100, INT32_MIN, INT32_MAX);

Or to convert between two unit systems:

	uint32_t fahrenheit = SCALE(celsius, 0, 100, 32, 212);

Signed-off-by: Josuah Demangeon <[email protected]>
Saturation logic is supported, but not rounding.

This helps with entering literal values:

	q15_t first = Q15f(0.23);

This makes it possible to store Q values in constant expressions:

	#define VAL(x) SUBq7(Q7f(1.0), Q7f((x) / M_PI))
	const q7_t table[] = { VAL(1.32), VAL(1.42), VAL(0.8) };

This makes it possible to use fixed-point arithmetic when a zdsp back-end is
not available, for example lightweight computations in drivers submitted upstream.

Signed-off-by: Josuah Demangeon <[email protected]>
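
As an illustration of the literal-conversion idea above (not the actual macros from this commit), a literal-to-Q15 conversion that stays a constant expression could look roughly like this; the commit's own macros additionally handle saturation, which this simplified sketch does not:

	#include <stdint.h>

	typedef int16_t q15_t; /* stand-in for the zdsp q15_t: 1 sign bit, 15 fractional bits */

	/* Illustrative only: converts a floating-point literal in [-1.0, 1.0) to Q15,
	 * without any saturation or rounding. */
	#define EXAMPLE_Q15F(f) ((q15_t)((f) * (1 << 15)))

	/* Usable in constant initializers: */
	static const q15_t first = EXAMPLE_Q15F(0.23);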
This makes it possible to perform a computation in a larger range to gain precision:

	int32_t tmp_in = MAP_VALUE(in, -100, 100, INT32_MIN, INT32_MAX);

Or to convert between two unit systems:

	uint32_t fahrenheit = MAP_VALUE(celsius, 0, 100, 32, 212);

Signed-off-by: Josuah Demangeon <[email protected]>
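
As a hedged sketch (not necessarily the exact implementation in this commit), such a linear re-mapping macro could use a 64-bit intermediate so that the INT32_MIN..INT32_MAX example above does not overflow:

	#include <stdint.h>

	/* Illustrative only: linear re-mapping of x from [in_min, in_max] to
	 * [out_min, out_max], using a 64-bit intermediate to avoid overflow. */
	#define EXAMPLE_MAP_VALUE(x, in_min, in_max, out_min, out_max)                 \
		((((int64_t)(x) - (in_min)) * ((int64_t)(out_max) - (out_min))) /      \
			 ((int64_t)(in_max) - (in_min)) +                              \
		 (out_min))

	/* e.g. EXAMPLE_MAP_VALUE(100, 0, 100, 32, 212) == 212 */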
@josuah josuah force-pushed the pr-video-sw-generator-hsv branch from b2a6f16 to 5dfabfb on October 6, 2024 at 16:55
@josuah josuah (Collaborator, Author) commented Oct 6, 2024

In case using the fixed-point macros is considered overkill, it is also possible to do the fixed-point math "by hand" with uint8_t values, as done here:

This switches from the 8-bar pattern to the classic SMPTE color bars pattern
and expresses the colors in HSV format, allowing the hue, saturation and value
(brightness) of the colors to be controlled. The runtime overhead is the same,
as the color computation is only performed once. This leverages the newly
introduced fixed-point arithmetic library.

This also introduces a YUV output format, and enables the test pattern
generator to be used from the devicetree.

Signed-off-by: Josuah Demangeon <[email protected]>
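
For reference, this is the general shape of an integer HSV-to-RGB conversion (a simplified 8-bit sketch; the driver in this PR uses the q15_t helpers instead, so the function name and value ranges below are assumptions, not the actual implementation):

	#include <stdint.h>

	/* Illustrative only: 8-bit HSV to RGB, with h, s, v in 0..255. */
	static void hsv_to_rgb_u8(uint8_t h, uint8_t s, uint8_t v,
				  uint8_t *r, uint8_t *g, uint8_t *b)
	{
		int region = (h * 6) / 256;	/* which of the 6 hue sectors */
		int rem = (h * 6) % 256;	/* position within the sector */
		uint8_t p = (v * (255 - s)) / 255;
		uint8_t q = (v * (255 - (s * rem) / 255)) / 255;
		uint8_t t = (v * (255 - (s * (255 - rem)) / 255)) / 255;

		switch (region) {
		case 0: *r = v; *g = t; *b = p; break;
		case 1: *r = q; *g = v; *b = p; break;
		case 2: *r = p; *g = v; *b = t; break;
		case 3: *r = p; *g = q; *b = v; break;
		case 4: *r = t; *g = p; *b = v; break;
		default: *r = v; *g = p; *b = q; break;
		}
	}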
@josuah josuah force-pushed the pr-video-sw-generator-hsv branch from 5dfabfb to 638fa35 on October 6, 2024 at 19:44
# Copyright 2024 tinyVision.ai Inc.
# SPDX-License-Identifier: Apache-2.0

compatible: "zephyr,sw-generator"
A Member commented:

why does this need to be in DT?

@josuah josuah (Collaborator, Author) commented Oct 7, 2024

I personally needed it for exposing the hue/saturation/brightness controls over USB Video.
https://github.com/zephyrproject-rtos/zephyr/pull/76798/files#diff-47c7b3f77fda7a7bc55ec72fb80259f4e7b7111fd1f20e0653f28ea90d663b76R13

This makes it possible to refer to this "virtual hardware" as a target for video controls, using a phandle, and avoids a special case in this driver or in the various samples.
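
A minimal sketch of that kind of phandle reference, assuming a hypothetical "source" property on a "uvc" node (both names are illustrative, not part of this PR):

	#include <zephyr/device.h>
	#include <zephyr/devicetree.h>

	/* Illustrative only: resolve the device referenced by a phandle property,
	 * e.g. a "source" property on the uvc node pointing at the sw generator. */
	static const struct device *uvc_source =
		DEVICE_DT_GET(DT_PHANDLE(DT_NODELABEL(uvc), source));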

It is a bit of a stretch of the purpose of the devicetree, which describes actual hardware.
Having it in the devicetree was a way to avoid making a special case of it in the samples
and in every driver to be tested with a software-based back-end.

I'd be glad to look for an alternative method if that is more suitable!

@ngphibang ngphibang (Contributor) commented Oct 7, 2024

I have the same question. It is still unclear why we need to have this in DT. The video capture sample currently falls back by default to the video sw generator if we don't specify any shield.

@josuah josuah (Collaborator, Author) commented:

If the use case ever shows up, I can make a PR to introduce it, but it looks like there is no use for it currently. I will close this PR and open others for the points suggested.

Very grateful for the feedback from both of you!

@ngphibang ngphibang (Contributor) left a comment

The video sw generator is a virtual driver for testing purposes. While the implementation idea of using HSV to change the hue, saturation and brightness is good, I am not sure what the benefit of this addition is, given that the color bar test pattern is somewhat standardized (SMPTE) with fixed colors and brightness.

While this can be applied to real-world images (where the image content remains unchanged), for a color bar pattern, changing the hue, saturation (and brightness) actually changes each bar's color (the main content of the pattern), so you will no longer have an 8-color bar test pattern (white, cyan, ...).


.height_step = 1,
},
{0}};
#define VIDEO_SW_GENERATOR_FORMAT_CAP(fourcc) \
A Contributor commented:

This code optimization should go into a separate commit.

@@ -116,28 +128,145 @@ static int video_sw_generator_stream_stop(const struct device *dev)
return 0;
}

/* Black, Blue, Red, Purple, Green, Aqua, Yellow, White */
uint16_t rgb565_colorbar_value[] = {0x0000, 0x001F, 0xF800, 0xF81F, 0x07E0, 0x07FF, 0xFFE0, 0xFFFF};
static void hsv_to_rgb(q15_t h, q15_t s, q15_t v, q15_t *r, q15_t *g, q15_t *b)
A Contributor commented:

I think the color-related code (macros, functions, etc.) is generic and should go into a separate .h file to enhance readability and reusability.

@josuah josuah (Collaborator, Author) replied:

Good point, I will do this!

@@ -259,10 +388,30 @@ static inline int video_sw_generator_set_ctrl(const struct device *dev, unsigned
void *value)
{
struct video_sw_generator_data *data = dev->data;
uint32_t u32 = (uint32_t)value;
A Contributor commented:

u32 does not seem like a good name; it is easily confused with the type.

@josuah josuah (Collaborator, Author) replied:

You are right, I realize this is not a good name for a variable.

@josuah josuah (Collaborator, Author) commented Oct 8, 2024

The video sw generator is a virtual driver for testing purposes. While the implementation idea of using HSV to change the hue, saturation and brightness is good, I am not sure what the benefit of this addition is, given that the color bar test pattern is somewhat standardized (SMPTE) with fixed colors and brightness.

The only use I could think of is to test compensating for a color shift, or to test the limits of a test.
There is also the 75% SMPTE variant, but that does not require hue or saturation control, and can be done in RGB.
So as you said, no real use for it!

Maybe the YUYV output format can still be useful, so I will make another PR that introduces just that.
Maybe also go through utilities for pixel packing/unpacking of color values...

@josuah josuah closed this Oct 8, 2024
@ngphibang ngphibang (Contributor) commented Oct 8, 2024

  • Yes, YUYV format and other optimizations are useful.

  • The SMPTE color bar pattern could also be added, but instead of replacing the existing classic 8-color pattern (which is useful, as most sensors generate this type I think), you could keep both by using VIDEO_CID_TEST_PATTERN (see the sketch after this list). Also, the SMPTE color pattern seems to contain some additional bars at the bottom (including black), not just 7 pure color bars (?).

  • Then, the video sw generator (it should be renamed, though) could also go into the devicetree if you decouple it into sub-devices to emulate a complex video pipeline, maybe a sensor and an ISP-like device that just de-bayers the images coming from the sensor (this way, Bayer formats could also be added in the sensor part).
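
A rough sketch of what keeping both patterns behind the control mentioned above could look like from the application side; the pattern indices are purely illustrative, and the availability of VIDEO_CID_TEST_PATTERN as a control ID is assumed from the comment rather than confirmed here:

	#include <stdint.h>
	#include <zephyr/drivers/video.h>
	#include <zephyr/drivers/video-controls.h>

	/* Illustrative only: hypothetical pattern indices for the sw generator. */
	#define SW_GENERATOR_PATTERN_8_BARS 0
	#define SW_GENERATOR_PATTERN_SMPTE  1

	static int select_smpte_pattern(const struct device *dev)
	{
		/* The video API currently passes control values as a void pointer. */
		return video_set_ctrl(dev, VIDEO_CID_TEST_PATTERN,
				      (void *)(intptr_t)SW_GENERATOR_PATTERN_SMPTE);
	}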

@josuah josuah (Collaborator, Author) commented Oct 14, 2024

I think I will try to implement this. This would enable some more robust Twister tests for video interfaces.

I wonder, though, what kind of API to have between the video sensor and the ISP, as this typically happens in hardware with no access from software, but here it needs to happen in software. Maybe some video_virtual.h with a few utilities to do the I/O? Otherwise, the image sensor needs an enqueue() and dequeue(), which it does not have for real hardware.

@ngphibang ngphibang (Contributor) commented:

I think it will be the same as for real hardware. Currently, the video_sw_generator would act as a sensor that generates Bayer images, and they would go to the ISP-like device to be converted to RGB. enqueue()/dequeue() remain the same and are only called by the application as of now, no? For more details, I think you can refer to the vivid driver in Linux.

@josuah josuah (Collaborator, Author) commented Oct 17, 2024

Thank you once again for the reference and suggestions!

/* @buf_queue:          passes buffer vb to the driver; driver may start
 *                      hardware operation on this buffer; driver should give
 *                      the buffer back by calling vb2_buffer_done() function;
 *                      it is always called after calling VIDIOC_STREAMON()
 *                      ioctl; might be called before @start_streaming callback
 *                      if user pre-queued buffers before calling
 *                      VIDIOC_STREAMON().
 */
        .buf_queue              = vid_out_buf_queue,

It looks like vivid exposes an enqueue()/dequeue()-like API.

For real hardware, a change on the imager will lead to different data at the MIPI RX driver without any enqueue()/dequeue() in between. I wanted to try simulating this mechanism this way on a WIP branch:

video_emul_imager.c

struct emul_imager_data {
        /* First field is a framebuffer for I/O emulation purpose,
         * set with test pattern data when the format is changed
         */
        uint8_t framebuffer[CONFIG_EMUL_IMAGER_FRAMEBUFFER_SIZE];
        /* Other fields are shared with real hardware drivers */
        struct emul_imager_mode *mode;
        struct video_format fmt;
};

video_mipi_rx_emul.c

struct emul_mipi_rx_data {
        struct video_format fmt;
        struct k_fifo fifo_in;
        struct k_fifo fifo_out;
};

void emul_mipi_rx_worker(struct k_work *work)
{
        ...
        /* Simulate the MIPI transfer happening at hardware level, outside of Zephyr APIs */
        memcpy(vbuf->buffer, sensor_dev->data, fmt->pitch * fmt->height);
}

@josuah josuah (Collaborator, Author) commented Oct 17, 2024

There was also the suggestion of making a skeleton image sensor driver that builds... Why not make one that actually runs? That is the only way to know there is no bug in the image sensor driver.

Labels: area: Base OS (Base OS Library, lib/os), area: Devicetree, area: DSP, area: Video (Video subsystem)