HoloLensWithOpenCVForUnityExample

Can you only get the camera in Gray Mode?

Open chismer opened this issue 3 years ago • 4 comments

How can I make it detect colors? I'd like to adapt the "HoloLensComicFilterExample" example so that it works like "MultiObjectTrackingBasedOnColorExample".

An example of how to detect colors would be great. Thanks!

chismer avatar Jul 03 '21 19:07 chismer

Hello,

To get a color image from the camera, change the OutputColorFormat property of the HololensCameraStreamToMatHelper component to a value other than GRAY.

[Screenshot: Hololens_GRAYtoRGBA]

Then you need to change the code in the Mat processing part of "HoloLensComicFilterExample.cs". (The Mat type changes from CV_8UC1 to CV_8UC3 or CV_8UC4.)
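For reference, the same change can also be made from code instead of the Inspector. This is a minimal sketch based on the commented-out line in the script posted later in this thread (`webCamTextureToMatHelper.outputColorFormat = WebCamTextureToMatHelper.ColorFormat.RGBA;`); exact property and enum names may differ between plugin versions, so verify against your installed version:

```csharp
using UnityEngine;
using OpenCVForUnity.UnityUtils.Helper;
using HoloLensWithOpenCVForUnity.UnityUtils.Helper;

public class ColorFormatSetup : MonoBehaviour
{
    void Start()
    {
        var helper = GetComponent<HololensCameraStreamToMatHelper>();

        // Request a 4-channel RGBA image instead of single-channel GRAY.
        // Must be set before Initialize() is called.
        helper.outputColorFormat = WebCamTextureToMatHelper.ColorFormat.RGBA;
        helper.Initialize();
    }
}
```

After this change, any downstream processing has to treat the acquired Mat as 4-channel (CV_8UC4) rather than 1-channel (CV_8UC1), e.g. by converting with `Imgproc.cvtColor(rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB)` before HSV thresholding.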

Best regards, Enox Software


EnoxSoftware avatar Jul 04 '21 11:07 EnoxSoftware

Thanks! Yes, that's the first thing I did, but I can't get it to work. I'm attaching the script and the scene so you can see what I'm doing wrong. This would be a good starting example for learning how to use the plugin on HoloLens. Thanks!


using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.SceneManagement;
using OpenCVForUnity.CoreModule;
using OpenCVForUnity.ImgprocModule;
using OpenCVForUnity.UnityUtils.Helper;
using OpenCVForUnity.UnityUtils;
using HoloLensWithOpenCVForUnity.UnityUtils.Helper;
using HoloLensCameraStream;
using OpenCVForUnityExample;

/// <summary>
/// HoloLens Comic Filter Example
/// An example of image processing (comic filter) using OpenCVForUnity on Hololens.
/// Referring to http://dev.classmethod.jp/smartphone/opencv-manga-2/.
/// </summary>
[RequireComponent(typeof(HololensCameraStreamToMatHelper))]
public class HoloLensColorFilterExample : MonoBehaviour
{
    /// <summary>
    /// max number of objects to be detected in frame
    /// </summary>
    const int MAX_NUM_OBJECTS = 50;

/// <summary>
/// minimum object area
/// </summary>
const int MIN_OBJECT_AREA = 20 * 20;

/// <summary>
/// max object area
/// </summary>
//int MAX_OBJECT_AREA;

/// <summary>
/// The rgb mat.
/// </summary>
Mat rgbMat;

/// <summary>
/// The threshold mat.
/// </summary>
Mat thresholdMat;

/// <summary>
/// The hsv mat.
/// </summary>
Mat hsvMat;

ColorObject blue = new ColorObject("blue");
ColorObject yellow = new ColorObject("yellow");
ColorObject red = new ColorObject("red");
ColorObject green = new ColorObject("green");

/// <summary>
/// The texture.
/// </summary>
Texture2D texture;

/// <summary>
/// The quad renderer.
/// </summary>
Renderer quad_renderer;

/// <summary>
/// The web cam texture to mat helper.
/// </summary>
HololensCameraStreamToMatHelper webCamTextureToMatHelper;

readonly static Queue<Action> ExecuteOnMainThread = new Queue<Action>();


// Use this for initialization
protected void Start()
{
    
    webCamTextureToMatHelper = gameObject.GetComponent<HololensCameraStreamToMatHelper>();

#if WINDOWS_UWP && !DISABLE_HOLOLENSCAMSTREAM_API
    webCamTextureToMatHelper.frameMatAcquired += OnFrameMatAcquired;
#endif

    //webCamTextureToMatHelper.outputColorFormat = WebCamTextureToMatHelper.ColorFormat.RGBA;
    webCamTextureToMatHelper.Initialize();
}

/// <summary>
/// Raises the web cam texture to mat helper initialized event.
/// </summary>
public void OnWebCamTextureToMatHelperInitialized()
{
    Debug.Log("OnWebCamTextureToMatHelperInitialized");

    Mat webCamTextureMat = webCamTextureToMatHelper.GetMat();

    texture = new Texture2D(webCamTextureMat.cols(), webCamTextureMat.rows(), TextureFormat.RGB24, false);
    texture.wrapMode = TextureWrapMode.Clamp;
    quad_renderer = gameObject.GetComponent<Renderer>() as Renderer;
    quad_renderer.sharedMaterial.SetTexture("_MainTex", texture);


    //Debug.Log("Screen.width " + Screen.width + " Screen.height " + Screen.height + " Screen.orientation " + Screen.orientation);

    rgbMat = new Mat(webCamTextureMat.rows(), webCamTextureMat.cols(), CvType.CV_8UC4);
    thresholdMat = new Mat();
    hsvMat = new Mat();


    Matrix4x4 projectionMatrix;

#if WINDOWS_UWP && !DISABLE_HOLOLENSCAMSTREAM_API
    projectionMatrix = webCamTextureToMatHelper.GetProjectionMatrix();
    quad_renderer.sharedMaterial.SetMatrix("_CameraProjectionMatrix", projectionMatrix);
#else
    // This value is obtained from PhotoCapture's TryGetProjectionMatrix() method. I do not know whether this method is good.
    // Please see the discussion of this thread. https://forums.hololens.com/discussion/782/live-stream-of-locatable-camera-webcam-in-unity
    projectionMatrix = Matrix4x4.identity;
    projectionMatrix.m00 = 2.31029f;
    projectionMatrix.m01 = 0.00000f;
    projectionMatrix.m02 = 0.09614f;
    projectionMatrix.m03 = 0.00000f;
    projectionMatrix.m10 = 0.00000f;
    projectionMatrix.m11 = 4.10427f;
    projectionMatrix.m12 = -0.06231f;
    projectionMatrix.m13 = 0.00000f;
    projectionMatrix.m20 = 0.00000f;
    projectionMatrix.m21 = 0.00000f;
    projectionMatrix.m22 = -1.00000f;
    projectionMatrix.m23 = 0.00000f;
    projectionMatrix.m30 = 0.00000f;
    projectionMatrix.m31 = 0.00000f;
    projectionMatrix.m32 = -1.00000f;
    projectionMatrix.m33 = 0.00000f;
    quad_renderer.sharedMaterial.SetMatrix("_CameraProjectionMatrix", projectionMatrix);
#endif

    float halfOfVerticalFov = Mathf.Atan(1.0f / projectionMatrix.m11);
    float aspectRatio = (1.0f / Mathf.Tan(halfOfVerticalFov)) / projectionMatrix.m00;
    Debug.Log("halfOfVerticalFov " + halfOfVerticalFov);
    Debug.Log("aspectRatio " + aspectRatio);


}

/// <summary>
/// Raises the web cam texture to mat helper disposed event.
/// </summary>
public void OnWebCamTextureToMatHelperDisposed()
{
    Debug.Log("OnWebCamTextureToMatHelperDisposed");

    if (rgbMat != null)
        rgbMat.Dispose();
    if (thresholdMat != null)
        thresholdMat.Dispose();
    if (hsvMat != null)
        hsvMat.Dispose();

    if (texture != null)
    {
        Texture2D.Destroy(texture);
        texture = null;
    }


    lock (ExecuteOnMainThread)
    {
        ExecuteOnMainThread.Clear();
    }
       
}

/// <summary>
/// Raises the web cam texture to mat helper error occurred event.
/// </summary>
/// <param name="errorCode">Error code.</param>
public void OnWebCamTextureToMatHelperErrorOccurred(WebCamTextureToMatHelper.ErrorCode errorCode)
{
    Debug.Log("OnWebCamTextureToMatHelperErrorOccurred " + errorCode);
}

#if WINDOWS_UWP && !DISABLE_HOLOLENSCAMSTREAM_API
public void OnFrameMatAcquired(Mat rgbaMat, Matrix4x4 projectionMatrix, Matrix4x4 cameraToWorldMatrix, CameraIntrinsics cameraIntrinsics)
{
    Imgproc.cvtColor(rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB);

    //then reds
    Imgproc.cvtColor(rgbMat, hsvMat, Imgproc.COLOR_RGB2HSV);
    Core.inRange(hsvMat, red.getHSVmin(), red.getHSVmax(), thresholdMat);
    morphOps(thresholdMat);
    trackFilteredObject(red, thresholdMat, hsvMat, rgbMat);

    

    Enqueue(() =>
    {

        if (!webCamTextureToMatHelper.IsPlaying()) return;

        Utils.fastMatToTexture2D(rgbaMat, texture);
        rgbaMat.Dispose();

        Matrix4x4 worldToCameraMatrix = cameraToWorldMatrix.inverse;

        quad_renderer.sharedMaterial.SetMatrix("_WorldToCameraMatrix", worldToCameraMatrix);

        // Position the canvas object slightly in front
        // of the real world web camera.
        Vector3 position = cameraToWorldMatrix.GetColumn(3) - cameraToWorldMatrix.GetColumn(2) * 2.2f;

        // Rotate the canvas object so that it faces the user.
        Quaternion rotation = Quaternion.LookRotation(-cameraToWorldMatrix.GetColumn(2), cameraToWorldMatrix.GetColumn(1));

        gameObject.transform.position = position;
        gameObject.transform.rotation = rotation;

    });
}

private void Update()
{
    lock (ExecuteOnMainThread)
    {
        while (ExecuteOnMainThread.Count > 0)
        {
            ExecuteOnMainThread.Dequeue().Invoke();
        }
    }
}

private void Enqueue(Action action)
{
    lock (ExecuteOnMainThread)
    {
        ExecuteOnMainThread.Enqueue(action);
    }
}

#else

// Update is called once per frame
void Update()
{
    if (webCamTextureToMatHelper.IsPlaying() && webCamTextureToMatHelper.DidUpdateThisFrame())
    {
        Mat rgbaMat = webCamTextureToMatHelper.GetMat();

        Imgproc.cvtColor(rgbaMat, rgbMat, Imgproc.COLOR_RGBA2RGB);
       
        Imgproc.cvtColor(rgbMat, hsvMat, Imgproc.COLOR_RGB2HSV);
        Core.inRange(hsvMat, red.getHSVmin(), red.getHSVmax(), thresholdMat);
        morphOps(thresholdMat);
        trackFilteredObject(red, thresholdMat, hsvMat, rgbMat);
        

        Utils.fastMatToTexture2D(rgbMat, texture);
    }

    if (webCamTextureToMatHelper.IsPlaying())
    {

        Matrix4x4 cameraToWorldMatrix = webCamTextureToMatHelper.GetCameraToWorldMatrix();
        Matrix4x4 worldToCameraMatrix = cameraToWorldMatrix.inverse;

        quad_renderer.sharedMaterial.SetMatrix("_WorldToCameraMatrix", worldToCameraMatrix);

        // Position the canvas object slightly in front
        // of the real world web camera.
        Vector3 position = cameraToWorldMatrix.GetColumn(3) - cameraToWorldMatrix.GetColumn(2) * 2.2f;

        // Rotate the canvas object so that it faces the user.
        Quaternion rotation = Quaternion.LookRotation(-cameraToWorldMatrix.GetColumn(2), cameraToWorldMatrix.GetColumn(1));

        gameObject.transform.position = position;
        gameObject.transform.rotation = rotation;
    }
}

#endif

/// <summary>
/// Draws the object.
/// </summary>
/// <param name="theColorObjects">The color objects.</param>
/// <param name="frame">Frame.</param>
/// <param name="temp">Temp.</param>
/// <param name="contours">Contours.</param>
/// <param name="hierarchy">Hierarchy.</param>
private void drawObject(List<ColorObject> theColorObjects, Mat frame, Mat temp, List<MatOfPoint> contours, Mat hierarchy)
{
    for (int i = 0; i < theColorObjects.Count; i++)
    {
        Imgproc.drawContours(frame, contours, i, theColorObjects[i].getColor(), 3, 8, hierarchy, int.MaxValue, new Point());
        Imgproc.circle(frame, new Point(theColorObjects[i].getXPos(), theColorObjects[i].getYPos()), 5, theColorObjects[i].getColor());
        Imgproc.putText(frame, theColorObjects[i].getXPos() + " , " + theColorObjects[i].getYPos(), new Point(theColorObjects[i].getXPos(), theColorObjects[i].getYPos() + 20), 1, 1, theColorObjects[i].getColor(), 2);
        Imgproc.putText(frame, theColorObjects[i].getType(), new Point(theColorObjects[i].getXPos(), theColorObjects[i].getYPos() - 20), 1, 2, theColorObjects[i].getColor(), 2);
    }
}

/// <summary>
/// Morphs the ops.
/// </summary>
/// <param name="thresh">Thresh.</param>
private void morphOps(Mat thresh)
{
    //create structuring element that will be used to "dilate" and "erode" image.
    //the element chosen here is a 3px by 3px rectangle
    Mat erodeElement = Imgproc.getStructuringElement(Imgproc.MORPH_RECT, new Size(3, 3));
    //dilate with larger element so make sure object is nicely visible
    Mat dilateElement = Imgproc.getStructuringElement(Imgproc.MORPH_RECT, new Size(8, 8));

    Imgproc.erode(thresh, thresh, erodeElement);
    Imgproc.erode(thresh, thresh, erodeElement);

    Imgproc.dilate(thresh, thresh, dilateElement);
    Imgproc.dilate(thresh, thresh, dilateElement);
}

/// <summary>
/// Tracks the filtered object.
/// </summary>
/// <param name="theColorObject">The color object.</param>
/// <param name="threshold">Threshold.</param>
/// <param name="HSV">HSV.</param>
/// <param name="cameraFeed">Camera feed.</param>
private void trackFilteredObject(ColorObject theColorObject, Mat threshold, Mat HSV, Mat cameraFeed)
{

    List<ColorObject> colorObjects = new List<ColorObject>();
    Mat temp = new Mat();
    threshold.copyTo(temp);
    //these two vectors needed for output of findContours
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    //find contours of filtered image using openCV findContours function
    Imgproc.findContours(temp, contours, hierarchy, Imgproc.RETR_CCOMP, Imgproc.CHAIN_APPROX_SIMPLE);

    //use moments method to find our filtered object
    bool colorObjectFound = false;
    if (hierarchy.rows() > 0)
    {
        int numObjects = hierarchy.rows();

        //                      Debug.Log("hierarchy " + hierarchy.ToString());

        //if number of objects greater than MAX_NUM_OBJECTS we have a noisy filter
        if (numObjects < MAX_NUM_OBJECTS)
        {
            for (int index = 0; index >= 0; index = (int)hierarchy.get(0, index)[0])
            {

                Moments moment = Imgproc.moments(contours[index]);
                double area = moment.get_m00();

                //if the area is less than 20px by 20px then it is probably just noise
                //if the area is the same as 3/2 of the image size, it is probably just a bad filter
                //we only want the object with the largest area, so we save a reference area each
                //iteration and compare it to the area in the next iteration.
                if (area > MIN_OBJECT_AREA)
                {

                    ColorObject colorObject = new ColorObject();

                    colorObject.setXPos((int)(moment.get_m10() / area));
                    colorObject.setYPos((int)(moment.get_m01() / area));
                    colorObject.setType(theColorObject.getType());
                    colorObject.setColor(theColorObject.getColor());

                    colorObjects.Add(colorObject);

                    colorObjectFound = true;

                }
                else
                {
                    colorObjectFound = false;
                }
            }
            //let user know you found an object
            if (colorObjectFound == true)
            {
                //draw object location on screen
                drawObject(colorObjects, cameraFeed, temp, contours, hierarchy);
            }

        }
        else
        {
            Imgproc.putText(cameraFeed, "TOO MUCH NOISE!", new Point(5, cameraFeed.rows() - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar(255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
        }
    }
}

/// <summary>
/// Raises the destroy event.
/// </summary>
void OnDestroy()
{

#if WINDOWS_UWP && !DISABLE_HOLOLENSCAMSTREAM_API
    webCamTextureToMatHelper.frameMatAcquired -= OnFrameMatAcquired;
#endif

    webCamTextureToMatHelper.Dispose();
}

/// <summary>
/// Raises the back button click event.
/// </summary>
public void OnBackButtonClick()
{
    SceneManager.LoadScene("HoloLensWithOpenCVForUnityExample");
}

/// <summary>
/// Raises the play button click event.
/// </summary>
public void OnPlayButtonClick()
{
    webCamTextureToMatHelper.Play();
}

/// <summary>
/// Raises the pause button click event.
/// </summary>
public void OnPauseButtonClick()
{
    webCamTextureToMatHelper.Pause();
}

/// <summary>
/// Raises the stop button click event.
/// </summary>
public void OnStopButtonClick()
{
    webCamTextureToMatHelper.Stop();
}

/// <summary>
/// Raises the change camera button click event.
/// </summary>
public void OnChangeCameraButtonClick()
{
    webCamTextureToMatHelper.requestedIsFrontFacing = !webCamTextureToMatHelper.IsFrontFacing();
}

}

chismer avatar Jul 04 '21 16:07 chismer

I've modified the script you sent me. I also changed the gray shader in the example scene to the color shader. It's working fine in my environment. Try it out.

HololensColorFilterExample.zip

EnoxSoftware avatar Jul 05 '21 17:07 EnoxSoftware

Guys, can you explain what is happening in this line of code:

Vector3 position = cameraToWorldMatrix.GetColumn(3) - cameraToWorldMatrix.GetColumn(2) * 2.2f;

What are Columns 3 and 2?

I need to reposition or resize the quad slightly. I want to make it black (transparent) and apply augmentations to it. It overlays distant objects well, but for closer objects the quad sits a little low and is too big.

I don't know how to position the quad using the cameraToWorldMatrix code above, but I really like how it tracks with the user's head movement. I need this functionality.
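Not the thread authors, but to sketch what those columns mean: for a rigid camera-to-world transform, the first three columns of the 4x4 matrix are the camera's local axes expressed in world space, and the fourth column is the camera's world position. Unity follows the OpenGL convention here, so the camera looks down its negative Z axis, which is why column 2 is subtracted (and negated again in the `LookRotation` call). A hedged sketch, with the `2.2f` distance replaced by a hypothetical adjustable value:

```csharp
// Columns of a camera-to-world TRS matrix (Unity / OpenGL convention):
//   GetColumn(0) = camera's right axis (world space)
//   GetColumn(1) = camera's up axis
//   GetColumn(2) = camera's backward axis (the camera looks along -Z)
//   GetColumn(3) = camera's position (translation)
//
// "GetColumn(3) - GetColumn(2) * 2.2f" therefore means: start at the camera
// position and move 2.2 m along the camera's forward direction.

float distance = 1.5f; // hypothetical: place the quad closer than 2.2 m

Vector3 position = cameraToWorldMatrix.GetColumn(3)
                 - cameraToWorldMatrix.GetColumn(2) * distance;

// Face the quad toward the user: forward = -backward axis, up = up axis.
Quaternion rotation = Quaternion.LookRotation(
    -cameraToWorldMatrix.GetColumn(2),
    cameraToWorldMatrix.GetColumn(1));

gameObject.transform.position = position;
gameObject.transform.rotation = rotation;

// Resizing is independent of the positioning math above; scale the transform
// (hypothetical factor) if the quad appears too large at close range.
gameObject.transform.localScale = Vector3.one * 0.5f;
```

Note that if you reduce the distance without rescaling, the quad will cover a larger portion of the view, since the example sizes the quad for the original 2.2 m distance.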

A00107408 avatar Aug 06 '21 09:08 A00107408