
Add support for MoesHouse Presence sensor

Open rabin-io opened this issue 1 year ago • 23 comments

After working with it as a custom quirk for some time, I think it is stable and works properly enough to be included upstream.

See https://github.com/zigpy/zha-device-handlers/issues/1645 for full details and history of this work.

Co-authored-by: PlusPlus-ua

rabin-io avatar Mar 15 '23 11:03 rabin-io

Pull Request Test Coverage Report for Build 4427008309

  • 39 of 71 (54.93%) changed or added relevant lines in 1 file are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage decreased (-0.2%) to 83.679%

Changes Missing Coverage         Covered Lines   Changed/Added Lines   %
zhaquirks/tuya/ts0601_radar.py   39              71                    54.93%
Total:                           39              71
Totals Coverage Status
Change from base Build 4409564543: -0.2%
Covered Lines: 6937
Relevant Lines: 8290

💛 - Coveralls

coveralls avatar Mar 15 '23 11:03 coveralls

Codecov Report

Attention: Patch coverage is 54.92958% with 32 lines in your changes missing coverage. Please review.

Project coverage is 83.67%. Comparing base (fa60b9d) to head (70bc35a). Report is 296 commits behind head on dev.

:exclamation: Current head 70bc35a differs from pull request most recent head a6e5be5. Consider uploading reports for the commit a6e5be5 to get more accurate results

Files                            Patch %   Lines
zhaquirks/tuya/ts0601_radar.py   54.92%    32 Missing :warning:
Additional details and impacted files
@@            Coverage Diff             @@
##              dev    #2275      +/-   ##
==========================================
- Coverage   83.92%   83.67%   -0.25%     
==========================================
  Files         260      261       +1     
  Lines        8219     8290      +71     
==========================================
+ Hits         6898     6937      +39     
- Misses       1321     1353      +32     

:umbrella: View full report in Codecov by Sentry.
:loudspeaker: Have feedback on the report? Share it here.

codecov-commenter avatar Mar 15 '23 12:03 codecov-commenter

Thanks for the PR first of all. We likely need to combine this with https://github.com/zigpy/zha-device-handlers/blob/dev/zhaquirks/tuya/ts0601_motion.py a bit (it looks like that file was modified and is your quirk now).

Some model infos are duplicated now. Can you post all "manufacturer names" (_TZE200_xxxx) this code was actually tested with?

TheJulianJES avatar Mar 15 '23 16:03 TheJulianJES

I can only speak for my setup, which includes only _TZE204_ztc6ggyl.
Some others mentioned other models in the original thread where this code was introduced and said it's working for them as well.

We can remove the support for _TZE204_ztc6ggyl from ts0601_motion.py, as it didn't expose all the options for this device.

rabin-io avatar Mar 15 '23 17:03 rabin-io

Some model infos are duplicated now. Can you post all "manufacturer names" (_TZE200_xxxx) this code was actually tested with?

I've tested this code with '_TZE200_ikvncluo' and '_TZE204_ztc6ggyl'; both work well. '_TZE200_ztc6ggyl' might be my typo, I just missed the 4 in the first part.

PlusPlus-ua avatar Mar 16 '23 06:03 PlusPlus-ua

IMHO all this code should go inside the current ts0601_motion.py file. Your TuyaMmwRadarCluster should definitely replace the current MmwRadarManufCluster class. Then MmwRadarMotion would be replaced with TuyaMmwRadarOccupancy, merging all the models_info (see the sketch below).
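For illustration, the merged model list could look roughly like this; a sketch only, using the MODELS_INFO constant from zhaquirks.const and the manufacturer names confirmed in this thread (the real quirk signature has more fields than shown here):

from zhaquirks.const import MODELS_INFO

# Sketch: models_info merged from ts0601_motion.py and this PR's quirk.
MERGED_RADAR_MODELS_INFO = {
    MODELS_INFO: [
        ("_TZE200_ikvncluo", "TS0601"),
        ("_TZE200_ztc6ggyl", "TS0601"),
        ("_TZE204_ztc6ggyl", "TS0601"),
    ],
}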

I have mixed feelings about the new TuyaAttributesCluster classes. Nowadays we have a mechanism to expose the cluster attributes in HA, but I also appreciate (and thank you for) your implementation. For now, I'm fine with keeping it as is inside the ts0601_motion.py file.

PS: and thank you very much for your contribution 🎉

javicalle avatar Mar 16 '23 20:03 javicalle

Thanks for the contribution.

The attributes are already mapped by the MmwRadarManufCluster in ts0601_motion.py (the names aren't sexy, but they are mapped). Just the entities are missing in zha. The proposed solution works, but IMO the main part of the attributes should be configuration entities, and, I might be wrong, but the only way I know to do so is to add the entities in zha, not in a quirk. I have it working; it looks like this: (screenshot of the resulting configuration entities in HA)

@javicalle what's your point of view? According to your answer, I will do the PR on zha in ha-core.

bemble avatar Mar 16 '23 22:03 bemble

I might be wrong, but the only way I know to do so is to add the entities in zha, not in a quirk.

I think it's definitely the better way.

PlusPlus-ua avatar Mar 17 '23 06:03 PlusPlus-ua

I think it's definitely the better way.

It's not fully functional, I still have some bugs. It's like there is a mix-up in the attributes: either the occupancy sensor is not updated, or the min/max range and fading time are not written.

bemble avatar Mar 17 '23 11:03 bemble

@bemble are you extending the NoManufacturerCluster in the MmwRadarManufCluster?

javicalle avatar Mar 17 '23 14:03 javicalle

@javicalle It looks like it's not (I'd prefer not to change the already existing quirk, but I guess I'll have to): https://github.com/zigpy/zha-device-handlers/blob/511aeff858ee40060b3b546bd5a6690895c40111/zhaquirks/tuya/ts0601_motion.py#L118

bemble avatar Mar 17 '23 14:03 bemble

For the reported behavior I would suspect that the NoManufacturerCluster is required to update the values. Try changing the definition to:

class MmwRadarManufCluster(NoManufacturerCluster, TuyaMCUCluster):

You will also need to add the new import like:

from zhaquirks.tuya import (
    DPToAttributeMapping,
    NoManufacturerCluster,
    TuyaLocalCluster,
    TuyaManufCluster,
    TuyaNewManufCluster,
)

Whether they are sexy or not, having homogeneous names for the same functions/controls will make it easy to reuse HA control definitions (i.e. it will be easier to match several sensitivity attributes to one control if they use the same attribute name).

javicalle avatar Mar 17 '23 15:03 javicalle

From my tests, it's more the DPToAttributeMapping. With the DPToAttributeMapping from zhaquirks.tuya.mcu, attributes are used:

from zhaquirks.tuya import (
    TuyaLocalCluster,
    TuyaManufCluster,
    TuyaNewManufCluster,
)
from zhaquirks.tuya.mcu import (
    DPToAttributeMapping,
    TuyaMCUCluster,
)

Whereas with the DPToAttributeMapping from zhaquirks.tuya, as in the current version of the quirk, attributes are ignored:

from zhaquirks.tuya import (
    DPToAttributeMapping,
    TuyaLocalCluster,
    TuyaManufCluster,
    TuyaNewManufCluster,
)
from zhaquirks.tuya.mcu import (
    TuyaMCUCluster,
)

And yes, since I'm modifying the quirk, I will rename the attributes; sensitivity is better than dp_2 in any case. Should I keep them in MmwRadarManufCluster or, like in the proposed PR, move them into dedicated clusters?
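For reference, a dedicated per-attribute cluster in the style of the proposed PR can look roughly like this. This is only a sketch: it assumes the TuyaAttributesCluster helper (nowadays in zhaquirks.tuya.mcu), and the attribute name and limits are illustrative, not the PR's exact code.

from zigpy.zcl.clusters.general import AnalogOutput

from zhaquirks.tuya.mcu import TuyaAttributesCluster


class TuyaMmwRadarSensitivity(TuyaAttributesCluster, AnalogOutput):
    """Sensitivity exposed as a writable AnalogOutput cluster (sketch)."""

    def __init__(self, *args, **kwargs):
        """Pre-fill the descriptive AnalogOutput attributes."""
        super().__init__(*args, **kwargs)
        self._update_attribute(self.attributes_by_name["description"].id, "sensitivity")
        self._update_attribute(self.attributes_by_name["min_present_value"].id, 0)
        self._update_attribute(self.attributes_by_name["max_present_value"].id, 9)
        self._update_attribute(self.attributes_by_name["resolution"].id, 1)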

bemble avatar Mar 17 '23 15:03 bemble

I would keep the attribute names from that PR.

IMHO the ZHA attributes implementation is the way to go (better than the TuyaAttributesCluster) but I would also like to know the opinion of @rabin-io

javicalle avatar Mar 17 '23 20:03 javicalle

Yes, my attributes look like this right now:

    attributes = TuyaMCUCluster.attributes.copy()
    attributes.update(
        {
            # random attribute IDs
            0xEF02: ("sensitivity", t.uint32_t, True),
            0xEF03: ("min_range", t.uint32_t, True),
            0xEF04: ("max_range", t.uint32_t, True),
            0xEF06: ("self_test", TuyaMmwRadarSelfTest, True),
            0xEF65: ("detection_delay", t.uint32_t, True),
            0xEF66: ("fading_time", t.uint32_t, True),
            0xEF67: ("cli", t.CharacterString, True),
            0xEF69: ("dp_105", t.enum8, True),
            0xEF6A: ("dp_106", t.enum8, True),
            0xEF6B: ("dp_107", t.enum8, True),
            0xEF6C: ("dp_108", t.uint32_t, True),
        }
    )

But I have some problems: I have to restart HA after the device is added to have all the entities displayed.

Another problem, bigger IMO: the target_distance worked for about an hour, and after that it stayed at 0.0. I tried with another device with the same quirk, etc.; same behavior. When looking at the attribute in the Manage Zigbee Device section, it stays at 0.0. I don't know if I set a bad combination of min range, max range, sensitivity, fading time, etc. and the device no longer computes this value, or if it's something else. I don't know how to factory reset the device (reconfiguring it in zha or removing/re-pairing didn't change anything; self-check is "success").

bemble avatar Mar 18 '23 11:03 bemble

But I have some problems: I have to restart HA after the device is added to have all the entities displayed.

Probably related to the entity initialization. Check the backlight_mode and power_on_state initialization here: https://github.com/javicalle/home-assistant/blob/b4a3a663cf7f986a4d386a9b8b4809a272085077/homeassistant/components/zha/core/channels/manufacturerspecific.py#L62-L94

Checking the code, it seems that the target_distance (0xEF09) attribute from the TuyaMmwRadarCluster cluster is not updating. Tuya DP9 is mapped to the TuyaMmwRadarTargetDistance. If the present_value attribute from the TuyaMmwRadarTargetDistance isn't updating either, enable the debug logs for the integration and check the logs for DP=9 reports.

(PS: if you are going with the ZHA attributes approach, why not also do it for the rest of the DPs: Occupancy, RadarTargetDistance and Illuminance?)

javicalle avatar Mar 18 '23 20:03 javicalle

@javicalle This PR is not mine alone; most of it was written by @PlusPlus-ua, I only made it work with the latest HA release. I just didn't want all this work to go to waste and asked to make a PR for it, to be merged upstream.

And personally, I would prefer the path with the least work :)

rabin-io avatar Mar 19 '23 00:03 rabin-io

@rabin-io whatever the final solution will be, your code was already useful. I used it to understand things and how the device works (the last one: I now know why the target_distance didn't move, simply because I set the max and min range in meters, but the device expects centimeters).

I prefer the cleanest path, even if it's more work.

bemble avatar Mar 19 '23 13:03 bemble

OK, on my side I'm stuck:

  1. The right method:
    1. I like the approach of the given quirk, but I can't match the entities in ZHA to set them as config. I can't find the right CONFIG_DIAGNOSTIC_MATCH rules.
    2. Updating the current quirk (zhaquirks.tuya.ts0601_motion.MmwRadarMotion) and adding the entities in ZHA works, but I have to restart HA to get the entities to show up after pairing the device, even after adding config in manufacturerspecific.py as suggested by @javicalle
  2. The right units:
    • after some hours spent trying to get human-usable units (meters and seconds), I now understand why @rabin-io and @PlusPlus-ua used milliseconds for fading time and delay, and centimeters for min and max range, instead of seconds and meters: for a reason I couldn't find, by the time the value reaches the dp_converter it's not a float anymore, it's an int (see the small illustration below).
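A tiny standalone illustration of that int truncation (plain Python, not zhaquirks code; the function name is just for the example):

def dp_converter(meters):
    """Convert a range entered in meters into the centimeters the device expects."""
    return meters * 100

print(dp_converter(4.05))       # 405.0 -> what we would want
print(dp_converter(int(4.05)))  # 400   -> what happens when the value already arrived as an int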

IMHO, neither of the two solutions is good enough to be merged as-is, but I'm stuck with no better proposal. Here are some pieces of code if anybody is interested.

Sample of an entity in ZHA (number in this case):

@CONFIG_DIAGNOSTIC_MATCH(
    channel_names="tuya_manufacturer",
    quirk_classes={
        "zhaquirks.tuya.ts0601_motion.MmwRadarMotion",
        "ts0601_motion.MmwRadarMotion", # my override of zhaquirks.tuya.ts0601_motion.MmwRadarMotion
        "ts0601_radar.TuyaMmwRadarOccupancy", # from #2275
    },
)
class TuyaMmwRadarMaximumRange(ZHANumberConfigurationEntity, id_suffix="maximum_range"):
    """Tuya Mmw Radar maxiumum range configuration entity."""

    _attr_entity_category = EntityCategory.CONFIG
    _attr_device_class: NumberDeviceClass = NumberDeviceClass.DISTANCE
    _attr_icon: str = "mdi:tape-measure"
    _attr_native_min_value: float = 0
    _attr_native_max_value: float = 950
    _attr_native_step: float = 10
    _attr_native_unit_of_measurement: str | None = UnitOfLength.CENTIMETERS
    _zcl_attribute: str = "max_range"
    _attr_name: str = "Maximum range"

The cluster in the "default" quirk:

class MmwRadarManufCluster(NoManufacturerCluster, TuyaMCUCluster):
    """Neo manufacturer cluster."""

    # # Possible DPs and values
    # presence_state: presence
    # target distance: 1.61m
    # illuminance: 250lux
    # sensitivity: 9
    # minimum_detection_distance: 0.00m
    # maximum_detection_distance: 4.05m
    # dp_detection_delay: 0.1
    # dp_fading_time: 5.0
    # ¿illuminance?: 255lux
    # presence_brightness: no control
    # no_one_brightness: no control
    # current_brightness: off

    attributes = TuyaMCUCluster.attributes.copy()
    attributes.update(
        {
            # random attribute IDs
            0xEF01: ("occupancy", t.uint32_t, True),
            0xEF02: ("sensitivity", t.uint32_t, True),
            0xEF03: ("min_range", t.uint32_t, True),
            #0xEF03: ("min_range", float, True), # doesn't change the cast problem
            0xEF04: ("max_range", t.uint32_t, True),
            0xEF06: ("self_test", TuyaMmwRadarSelfTest, True),
            0xEF09: ("target_distance", t.uint32_t, True),
            0xEF65: ("detection_delay", t.uint32_t, True),
            0xEF66: ("fading_time", t.uint32_t, True),
            0xEF67: ("cli", t.CharacterString, True),
            0xEF69: ("dp_105", t.enum8, True),
            0xEF6A: ("dp_106", t.enum8, True),
            0xEF6B: ("dp_107", t.enum8, True),
            0xEF6C: ("dp_108", t.uint32_t, True),
        }
    )

    dp_to_attribute: Dict[int, DPToAttributeMapping] = {
        1: DPToAttributeMapping(
            TuyaOccupancySensing.ep_attribute,
            "occupancy",
        ),
        2: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "sensitivity",
        ),
        3: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "min_range",
            # converter=lambda x: x / 100,
            # dp_converter=lambda x: x * 100, # x is an int here, we lose the precision if we convert to meters
        ),
        4: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "max_range",
            # converter=lambda x: x / 100,
            # dp_converter=lambda x: x * 100,
        ),
        6: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "self_test",
        ),
        9: DPToAttributeMapping(
            TuyaAnalogInput.ep_attribute,
            "present_value",
            lambda x: x / 100,
        ),
        101: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "detection_delay",
            converter=lambda x: x * 100,
            dp_converter=lambda x: x // 100,
        ),
        102: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "fading_time",
            converter=lambda x: x * 100,
            dp_converter=lambda x: x // 100,
        ),
        103: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "cli",
        ),
        104: DPToAttributeMapping(
            TuyaIlluminanceMeasurement.ep_attribute,
            "measured_value",
            lambda x: 10000 * math.log10(x) + 1,
        ),
        105: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "dp_105",
        ),
        106: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "dp_106",
        ),
        107: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "dp_107",
        ),
        108: DPToAttributeMapping(
            TuyaMCUCluster.ep_attribute,
            "dp_108",
        ),
    }

    data_point_handlers = {
        1: "_dp_2_attr_update",
        2: "_dp_2_attr_update",
        3: "_dp_2_attr_update",
        4: "_dp_2_attr_update",
        6: "_dp_2_attr_update",
        9: "_dp_2_attr_update",
        101: "_dp_2_attr_update",
        102: "_dp_2_attr_update",
        103: "_dp_2_attr_update",
        104: "_dp_2_attr_update",
        105: "_dp_2_attr_update",
        106: "_dp_2_attr_update",
        107: "_dp_2_attr_update",
        108: "_dp_2_attr_update",
    }

And the channel:

@registries.CHANNEL_ONLY_CLUSTERS.register(registries.TUYA_MANUFACTURER_CLUSTER)
@registries.ZIGBEE_CHANNEL_REGISTRY.register(registries.TUYA_MANUFACTURER_CLUSTER)
class TuyaChannel(ZigbeeChannel):
    """Channel for the Tuya manufacturer Zigbee cluster."""

    REPORT_CONFIG = (
        AttrReportConfig(attr="occupancy", config=REPORT_CONFIG_DEFAULT), # doesn't change even with REPORT_CONFIG_ASAP
        AttrReportConfig(attr="sensitivity", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="min_range", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="max_range", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="self_test", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="target_distance", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="detection_delay", config=REPORT_CONFIG_DEFAULT),
        AttrReportConfig(attr="fading_time", config=REPORT_CONFIG_DEFAULT),
    )

    def __init__(self, cluster: zigpy.zcl.Cluster, ch_pool: ChannelPool) -> None:
        """Initialize TuyaChannel."""
        super().__init__(cluster, ch_pool)

        if self.cluster.endpoint.manufacturer in (
            "_TZE200_7tdtqgwv",
            "_TZE200_amp6tsvy",
            "_TZE200_oisqyl4o",
            "_TZE200_vhy3iakz",
            "_TZ3000_uim07oem",
            "_TZE200_wfxuhoea",
            "_TZE200_tviaymwx",
            "_TZE200_g1ib5ldv",
            "_TZE200_wunufsil",
            "_TZE200_7deq70b8",
            "_TZE200_tz32mtza",
            "_TZE200_2hf7x9n3",
            "_TZE200_aqnazj70",
            "_TZE200_1ozguk6x",
            "_TZE200_k6jhsr0q",
            "_TZE200_9mahtqtg",
        ):
            self.ZCL_INIT_ATTRS = {  # pylint: disable=invalid-name
                "backlight_mode": True,
                "power_on_state": True,
            }
        elif self.cluster.endpoint.manufacturer in (
            "_TZE200_ar0slwnd",
            "_TZE200_sfiy5tfs",
            "_TZE200_mrf6vtua",
            "_TZE200_ztc6ggyl",
            "_TZE204_ztc6ggyl",
            "_TZE200_ikvncluo",
        ):
            self.ZCL_INIT_ATTRS = {  # pylint: disable=invalid-name
                "occupancy": True,
                "sensitivity": True,
                "min_range": True,
                "max_range": True,
                "self_test": True,
                "target_distance": True,
                "detection_delay": True,
                "fading_time": True,
            }

bemble avatar Mar 20 '23 00:03 bemble

@bemble, just checking, did you ever submit a PR to zha with this stuff?

arbrandes avatar Sep 04 '23 22:09 arbrandes

@bemble, just checking, did you ever submit a PR to zha with this stuff?

No, for 2 reasons:

  1. I didn't find a good solution IMO (see my last message)
  2. I no longer have the device, nor the time to work on it (even if I bought a new one)

bemble avatar Sep 05 '23 09:09 bemble

I have the _TZE204_ztc6ggyl device and this quirk doesn't work for me. I can see five controls and one Occupancy sensor; I can adjust the sensitivity control between 1 and 9 and this is registered in the logbook, but the sensor itself never changes and always shows as Clear.

rahulpdev avatar Oct 30 '23 21:10 rahulpdev

There hasn't been any activity on this pull request recently. This pull request has been automatically marked as stale because of that and will be closed if no further activity occurs within 7 days. Thank you for your contributions.

github-actions[bot] avatar Apr 27 '24 22:04 github-actions[bot]