feat: Support modern vehicles using HTTP proxy
This is my first attempt at implementing Tesla HTTP Proxy so that this integration can continue working with modern vehicles. It is currently working for me with a 2021 Model 3, both for reading sensors and sending commands.
This change is necessary because Tesla has deprecated the Owner API, so commands now require end-to-end encryption. Reference: https://github.com/teslamotors/vehicle-command
This change depends on my fork of teslajsonpy and the HTTP Proxy add-on.
I'm leaving this as a draft because I need some help cleaning this up and adding tests.
TODO: Update teslajsonpy requirement once this PR is merged.
Closes #743.
Currently, I get a crash if I don't provide an "SSL certificate for proxy" when using this code.
This should be fixed now
I'm having trouble figuring out how to run the tests on this. In a venv, after running `poetry install`, pytest fails like this:
% pytest tests
Traceback (most recent call last):
File "/opt/homebrew/lib/python3.11/site-packages/_pytest/config/__init__.py", line 782, in import_plugin
__import__(importspec)
ModuleNotFoundError: No module named 'pytest_homeassistant_custom_component'
pip shows the module is installed:
% pip freeze | grep pytest-homeassistant
pytest-homeassistant-custom-component==0.13.45
Make sure you're in the poetry environment: `poetry shell`.
Teslajsonpy 3.10.0 has been released so the version can be updated.
The whole setup appears to be working correctly so I think we can move this out of draft status.
One suggestion: I would update the readme to include some details about the new API/http-proxy and what the new input fields in the setup flow are for.
Edit: fixed the connection issue, can confirm it works.
Thanks, I will finish it up this weekend. I already updated the README, and linked to the add-on which has more details. Do you think it needs more explanation?
The explanation you've added is already great. I'd just like to add a bit of clarity regarding how you can know (or find out) if you need the new API or can stick with the old one. For instance:
- That the cut-over for requiring the new API is cars built after late 2021 (I believe?). Older cars should keep working.
- That this error in the logs indicates they need to make the switch to the new API:
[teslajsonpy.connection] 403: {"response":null,"error":"Tesla Vehicle Command Protocol required, please refer to the documentation here: https://developer.tesla.com/docs/fleet-api#2023-10-09-rest-api-vehicle-commands-endpoint-deprecation-warning","error_description":""}
- That the 3 fields (proxy url, cert, and client id) in the set-up flow are only required for new API and should otherwise be left empty
Maybe even a suggestion to stick with the old one if it still works because the new one might start costing money at some time in the (not so distant) future.
In any case, that's the information I'd like to have if I was going through the setup not knowing about the 2 different APIs.
> That the cut-over for requiring the new API is cars built after late 2021 (I believe?). Older cars should keep working.
Based on the following, wouldn't cars from before late 2021 be impacted as well if they had not used the old API in the "preceding 30 days"? Perhaps I don't fully understand the case.
> November 2023 - newly delivered vehicles will only support the new API
> Nov - Dec 2023 - the old API will stop working on existing vehicles that have not used the old API in the preceding 30 days
> January 2024 - the old API will stop working on all vehicles
[edit] Is this pre-2021 exclusion only for S/X? I'm not sure where the 2021 comes from; I've read #774 and it's still not clear in my mind.
I went to check the docs again, and it seems that it is indeed only 'pre-2021 Model S/X' that will be allowed to keep using the old API:
> *Fleet accounts are excluded from these changes until further notice. Pre-2021 Model S/X are excluded from these changes.
https://developer.tesla.com/docs/fleet-api#2023-11-17-vehicle-commands-endpoint-deprecation-timeline-action-required
My 2018 M3 doesn't even work with the old API anymore, most likely because I had more important things (health) going on and hadn't triggered at least one command in the preceding 30 days. I guess there are a few cases to cover in the doc.
All these exceptions sure make the Q&A "form to fill" to configure the integration a little more complex. It makes me wonder if there shouldn't be two independent integrations covering the old vs. new API but using the same base library, if that makes sense.
I'm not too aware of the coding and only have one car. Will the final integration work for an owner of two cars if one uses the old API and the other the new API? If it already covers this case, fantastic.
As of January 2024, there is no Tesla vehicle except the pre-2021 Model S/X that can run without the proxy. It doesn't matter if you had sent commands every minute or once every 30 days: if you don't have a pre-2021 Model S/X, then as it currently stands, this integration is basically dead without the proxy running for 95% of users.
> As of January 2024, there is no Tesla vehicle except the pre-2021 Model S/X that can run without the proxy.
Hi, maybe I am one of the 5% but I have been using this integration for commands (with TeslaMate for logging) mainly to charge my 2023 Model Y on solar. That has worked throughout and was working today with commands for start and stop charging and multiple change charging amps. I am in Australia. I have been watching this and hoping you guys do something clever before it stops! So thanks for the work, and I am happy to help within my limitations (no relevant programming experience)
That's very interesting, that seems to imply that Tesla didn't quite shut down the http api in January like they said, how strange...
> Hi, maybe I am one of the 5% but I have been using this integration for commands (with TeslaMate for logging) mainly to charge my 2023 Model Y on solar. [...]
I do the same, but I also pay for Tessie and I migrated to the Home Assistant supported Tessie integration for controlling charging, which does support the new Tesla API. So now I call an API that calls another API.
> That's very interesting, that seems to imply that Tesla didn't quite shut down the http api in January like they said, how strange...
That was my assumption, and I was sort of hoping that it indicated Tesla might come up with something for owners. I was aware of the Tessie integration (and others like ChargeHQ) but would rather keep everything local if I can. It would be interesting to know how many others are in the same boat. Happy to act as a canary in the coal mine (:-)>
> Hi, maybe I am one of the 5% but I have been using this integration for commands (with TeslaMate for logging) mainly to charge my 2023 Model Y on solar. [...]
Same for me with a 2023 Shanghai MY. I'm in Germany.
It's pretty safe to say that the API shutdown in January hasn't occurred yet (as I'd be affected as well); otherwise I'd have expected an explosion of people reporting issues.
> Will the final integration work for an owner of two cars if one uses the old API and the other the new API?
As I understand it, the Fleet API should work on all vehicles. Older vehicles don’t require the proxy, but I’m guessing they should work with it anyway. Someone will have to test to be sure.
> It's pretty safe to say that the API shutdown in January hasn't occurred yet (as I'd be affected as well); otherwise I'd have expected an explosion of people reporting issues.
Interesting. Perhaps the API is still functioning for cars that kept regularly sending commands? That's not the case for my 2018 M3; the old API refuses to accept any commands.
@llamafilm I hope to do a release this weekend. Hopefully this can make it in. If so, please resolve conflicts and convert to non-Draft when ready. I'd also appreciate if you were around when it goes live to try to handle any issues that are raised. If you don't plan to be around or don't want to handle that, that's fine too.
I'll merge this in regardless once it's non-draft.
I'm reworking this to simplify the config flow depending on whether you want to use the proxy or not, and clarify the instructions. I'll try to finish it tonight. I'll be happy to help handle feedback once it goes live, but I may not be able to respond immediately.
I found a bug in the library, so I'll leave this as a draft until this PR gets merged: https://github.com/zabuldon/teslajsonpy/pull/460
Please address the broken config flow tests.
@thierryvt can you help with the tests? I changed the config flow to make it easier to understand. So the first step asks whether you want to use the proxy. If no, then step 2 looks the same as it did before. If yes, then step 2 adds the extra fields. If the tesla_http_proxy addon is configured correctly, this step will autofill those values for you. The addon is technically optional, as I did not want it to be a dependency for this component. So if someone wanted to run their own proxy outside HA, they could paste those values in here.
Step 1:
Step 2:
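For anyone reviewing the config flow change, here is a rough sketch of how the two-step flow described above can be structured with Home Assistant's config flow API. This is not the exact code in this PR; `DOMAIN` and every field name other than `api_proxy_enable` (which matches roughly the "proxy url, cert, and client id" fields discussed earlier) are placeholders for illustration.

```python
# Rough sketch of a two-step config flow; NOT the code from this PR.
import voluptuous as vol

from homeassistant import config_entries

DOMAIN = "tesla_custom"  # assumed integration domain


class TeslaConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
    """Step 1 asks whether to use the proxy; step 2 shows the matching form."""

    async def async_step_user(self, user_input=None):
        if user_input is None:
            return self.async_show_form(
                step_id="user",
                data_schema=vol.Schema(
                    {vol.Required("api_proxy_enable", default=False): bool}
                ),
            )
        # Remember the answer so step 2 knows which fields to show.
        self._proxy_enabled = user_input["api_proxy_enable"]
        return await self.async_step_credentials()

    async def async_step_credentials(self, user_input=None):
        fields = {vol.Required("username"): str, vol.Required("token"): str}
        if self._proxy_enabled:
            # Extra fields only needed for the Fleet API / HTTP proxy path.
            fields[vol.Required("api_proxy_url")] = str
            fields[vol.Required("api_proxy_cert")] = str
            fields[vol.Required("client_id")] = str
        if user_input is None:
            return self.async_show_form(
                step_id="credentials", data_schema=vol.Schema(fields)
            )
        return self.async_create_entry(title=user_input["username"], data=user_input)
```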
Quick comment: sadly, Tesla Tokens on Android seems to be gone from the Google Play store.
According to this site, the last release was Oct 2023.
I've used this site before, not sure if it still works... obviously, disclaimer: use at your own risk :/
https://tesla-info.com/tesla-token.php
[edit] We should keep the current Android wording to move this PR forward and fix the missing Android app doc later.
If that's true, we should remove it from the config flow instructions. I think TeslaFi can also no longer be used for this; at least I didn't see a way to expose the refresh token generated there.
Regarding the refresh token generator: I created #879 after digging around a bit.
Hi, is it possible to configure the Tesla HTTP proxy with only Nabu Casa? Thanks.
@llamafilm I wouldn't mind helping but I can't for the life of me get the tests to run in this project. Every single test fails with this error:
FAILED tests/test_config_flow.py::test_option_flow_input_floor - pytest_socket.SocketBlockedError: A test tried to use socket.socket.
And something about a SocketBlockedError, which is caused by pytest_socket or some other plugin:
>       raise SocketBlockedError()
E       pytest_socket.SocketBlockedError: A test tried to use socket.socket.
venv\Lib\site-packages\pytest_socket.py:80: SocketBlockedError
The problem is that no matter what flags or configs I add or override, I can't get rid of it. I have already wasted many hours on this, so unless you have an idea of how to fix it, I can't help.
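For anyone else hitting this: the error comes from pytest-socket, which the Home Assistant test plugin uses to block real network access during tests. The idiomatic fix is to mock whatever is opening a socket, but as a quick check you can allow sockets for an individual test. A minimal sketch, assuming pytest-socket's documented marker and fixture names (whether this interacts cleanly with how pytest-homeassistant-custom-component disables sockets is something to verify):

```python
# Sketch only: two ways pytest-socket can be told to allow real sockets
# for a single test. The better long-term fix is to mock the network call
# so the test suite stays fully offline.
import socket

import pytest


@pytest.mark.enable_socket  # pytest-socket marker
def test_with_marker():
    socket.socket().close()  # would normally raise SocketBlockedError


def test_with_fixture(socket_enabled):  # pytest-socket fixture
    socket.socket().close()
```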
I tried working through the tests, but now I'm stuck on the new two-step config flow. I added one more method so we can test both conditions, depending on whether or not the user chooses to use a proxy.
I started working on the next method, `test_form_invalid_auth`, but I can't figure out how to pass `user_input['api_proxy_enable']` to the second step of the config flow. If someone can figure that out, then it should be straightforward to apply the same logic to the remaining methods.
The first step of the config flow passes a boolean `user_input` to the second step. So `user_input['api_proxy_enable']` should always exist, even if `user_input['username']` does not. I can't figure out how to account for this in the test.
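In case it helps: when driving a multi-step flow from a test, each step is its own `async_configure` call, so `api_proxy_enable` only needs to appear in the first call's `user_input`. Below is a rough sketch of that pattern; the `DOMAIN` import path, the credential field names, and the error key are assumptions, and the real test would still need to patch the login to fail, as the existing tests presumably do.

```python
# Sketch of exercising the two-step flow in a test; not the actual test code.
from homeassistant import config_entries

from custom_components.tesla_custom.const import DOMAIN  # assumed import path


async def test_form_invalid_auth(hass):
    # Step 1: only answer the proxy question.
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], {"api_proxy_enable": False}
    )

    # Step 2: supply the credentials. api_proxy_enable was consumed by
    # step 1, so it doesn't need to appear in this user_input at all.
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], {"username": "user@example.com", "token": "bad-token"}
    )
    assert result["errors"] == {"base": "invalid_auth"}
```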