Cookie Support Inconsistencies
I have created an app which requires accessing cookies from the browser. While this works in debug builds, it does not seem to work when building Tauri in release mode. I have also noticed that, unlike Linux, Windows does not send cookies along with requests in release builds, although debug builds on Windows do work.
Is there any reason why cookie support seems to be inconsistent?
This was all tested using the latest Tauri beta.
Can you please be more specific about how you are getting/storing the cookies? Is it the 'fetch' built into the webview, or is it our Rust http API?
Sure, sorry. I am not using the Rust API at all, just 'fetch' with 'credentials: include' as an option when making a request. When I need to access a cookie directly, I use 'document.cookie'. It works without issue in all debug environments, but the release environment is inconsistent between Linux and Windows. I have not fully tested on macOS, but the debug builds work there.
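For concreteness, the pattern described above looks roughly like this; the URL is a placeholder, not taken from the actual app:

```javascript
// Sketch of the approach described above; the URL is a placeholder.
// 'credentials: "include"' asks the webview to attach stored cookies to the
// request and to honor any Set-Cookie headers in the response.
async function fetchWithCookies(url) {
  return fetch(url, { credentials: 'include' });
}

// document.cookie exposes cookies as a single "k1=v1; k2=v2" string, so a
// small parser is handy when reading individual values:
function parseCookies(cookieString) {
  const out = {};
  for (const pair of cookieString.split('; ')) {
    if (!pair) continue;
    const idx = pair.indexOf('=');
    out[pair.slice(0, idx)] = pair.slice(idx + 1);
  }
  return out;
}
```

In the browser you would call `parseCookies(document.cookie)` to read individual values.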
After some testing and research macOS behaves the same as Linux. I was in fact able to get Windows to work as I found that Set-Cookie does not work without 'SameSite=None' and 'Secure' on Edge WebView2 as found in: https://github.com/MicrosoftEdge/WebView2Feedback/issues/1194
This presents some difficulties that I am unable to work around easily, as the 'Secure' cookie flag does not work on Linux and macOS under Tauri. It would be great if this was supported, but I understand there might be underlying issues with the platforms' webview support on Linux and macOS.
At this point in time I am probably going to have to patch my code at build time for Windows to use the 'SameSite=None' and 'Secure' cookie flags, as I have not figured out another way to handle this, and the Edge WebView2 User-Agent is the same as Edge's.
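A server-side patch like the one described could look like the sketch below. The helper and its name are illustrative, not an existing API; and since the WebView2 User-Agent is indistinguishable from Edge's, deciding when to apply it (e.g. via a build-time flag on the Windows bundle) is left to the app:

```javascript
// Hypothetical helper: append the flags WebView2 requires to a Set-Cookie
// header value, unless they are already present.
function patchSetCookie(header) {
  let out = header;
  if (!/;\s*SameSite=/i.test(out)) out += '; SameSite=None';
  if (!/;\s*Secure\b/i.test(out)) out += '; Secure';
  return out;
}
```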
Furthermore, accessing 'document.cookie' works in an IFrame so I was able to get the cookie information out by sending a message to the parent containing the cookie data. I believe this is because on release builds, Tauri isn't actually "http://localhost" but in fact "tauri://localhost" on Linux and macOS, and "https://custom.protocol.tauri_localhost" on Windows.
Again it would be great if the Release environment mirrored the Debug environment with cookies as it seems unnecessary to have to make requests in an IFrame just to store and read cookies. That and having the 'Secure' flag working on Linux and macOS would be a huge help as well.
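The IFrame relay described above can be sketched roughly as follows; the message shape ('cookie-sync') is an assumption for illustration, not part of any Tauri API:

```javascript
// Hypothetical message envelope used to ship cookies from the iframe to the
// parent window; the 'type' tag just lets the parent filter messages.
function buildCookieMessage(cookieString) {
  return { type: 'cookie-sync', cookies: cookieString };
}

// Inside the iframe (browser only):
//   window.parent.postMessage(buildCookieMessage(document.cookie), '*');
//
// In the parent window:
//   window.addEventListener('message', (event) => {
//     if (event.data && event.data.type === 'cookie-sync') {
//       console.log('cookies from iframe:', event.data.cookies);
//     }
//   });
```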
Thanks for your comment. That's a very detailed summary, and it is correct. I've been searching for possible ways to do it on Linux and macOS, but it seems to be a dead end: a custom URL scheme on WKWebView just can't handle cookies properly.
There are some workarounds, but I'm not sure if they are easier than setting up an IFrame. We could provide a method to set the cookie before loading pages/requests. The implementations could be:
- Add the cookie when handling loadRequest, like this:
WKWebView *webView = [WKWebView new];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"tauri://localhost"]];
[request addValue:@"key=Value" forHTTPHeaderField:@"Cookie"];
[webView loadRequest:request];
- Add an initial script to set the cookie like this:
WKUserContentController *userContentController = [WKUserContentController new];
WKUserScript *cookieScript = [[WKUserScript alloc] initWithSource:@"document.cookie = 'key=Value';" injectionTime:WKUserScriptInjectionTimeAtDocumentStart forMainFrameOnly:NO];
[userContentController addUserScript:cookieScript];
Which one do you think is better?
Initial script is the easier one right? Since the interface is already done, and we can also leverage the existing plugin interface to write one that reads the cookies when the window is closing, and rewrites it on initialization.
@lucasfernog Yes, it's easier. The code snippet above is just me messing with Objective-C; we already have a method for it. If that works as a solution, then it's good for me. If not, I'll explore the approach that handles loadRequest.
Sorry, I underestimated the restrictions of the custom URL scheme on macOS. It just can't set any cookie, no matter which way I try. I guess the only workarounds are either providing a server plugin as @lucasfernog suggested or the IFrame the OP mentioned.
I am a little unfamiliar with how it works with Rust (honestly, I am just starting to get into it with ESP32 and embedded development), but I am going to say maybe it is better to not use the custom URL scheme at all. We can try to lock it down as much as possible, of course, but not using it would also help prevent issues for first-time users, who may otherwise have to spend quite a bit of extra time figuring out how to work around them, as I did.
Not sure how hard removing the custom URL scheme would be but I would be happy to help where I can.
Well, throwing the baby out with the bathwater because some people find it challenging to be a parent isn't really the right solution. Don't take the metaphor the wrong way, but not all apps need cookies, and I think it is better to offer multiple approaches based on the type of security level the author needs. Not having to ship a webserver is a HUGE opportunity for security gains in my book.
Yeah, of course. Maybe removing wasn't the best term; how about having an option to switch it off? Keeping software as minimal as possible is always a priority in terms of security.
I do agree that we should provide multiple methods of security; however, there have been a few issues with the custom URL scheme now, if I am not mistaken. Also, some developers may want to integrate their existing applications with Tauri, applications which rely on cookies, and it would create a lot of work for them to have to work around it.
The software I work with does require cookies for authentication, and while I could perhaps adapt it to use local or session storage it would create a lot of unnecessary work to make it work in Tauri when it already works across all the current browsers and even Electron last time I checked. I would prefer and still am using Tauri as I prefer its more minimal approach.
You're right, brownfield applications should be able to work - but just won't enjoy the heightened security of greenfield approaches built entirely within the scope of the abilities that tauri offers. In fact, we are working on solving for this right now in terms of API security. You can have great security but at the cost of needing to do some extra work. Everyone else will just need to be really sure that XSS can't happen in their systems. (Which is a very heavy lift, to be sure.)
With paradigm changes, people need to reevaluate approaches and the only way to embrace a new architecture or way of doing things is from the ground up. Bolting a house onto a horse is not the same as building a car. Sheesh, me and the similes this morning. LOL
For sure. Personally, and within my own business, I always strive to embrace new ways of doing things; it drives innovation. However, the same can't be said for some of my contract jobs, unfortunately, where I have to deal with older software.
At the end of the day, as you said, as long as we have something to allow brownfield applications to work, even if it does create a bit of extra work, that is better than having no option at all.
As for whether the IFrame method I figured out or the server plugin method mentioned is best to solve this issue, I am unsure; I would have to try the latter out. How would the server plugin method work exactly?
The plugin doesn't have any doc yet but this is the example: https://github.com/tauri-apps/tauri-plugin-localhost/blob/dev/examples/vanilla/src-tauri/src/main.rs
Seems like it still has an issue on Linux though.
Thank you for the amazing work. I have one consideration: Safari/WebKit seems to not save cookies on http://localhost if they have a 'Secure' flag, which is laughable, since it's the only browser behaving like this. If I create an SSL certificate on my dev machine, then it works in Safari and 'tauri dev'. Not sure how we can solve this for production.
Hello, I want to get cookies when closing a multiwindow. I can listen for the 'close-requested' event, but how do I obtain the multiwindow's cookies?
Furthermore, accessing 'document.cookie' works in an IFrame so I was able to get the cookie information out by sending a message to the parent containing the cookie data. I believe this is because on release builds, Tauri isn't actually "http://localhost" but in fact "tauri://localhost" on Linux and macOS, and "https://custom.protocol.tauri_localhost" on Windows.
@Verequies, I tried the above approach on macOS (Tauri 1.2.1), but, again, it only works in debug (when running a development server). When I compile the app (in both release mode and debug mode, compiled with 'tauri build --debug' so I can access the devTools) and run it, 'document.cookie' is always undefined, even inside an iframe.
Tauri aside, this is a specific browser issue. As of now, cookies will not be stored on localhost in Safari and Chromium-based browsers. Not sure how Edge works, but the only browser that will let this slip through is Firefox.
If you set up local SSL and trust the certificate via Keychain on macOS, it should work. There are also ways of bypassing it with cookie settings on the server side, but the truth is we are fighting a losing battle here, as it's only going to get worse in the future.
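For the dev-server case, a locally trusted certificate (generated with a tool like mkcert) can be wired into the frontend dev server. This is a config-fragment sketch that assumes Vite and mkcert's default output file names:

```javascript
// vite.config.js (sketch) -- assumes `mkcert -install && mkcert localhost`
// was run in the project root, producing the two .pem files below.
import fs from 'node:fs';
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    https: {
      key: fs.readFileSync('./localhost-key.pem'),
      cert: fs.readFileSync('./localhost.pem'),
    },
  },
});
```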
I would suggest finding alternative approaches such as JWT or OAuth2. In my case that's difficult, because the IdP my client uses only supports cookies, and my live prod app is based on cookies, so this is just going to cost a lot of dev time, unfortunately.