
Importing silently fails without a UI error for 10k+ character secure notes

Open · meramsey opened this issue 2 years ago · 11 comments

Subject of the issue

Import errors are not surfaced in the UI the way upstream Bitwarden surfaces them, which is confusing for users who are not tech-savvy.

Your environment (Generated via diagnostics page)

  • Vaultwarden version: v1.27.0
  • Web-vault version: v2022.12.0
  • Running within Docker: true (Base: Debian)
  • Environment settings overridden: true
  • Uses a reverse proxy: false
  • Internet access: true
  • Internet access via a proxy: false
  • DNS Check: true
  • Time Check: true
  • Domain Configuration Check: false
  • HTTPS Check: true
  • Database type: SQLite
  • Database version: 3.39.2
  • Clients used:
  • Reverse proxy and version:
  • Other relevant information:

Config (Generated via diagnostics page)

Environment settings which are overridden: SIGNUPS_ALLOWED, INVITATIONS_ALLOWED, ADMIN_TOKEN

{
  "_duo_akey": null,
  "_enable_duo": false,
  "_enable_email_2fa": false,
  "_enable_smtp": true,
  "_enable_yubico": true,
  "_icon_service_csp": "",
  "_icon_service_url": "",
  "_ip_header_enabled": true,
  "_smtp_img_src": "cid:",
  "admin_ratelimit_max_burst": 3,
  "admin_ratelimit_seconds": 300,
  "admin_token": "***",
  "allowed_iframe_ancestors": "",
  "attachments_folder": "data/attachments",
  "authenticator_disable_time_drift": false,
  "data_folder": "data",
  "database_conn_init": "",
  "database_max_conns": 10,
  "database_timeout": 30,
  "database_url": "***************",
  "db_connection_retries": 15,
  "disable_2fa_remember": false,
  "disable_admin_token": false,
  "disable_icon_download": false,
  "domain": "*****://*******************",
  "domain_origin": "*****://*******************",
  "domain_path": "",
  "domain_set": true,
  "duo_host": null,
  "duo_ikey": null,
  "duo_skey": null,
  "email_attempts_limit": 3,
  "email_expiration_time": 600,
  "email_token_size": 6,
  "emergency_access_allowed": true,
  "emergency_notification_reminder_schedule": "0 3 * * * *",
  "emergency_request_timeout_schedule": "0 7 * * * *",
  "enable_db_wal": true,
  "event_cleanup_schedule": "0 10 0 * * *",
  "events_days_retain": null,
  "extended_logging": true,
  "helo_name": null,
  "hibp_api_key": null,
  "icon_blacklist_non_global_ips": true,
  "icon_blacklist_regex": null,
  "icon_cache_folder": "data/icon_cache",
  "icon_cache_negttl": 259200,
  "icon_cache_ttl": 2592000,
  "icon_download_timeout": 10,
  "icon_redirect_code": 302,
  "icon_service": "internal",
  "incomplete_2fa_schedule": "30 * * * * *",
  "incomplete_2fa_time_limit": 3,
  "invitation_expiration_hours": 120,
  "invitation_org_name": "Vaultwarden",
  "invitations_allowed": true,
  "ip_header": "X-Real-IP",
  "job_poll_interval_ms": 30000,
  "log_file": null,
  "log_level": "Info",
  "log_timestamp_format": "%Y-%m-%d %H:%M:%S.%3f",
  "login_ratelimit_max_burst": 10,
  "login_ratelimit_seconds": 60,
  "org_attachment_limit": null,
  "org_creation_users": "",
  "org_events_enabled": false,
  "org_groups_enabled": false,
  "password_hints_allowed": true,
  "password_iterations": 100000,
  "reload_templates": false,
  "require_device_email": false,
  "rsa_key_filename": "data/rsa_key",
  "send_purge_schedule": "0 5 * * * *",
  "sends_allowed": true,
  "sends_folder": "data/sends",
  "show_password_hint": false,
  "signups_allowed": true,
  "signups_domains_whitelist": "*************",
  "signups_verify": false,
  "signups_verify_resend_limit": 6,
  "signups_verify_resend_time": 3600,
  "smtp_accept_invalid_certs": false,
  "smtp_accept_invalid_hostnames": false,
  "smtp_auth_mechanism": null,
  "smtp_debug": false,
  "smtp_embed_images": true,
  "smtp_explicit_tls": null,
  "smtp_from": "********************",
  "smtp_from_name": "Vaultwarden",
  "smtp_host": "********************",
  "smtp_password": "***",
  "smtp_port": 587,
  "smtp_security": "starttls",
  "smtp_ssl": null,
  "smtp_timeout": 15,
  "smtp_username": "********************",
  "templates_folder": "data/templates",
  "tmp_folder": "data/tmp",
  "trash_auto_delete_days": null,
  "trash_purge_schedule": "0 5 0 * * *",
  "use_syslog": false,
  "user_attachment_limit": null,
  "web_vault_enabled": true,
  "web_vault_folder": "web-vault/",
  "websocket_address": "0.0.0.0",
  "websocket_enabled": true,
  "websocket_port": 3012,
  "yubico_client_id": "82595",
  "yubico_secret_key": "***",
  "yubico_server": null
}
  • Install method: docker

Steps to reproduce

Export a 1Password account containing a 10,000+ character secure note as a 1pux file, or use the attached example. In the web vault, choose Import data, select "1Password (1pux)", browse to the file, and click Import. The import starts loading, then fails with only a console error.

Example recording from the attached file: https://user-images.githubusercontent.com/1596188/209586637-c2158710-9e9d-487a-8808-f25370fbc611.mp4

A test 1pux file is attached; you will just need to rename it to remove the .zip extension, since GitHub does not allow the 1pux extension: 1PasswordExport-4WNLQ6BLP5HXFOIOEC62DSEZPQ-20221226-170653.1pux.zip

Expected behaviour

The import should report errors the way upstream Bitwarden does: one error per bad note, including the note's name, so you know what to fix before exporting again.

Example (screenshot from 2022-12-26 17-07-29):

Import error
There was a problem with the data you tried to import. Please resolve the errors listed below in your source file and try again.
[1] [SecureNote] "Lorem ipsum 100 paragraphs large note": The field Notes exceeds the maximum encrypted value length of 10000 characters.

The error relates to the upstream limit; it just isn't handled the same way in the UI here: https://www.reddit.com/r/Bitwarden/comments/sklxy5/why_limit_secure_notes_to_10000_characters/ https://community.bitwarden.com/t/support-longer-notes-breaks-lastpass-import/2970

Actual behaviour

The import looks like it is working, but it stops when it reaches the long note, and no error is shown in the main UI.


Raw response

XHR POST https://redacted/api/ciphers/import
[HTTP/2 400 Bad Request 538ms]
{"ErrorModel":{"Message":"The field Notes exceeds the maximum encrypted value length of 10000 characters.","Object":"error"},"ExceptionMessage":null,"ExceptionStackTrace":null,"InnerExceptionMessage":null,"Message":"The field Notes exceeds the maximum encrypted value length of 10000 characters.","Object":"error","ValidationErrors":{"":["The field Notes exceeds the maximum encrypted value length of 10000 characters."]},"error":"","error_description":""}

meramsey · Dec 26 '22 22:12

There are similar previous discussions related to importing notes that exceed the maximum allowed value:

https://github.com/dani-garcia/vaultwarden/discussions/2176 https://github.com/dani-garcia/vaultwarden/issues/2937

Imposing the limit shouldn't be too difficult, but I wonder whether there is any elegant way to "convert" existing notes that are already over the upstream value, should Vaultwarden also honor the 10k encrypted limit, to minimize breaking things as much as possible.
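
One way to gauge the impact up front, since the check measures the length of the stored (encrypted) string: a query against Vaultwarden's SQLite database along the lines of SELECT uuid, LENGTH(notes) FROM ciphers WHERE LENGTH(notes) > 10000; should list the entries that would trip the limit. This is an untested sketch, with table and column names assumed from the current schema.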

cksapp · Dec 27 '22 16:12

As a new user who is also affected by this, I would suggest the following:

  1. The lowest-hanging fruit seems to be adding an error popover that alerts the user to the backend error, similar to the Bitwarden example.
  2. I was surprised that a partial import had been performed. To address this, I would suggest breaking the import into two stages (a sketch follows below):
     a. Validate the file first, so the user gets useful feedback about which specific entries block the import and why.
     b. Once validation passes, perform the import in a single transaction.

🤓
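
A rough sketch of that two-stage shape, using hypothetical types rather than Vaultwarden's actual import path:

// Stage 1: validate every entry and collect all problems, so the user sees
// the full list at once, like upstream Bitwarden's per-entry error dialog.
// Stage 2 (not shown): only if stage 1 passes, insert everything inside a
// single database transaction so a mid-import failure leaves no partial data.

struct ImportEntry {
    name: String,
    notes: Option<String>,
}

const MAX_NOTE_SIZE: usize = 10_000;

fn validate_all(entries: &[ImportEntry]) -> Result<(), Vec<String>> {
    let errors: Vec<String> = entries
        .iter()
        .enumerate()
        .filter(|(_, e)| e.notes.as_deref().map_or(false, |n| n.len() > MAX_NOTE_SIZE))
        .map(|(i, e)| {
            format!(
                "[{}] [SecureNote] \"{}\": The field Notes exceeds the maximum \
                 encrypted value length of {MAX_NOTE_SIZE} characters.",
                i + 1,
                e.name
            )
        })
        .collect();
    if errors.is_empty() {
        Ok(())
    } else {
        Err(errors)
    }
}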

dlehammer · Dec 27 '22 16:12

Just discovered another surprise: re-importing the same file generates duplicate folders and items, even though the import file contains data identical to the already existing items.

Any tips on bulk-cleaning duplicates?

dlehammer · Dec 27 '22 17:12

Just discovered another surprise: re-importing the same file generates duplicate items, even though the import file contains data identical to the already existing items.

Any tips on bulk-cleaning duplicates?

I just purged the vault data and reattempted. That is an upstream bug too, but I think validating all items before actually restoring anything would prevent it in the first place. I'm not sure how feasible that is; alternatively, the import could go into a temporary vault that is nuked if the import fails, and copied (or really imported) into the main vault only on success.

meramsey · Dec 27 '22 17:12

Why is there a 10,000-character limit anyway, and can we extend it? If I googled correctly, the limit for a MySQL TEXT column is 65,535 characters.

That means we could get 65,535 characters without changing the database schema.

LastPass allows far more characters, so right now it's impossible to import notes properly from LastPass.

htunlogic · Dec 28 '22 11:12

@htunlogic see #2937. We had no limit before, but if a larger limit causes the clients to crash or misbehave, it isn't worth having. Also, if we made the limit larger, people couldn't switch back to Bitwarden if they wanted to; that is the same situation you are in now, coming from a different password manager.

We want to stay as close to Bitwarden as possible and don't want clients to break, because that would make the whole thing useless.

BlackDex · Dec 28 '22 11:12

@BlackDex You are correct; I hadn't really been thinking about that... My only priority is to get the hell away from LastPass :)

htunlogic · Dec 28 '22 11:12

Might I suggest using attachments in this case? You can add attachments to these items. The only caveat is that they aren't synced; you need to download them.

BlackDex · Dec 28 '22 11:12
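
For anyone doing that conversion by hand, the official Bitwarden CLI (which also works against Vaultwarden) can upload a file as an attachment, for example bw create attachment --file ./longnote.txt --itemid <item-uuid>. The CLI encrypts the attachment client-side before uploading, which is exactly why the server cannot do this conversion for you.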

Well, the only things I'm really having an issue with are the GPG keys I have stored in secure notes. I'll shuffle through them; some are expired and old, so those aren't that important, and for the others I'll give attachments a try.

Thanks a lot! :)

htunlogic · Dec 28 '22 12:12

I wonder: if we created an attachment from the 10k+ character note, attached to a note with the same name as the original, would that be an acceptable compromise?

It doesn't technically deviate from upstream; it just makes it easier for people to get into the Bitwarden format without having to delete long notes, re-export, and try importing again.

meramsey · Dec 29 '22 12:12

I wonder: if we created an attachment from the 10k+ character note, attached to a note with the same name as the original, would that be an acceptable compromise?

It doesn't technically deviate from upstream; it just makes it easier for people to get into the Bitwarden format without having to delete long notes, re-export, and try importing again.

That isn't something we can do in the background; attachments are handled differently. Filenames, for example, are encrypted client-side, which isn't something we can (or want to) do on the server side.

So you can, of course, do this yourself.

BlackDex · Dec 29 '22 12:12