Creating a new link should not depend on a successful scrape
Disclaimers
- [x] Before filing this report, I have read the documentation fully and followed it.
- [x] I understand that if the "issue" is something that is already answered by the documentation and/or are caused by me not reading the documentation, the issue will be blocked and/or locked (with the implicit explanation being to "go read the docs"), and I may not be able to open another issue for this repository ever again.
Issue Description
Describe the bug
If the scraper times out while attempting to scrape a given link, an exception is thrown and the UI is left in a broken state: the new-link modal remains open and appears functional, but further interaction does nothing. The user must close the modal and start over (and will likely end up in the same state).
Deployment Method
Production:
- AWS ECS Fargate (behind an Application Load Balancer) + AWS RDS (PostgreSQL 16.3) + AWS ElastiCache (Redis 6.2.6)
Local testing:
- Docker container from deploy/docker-compose/docker-compose.yml
To Reproduce
- Create a new link using an un-scrapable URL, such as http://foo.bar
- Click the magnifying glass icon to invoke the background scrape process
- A toast notification will appear with the message "Failed to shorten link!"
- The UI will be left in the broken state described above
Expected behavior
I don't believe that creating a link should depend on a successful scrape, since the brand link or short link would still work without it. If scraping fails, the user could be notified and then allowed to either edit the URL (in case of a mistake) or simply save the new link anyway.
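Not knowing the project's internals, here is a minimal TypeScript sketch of the flow I have in mind; the names (`createLink`, `scrapeMetadata`, `NewLink`) are made up for illustration and do not refer to the project's real API:

```ts
// Hypothetical types and functions, for illustration only.
interface NewLink {
  originalUrl: string;
  title?: string;
  description?: string;
}

async function scrapeMetadata(url: string): Promise<Partial<NewLink>> {
  // Stand-in for the real scraper call; may reject on timeout.
  const res = await fetch(url, { signal: AbortSignal.timeout(5000) });
  const html = await res.text();
  const title = /<title>([^<]*)<\/title>/i.exec(html)?.[1];
  return title ? { title } : {};
}

async function createLink(originalUrl: string): Promise<NewLink> {
  let metadata: Partial<NewLink> = {};
  try {
    metadata = await scrapeMetadata(originalUrl);
  } catch (err) {
    // Scrape failure is non-fatal: warn the user, but keep the modal usable
    // so they can correct the URL or save the link as-is.
    console.warn(`Scrape failed for ${originalUrl}:`, err);
  }
  // The short link is created regardless of whether scraping succeeded.
  return { originalUrl, ...metadata };
}
```

The key point is that the scrape result is treated as optional enrichment rather than a prerequisite for saving the link.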
Screenshots
N/A
Additional context
Nah, we're good 😎
This also makes it impossible to shorten any URL that sits behind a login, since the scrape of such a page will always fail.