ramda.github.io
Replace goo.gl url shortener
Google will discontinue goo.gl (https://goo.gl/):
> Starting March 30, 2018, we will be turning down support for goo.gl URL shortener. From April 13, 2018 only existing users will be able to create short links on the goo.gl console. You will be able to view your analytics data and download your short link information in csv format for up to one year, until March 30, 2019, when we will discontinue goo.gl. Previously created links will continue to redirect to their intended destination. Please see this blog post for more details.
It's currently in use.
https://github.com/ramda/ramda.github.io/blob/88b9e38495377430c13070233518a21f136ed155/repl/lib/js/googl.js
We should replace it with another service :)
@andys8: Yes, we should. Are you interested in creating a PR for this?
I'm looking into which service to use to replace it. I did a little research, but finding an alternative is harder than I initially thought.
Criteria
- Will hopefully not be deprecated
- Will not show advertising to users
- Has an API
- No CORS issues
- API should be robust and not block or rate limit requests
- (?) No key or account necessary
Links
https://medium.com/@guillotegg/google-url-api-replacement-options-da18194b691c
https://is.gd/apishorteningreference.php looks interesting. CORS headers don't seem to be set.
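For reference, the call shape is about as simple as it gets; this is a sketch based on the documented `create.php` endpoint, and the CORS caveat above is exactly what would need verifying before using it from the REPL page:

```javascript
// is.gd's documented create endpoint: a GET that returns the short URL
// as plain text when format=simple is requested. Whether the response
// carries CORS headers usable from a browser page is the open question.
const isGdRequest = (longUrl) =>
  `https://is.gd/create.php?format=simple&url=${encodeURIComponent(longUrl)}`;

console.log(isGdRequest('https://ramdajs.com/repl/?v=0.26.1'));
// → https://is.gd/create.php?format=simple&url=https%3A%2F%2Framdajs.com%2Frepl%2F%3Fv%3D0.26.1
```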
I really doubt that any non-rate-limited services would not require a key.
Is this something we should build ourselves? It would be easy enough to point, say, https://links.ramdajs.com/
to a Node service we could host somewhere with a simple db. If we whitelisted only ramdajs.com
targets, we might create something not likely to be abused. And if there were problems, we could rate-limit by IP. This is not something that looks like fun to me, but it also doesn't look difficult.
https://is.gd shares one problem with https://goo.gl: both limit the size of the URLs. I've never investigated where the limit is, but I have run into it on occasion.
As an alternative to using a URL shortener or database, you could compress the source using LZString and just store it in the URL hash. Flems, a general-purpose code playground, works by literally storing the entire editor state in the URL's fragment part as a compressed JSON blob, and I've only run into issues with that a couple of times:
- A person using IE ran into their fairly low limit of ~2K characters in the URL and couldn't load a snippet I posted. They were able to switch to a more modern browser (Firefox IIRC), so the issue rectified itself. (The relevant RFC actively recommends a minimum of 8000 octets to be accepted, and most modern browsers accept far longer URLs. IE is the exception, and it's not especially popular among technical people.)
- I've literally once hit Gitter's limit of 4096 characters per message (well above that of most URL shorteners). What it took to hit that was literally a miniature demo app complete with mock data, and even that could still be directly opened in Edge (with a low address bar limit) by clicking the long URL hash. I would find it far more difficult to achieve with a simple library REPL playground.
As a couple more points of comparison:
- Babel's REPL stores its settings in the hash uncompressed, but the code itself is compressed similarly using LZString.
- TypeScript's playground just naïvely URL-escapes the source and stores that in the URL.
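The naïve variant is a one-liner either way; the query parameter name below is made up, only the escaping technique is the point:

```javascript
// Uncompressed approach: URL-escape the raw source and round-trip it
// from the fragment. Simple, but URLs grow linearly with code size.
const toFragment = (source) => encodeURIComponent(source);
const fromFragment = (fragment) => decodeURIComponent(fragment);

const source = 'R.compose(R.sum, R.map(R.inc))([1, 2, 3])';
const link = `https://ramdajs.com/repl/#?code=${toFragment(source)}`;
console.log(fromFragment(toFragment(source)) === source);
// → true
```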
For most things that would press against typical URL length limits, it's probably better to just move to CodePen, Glitch, or similar, something that's a lot heavier and more fully-featured and made for actual projects rather than simple toy experiments. In retrospect, that mini demo app would've been easier to write and manage in CodePen, but I didn't write it initially - it was based on another's broken app that already existed in Flems.
Thanks for the suggestions.
We actually do put the content in the URL the same way the TypeScript playground does. The shortener is just to have a simpler link to pass around. We could skip it if necessary.
Flems looks very interesting. I hadn't seen that one before.
It has been a while.
Do we have a rough idea of what kind of request rates we need from the URL shortening service? That would help address the rate limit issue when considering any potential services.
https://medium.com/@guillotegg/google-url-api-replacement-options-da18194b691c
This article only listed 3 options. There are more available:
- https://zapier.com/blog/best-url-shorteners/
- https://blog.rebrandly.com/8-best-free-url-shortener-apis-for-creating-your-short-links/
Maybe it's worth putting in something that works for now, as the Google one is not working atm.
TinyURL actually looks simple to use. It does not offer branding, e.g., you get tinyurl.com/xyz-style URLs. But that's probably not a big deal here.
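A hedged sketch of what using it might look like, based on TinyURL's classic `api-create.php` endpoint, which returns the short link as plain text; error handling and CORS behavior would still need checking:

```javascript
// Sketch of a client for TinyURL's simple create endpoint.
const tinyUrlRequest = (longUrl) =>
  `https://tinyurl.com/api-create.php?url=${encodeURIComponent(longUrl)}`;

const shortenWithTinyUrl = async (longUrl) => {
  const res = await fetch(tinyUrlRequest(longUrl)); // global fetch, Node 18+
  if (!res.ok) throw new Error(`TinyURL responded ${res.status}`);
  return res.text(); // the short URL, as plain text
};
```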
I'd love to put in a replacement even if it's not perfect.
I don't have any sense of necessary rates, but I'm guessing it's more likely hundreds or thousands per day than tens of thousands or more.
Anyone interested in creating a PR with any one of these?