Does not generate correct values when using custom hash length and alphabet
If you remove the salt and use a custom alphabet with a minimum hash length, the resulting hash does not match other languages/libraries.
Expected SQL settings
DECLARE
@salt nvarchar(255) = N'',
@alphabet nvarchar(255) = N'ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890',
@seps nvarchar(255) = N'cfhistuCFHISTU', -- Match .NET
@guards nvarchar(255) = N'',
@minHashLength int = 7;
This breaks with a divide-by-zero error because @guards is blank, when it actually needs to be calculated. All of the setup code that adjusts the alphabet and calculates the guard characters is missing:
https://github.com/ullmark/hashids.net/blob/master/src/Hashids.net/Hashids.cs#L85 https://github.com/niieani/hashids.js/blob/master/src/hashids.ts#L80
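For reference, this is a rough TypeScript sketch of that setup step, simplified for an empty salt (where the consistent shuffle leaves the strings unchanged); the 3.5 separator ratio and 12 guard divisor correspond to the constants used in the linked implementations. The function and variable names are mine, not the library's.

// Sketch of the hashids constructor's alphabet/seps/guards setup,
// assuming an empty salt so the consistent shuffle can be skipped.
const SEP_DIV = 3.5;
const GUARD_DIV = 12;

function hashidsSetup(alphabet: string, seps = 'cfhistuCFHISTU') {
  // Keep only the separators that exist in the alphabet, then remove them from it.
  seps = [...seps].filter((c) => alphabet.includes(c)).join('');
  alphabet = [...alphabet].filter((c) => !seps.includes(c)).join('');

  // Enforce the alphabet/seps ratio by moving characters from the alphabet into seps.
  if (seps.length === 0 || alphabet.length / seps.length > SEP_DIV) {
    const sepsLength = Math.ceil(alphabet.length / SEP_DIV);
    if (sepsLength > seps.length) {
      const diff = sepsLength - seps.length;
      seps += alphabet.slice(0, diff);
      alphabet = alphabet.slice(diff);
    }
  }

  // Guards are taken from the front of the (already reduced) alphabet,
  // or from the seps if fewer than 3 alphabet characters remain.
  const guardCount = Math.ceil(alphabet.length / GUARD_DIV);
  let guards: string;
  if (alphabet.length < 3) {
    guards = seps.slice(0, guardCount);
    seps = seps.slice(guardCount);
  } else {
    guards = alphabet.slice(0, guardCount);
    alphabet = alphabet.slice(guardCount);
  }
  return { alphabet, seps, guards };
}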
In this scenario the guard characters should be calculated as DEG and the alphabet reduced to JKLMNOPQRVWXYZ1234567890 (the check after the settings below confirms this). So the following unintuitive settings actually produce the same values as other languages, because they bake in the expected alphabet and pre-calculated guards:
DECLARE
@salt nvarchar(255) = N'',
@alphabet nvarchar(255) = N'JKLMNOPQRVWXYZ1234567890',
@seps nvarchar(255) = N'CFHISTU',
@guards nvarchar(255) = N'DEG',
@minHashLength int = 7;
-- LG0WPG8
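Running the sketch above over the original 36-character alphabet reproduces those derived values:

const { alphabet, guards } = hashidsSetup('ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890');
console.log(guards);   // DEG
console.log(alphabet); // JKLMNOPQRVWXYZ1234567890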
Working examples in JavaScript and .NET, both of which produce LG0WPG8:
JavaScript
var hashids = new Hashids("", 7, "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890"); // empty salt, min length 7, custom alphabet
var id = hashids.encode(123);
var numbers = hashids.decode(id); // round-trips back to [ 123 ]
console.log(id);
// LG0WPG8
.NET
using System;
using HashidsNet;

public class Program
{
    public static void Main()
    {
        const long testId = 123;
        var hashids = new Hashids(minHashLength: 7, alphabet: "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890");
        var id = hashids.EncodeLong(testId);
        Console.WriteLine(id);
        // LG0WPG8
    }
}