json-enc only sends strings in request
Hi there, I've recently started working with HTMX. I am currently writing the reactive frontend of a simple REST API, so I'm using client-side-templates, json-enc, and mustache for this.
I came to realize that HTMX only sends strings, no matter what input types are used in the form. This was causing some issues with Unmarshal in Go.
Here's an example of an HTML form.
<form id="form" hx-post="/movies" hx-ext="json-enc">
  <label for="imdb_id">IMDbId</label>
  <input id="imdb_id" type="text" name="imdb_id" placeholder="IMDbId" required>
  <label for="title">Title</label>
  <input id="title" type="text" name="title" placeholder="Title" required>
  <label for="rating">IMDbRating</label>
  <input id="rating" type="number" step="0.1" min="1" max="10" name="rating" placeholder="Rating" required>
  <label for="year">Release Year</label>
  <input id="year" type="number" name="year" min="1800" placeholder="Year" required>
  <label for="plot">Plot</label>
  <textarea id="plot" name="plot" placeholder="Plot" required></textarea>
  <button type="submit">Submit</button>
</form>
And here's the JSON that is submitted to the server:
{"imdb_id":"tt6781235","title":"A great movie title :)","rating":"8.0","year":"2000","plot":"There is no plot to this movie..."}
As you can see, even the numbers are sent as strings. Solving it ended up being quite easy, but only after I had spent some hours debugging the issue: https://stackoverflow.com/questions/21151765/cannot-unmarshal-string-into-go-value-of-type-int64
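For reference, the `,string` tag option in encoding/json is one way to make Unmarshal accept the quoted numbers (the Movie struct below is only a sketch matching the form above, assuming the usual encoding/json import, not my exact code):
type Movie struct {
	ImdbID string  `json:"imdb_id"`
	Title  string  `json:"title"`
	Rating float64 `json:"rating,string"` // ",string": the value arrives as a quoted number
	Year   int     `json:"year,string"`
	Plot   string  `json:"plot"`
}

func decodeMovie(body []byte) (Movie, error) {
	var m Movie
	err := json.Unmarshal(body, &m) // "8.0" and "2000" now land in the float64/int fields
	return m, err
}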
Would it be an idea to add an option to parse numbers on the client before they are sent to the server? Or at a minimum, add something about this to the documentation?
The extension is essentially just a JSON.stringify(). If we add a replacer function to it, we could handle it:
const params = {n: 123, f: 1.23, s: "123"};
console.log(JSON.stringify(params));
console.log(JSON.stringify(params, (k, v) => (isNaN(v) ? v : +v))); // numeric strings become numbers
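For the example above, the first call prints {"n":123,"f":1.23,"s":"123"}, while the replacer version prints {"n":123,"f":1.23,"s":123} - the numeric string becomes a real JSON number.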
Before creating a PR, I suggest:
- Add a method to convert "TrUe" or "fAlSe" to a bool.
- Create a single, simple, IE11-compatible converter function to contain both (a sketch follows below).
- Bonus points for nested object support.
- This can be a breaking change for some. Changing types is critical for strict applications.
- Create an option to parse strict or loose.
- ~~How about introducing an options parameter in hx-ext, like `hx-ext="json-enc:strict"` (no parameter would always be a compatible, non-breaking default if an existing extension changes)~~ - will not work because hx-ext already accepts commands like `ignore:ext`. I would not want to extend this behavior; it is dirty to piggyback a new behavior onto this attribute.
- A fancier way could be `hx-ext:json-enc="strict"` resp. `hx-ext:json-enc`, while keeping legacy support. `hx-ext:ignore:json-enc` is too far off the style of HTMX. `hx-ext:ignore="json-enc"` sounds sane but will again collide a command with a variable (ignore). `hx-disinherit="hx-ext:json-enc"` in addition to the legacy behavior sounds good, doesn't it?
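Something like this is what I have in mind for the loose variant (ES5-only for IE11; the name convertLoose is made up here, nothing that exists in json-enc yet):
// Hypothetical helper, not part of json-enc: coerce a single form value.
// Strict mode could instead look at the input's declared type; loose mode
// guesses from the value itself.
function convertLoose(value) {
  if (typeof value !== 'string') return value;
  var lower = value.toLowerCase();
  if (lower === 'true') return true;   // covers "TrUe", "FALSE", ...
  if (lower === 'false') return false;
  if (value.trim() !== '' && !isNaN(value)) return Number(value); // numeric strings
  return value;
}

// used as a JSON.stringify replacer:
// JSON.stringify(params, function (key, value) { return convertLoose(value); });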
Technically the implementation isn't wrong. Everything in the DOM is a string.
The source code of json-enc is very simple: https://unpkg.com/[email protected]/dist/ext/json-enc.js
It can easily be extended to add some more post-processing.
I am also interested in handling data types better there.
Here is one example, which assumes the source element is a `<form>`, accesses its `.elements[name].type`, and just makes a mapping of common types. Additionally it transforms `"foo.bar"="value"` into `{"foo": {"bar": value}}` and `"foo[bar]"="value"` into `{"foo": {"bar": value}}`.
function getTypeFromHTMLInput(form, fieldName) {
  const field = form.elements[fieldName];
  if (field) {
    return field.type;
  }
  return null;
}

// how to coerce a value, keyed by the input's type
const typeMappings = {
  text: (value) => value.toString(),
  number: (value) => Number(value),
  date: (value) => new Date(value),
  'datetime-local': (value) => new Date(value), // the DOM reports "datetime-local", not "datetime"
  email: (value) => value.toString(),
  password: (value) => value.toString(),
  tel: (value) => value.toString(),
  url: (value) => value.toString(),
  checkbox: (value) => value === 'true' || value === 'on',
  radio: (value) => value.toString(),
  range: (value) => Number(value),
};

// converts each value according to the type of the form field it came from
function convertValues(data, form) {
  const convertedData = {};
  for (const key in data) {
    const value = data[key];
    const type = getTypeFromHTMLInput(form, key);
    if (type && typeMappings[type]) {
      convertedData[key] = typeMappings[type](value);
    } else {
      convertedData[key] = value;
    }
  }
  return convertedData;
}

// expands "foo.bar" and "foo[bar]" keys into nested objects
function transformKeys(obj) {
  const transformed = {};
  for (const key in obj) {
    const parts = key.split('.');
    let current = transformed;
    for (let i = 0; i < parts.length; i++) {
      const part = parts[i];
      if (part.includes('[')) {
        const [prop, value] = part.split('[');
        const cleanValue = value.replace(']', '');
        current[prop] = current[prop] || {};
        current = current[prop];
        current[cleanValue] = obj[key];
      } else {
        if (!current[part]) {
          current[part] = {};
        }
        if (i === parts.length - 1) {
          current[part] = obj[key];
        }
        current = current[part];
      }
    }
  }
  return transformed;
}

htmx.defineExtension('json-enc', {
  onEvent: function (name, evt) {
    if (name === "htmx:configRequest") {
      evt.detail.headers['Content-Type'] = "application/json";
    }
  },
  encodeParameters: function (xhr, parameters, elt) {
    xhr.overrideMimeType('text/json');
    return (JSON.stringify(transformKeys(convertValues(parameters, elt))));
  }
});
@spaceone it's a great example of how it can be done. But sadly you can't always (hardly ever) assume that `elt` is a `FORM` tag, so in many cases it would die on `form.elements`. Because of the nature of HTMX you'd have to reuse its internal logic to refetch the inputs from `elt` and map over them. In cases where vars or values are used you can't even be sure the data comes from an input.
So as a PoC it's great. But the code needs to be more rigid to handle the complete set of HTMX cases.
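Just to illustrate the point (this is not htmx's internal logic, only a rough guard), the type lookup would at least need something like:
// rough illustration only: resolve the owning form before touching .elements
function findSourceForm(elt) {
  if (elt instanceof HTMLFormElement) return elt;
  if (elt.form) return elt.form;                    // inputs/selects expose .form
  return elt.closest ? elt.closest('form') : null;  // fall back to the nearest form
}

function getFieldType(elt, name) {
  var form = findSourceForm(elt);
  var field = form && form.elements[name];
  return field ? field.type : null;  // null e.g. when the value came from hx-vals
}
And even then, values coming from hx-vals/hx-vars would fall through as plain strings.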
My personal preference would be handling these cases in the backend. Going json-enc, you just set yourself up for double validation. You have to validate the input on the server no matter how you handle it.
So send a POST form and handle casting + error handling + state in the backend. This is the (HATEOAS) way.
So as a PoC it's great. But the code needs to be more rigid to handle the complete set of HTMX cases.
Yes, it's only meant to be a temporary shortcut for others who might have a similar problem until something exists in HTMX.
My personal preference would be handling these cases in the backend. Going json-enc, you just set yourself up for double validation. You have to validate the input on the server no matter how you handle it.
Well, I see this as a valid approach for Code on Demand. HTML5 defines some knowledge about different "types" with validation (number, email, etc.), and JSON is able to represent accurate types - so why use json-enc at all if you just transmit strings and have to add backend logic to re-transform them into the correct types? Then I can just as well go with application/x-www-form-urlencoded.
For me, I want to support a parallel JSON-data RPC API but have a nice hypermedia-based UI.
So send a POST form and handle casting + error handling + state in the backend. This is the (HATEOAS) way.
I don't understand why this is the HATEOAS way - how is application/x-www-form-urlencoded or multipart/form-data more HATEOAS than JSON? Both formats don't provide the capability to contain links or other hypermedia elements.
Code on Demand with JavaScript absolutely allows this in case client/browser functionality is limited.
I don't understand why this is the HATEOAS way - how is application/x-www-form-urlencoded or multipart/form-data more HATEOAS than JSON? Both formats don't provide the capability to contain links or other hypermedia elements. Code on Demand with JavaScript absolutely allows this in case client/browser functionality is limited.
It doesn't necessarily have to be more HATEOAS, and this is probably not the place to discuss it. But sending JSON back and forth, and especially receiving JSON, indicates rendering on the client, i.e. keeping state on the client, and that is not the idea behind HATEOAS.
@Auxority I solved this in two ways, though I think I'll go with form encoding. I basically took inspiration from json.Unmarshal and used reflection to read the json tags and the request form. This does have a couple of issues. Currently it doesn't handle dates or durations, but I don't use those yet, so I haven't looked into it. The second issue is that the endpoint can't take JSON, but if I'm going to use form data, I'll do it everywhere.
// parseFormData fills the struct pointed to by v from the request's form
// values, matching fields via their json tags and converting basic types.
func parseFormData(r *http.Request, v any) error {
	rv := reflect.ValueOf(v)
	if rv.Kind() != reflect.Pointer || rv.IsNil() {
		return fmt.Errorf("value is not a pointer %T", v)
	}
	err := r.ParseForm()
	if err != nil {
		return fmt.Errorf("could not parse form data: %w", err)
	}
	elem := rv.Elem()
	elemType := elem.Type()
	for i := 0; i < elemType.NumField(); i++ {
		field := elem.Field(i)
		fieldType := elemType.Field(i)
		if field.Kind() == reflect.Struct {
			err := parseFormData(r, field.Addr().Interface())
			if err != nil {
				return fmt.Errorf("could not parse field %s: %w", fieldType.Name, err)
			}
			continue
		}
		typeAttr := slog.String("type", elemType.String())
		nameAttr := slog.String("name", fieldType.Name)
		tag, ok := fieldType.Tag.Lookup("json")
		if !ok {
			logger.DebugContext(r.Context(), "field does not have json tag", typeAttr, nameAttr)
			continue
		}
		tagAttr := slog.String("tag", tag)
		if !r.Form.Has(tag) {
			logger.DebugContext(r.Context(), "form value not present", typeAttr, nameAttr, tagAttr)
			continue
		}
		val := r.Form.Get(tag)
		switch field.Kind() {
		case reflect.String:
			field.SetString(val)
		case reflect.Bool:
			x, err := strconv.ParseBool(val)
			if err != nil {
				return fmt.Errorf("could not parse bool tag %s for %s.%s: %w", tag, elemType.String(), fieldType.Name, err)
			}
			field.SetBool(x)
		case reflect.Int:
			x, err := strconv.ParseInt(val, 10, 64)
			if err != nil {
				return fmt.Errorf("could not parse int tag %s for %s.%s: %w", tag, elemType.String(), fieldType.Name, err)
			}
			field.SetInt(x)
		case reflect.Float32, reflect.Float64:
			x, err := strconv.ParseFloat(val, 64)
			if err != nil {
				return fmt.Errorf("could not parse float tag %s for %s.%s: %w", tag, elemType.String(), fieldType.Name, err)
			}
			field.SetFloat(x)
		default:
			logger.WarnContext(r.Context(), "could not handle field", typeAttr, nameAttr, "kind", field.Kind())
		}
	}
	return nil
}
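For illustration, a handler using it could look like this (the struct and handler names here are made up, not part of the snippet above):
type movieForm struct {
	Title  string  `json:"title"`
	Rating float64 `json:"rating"`
	Year   int     `json:"year"`
}

func createMovie(w http.ResponseWriter, r *http.Request) {
	var f movieForm
	if err := parseFormData(r, &f); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	// f.Rating and f.Year are real numbers here, even though the form sent strings
}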
The second way was to remove the json-enc package and add my own version of it. I didn't go with this route because if JavaScript is turned off, the form would still be submitted because it's in a boosted state.
// HTMX Extensions
htmx.defineExtension('json-enc', {
  onEvent: function (name, evt) {
    if (name === "htmx:configRequest") {
      evt.detail.headers['Content-Type'] = "application/json";
    }
  },
  encodeParameters: function (xhr, parameters, elt) {
    for (var name in parameters) {
      var field = elt.querySelector(`input[name="${name}"]`);
      if (!field) {
        continue; // skip fields that are not <input> elements (e.g. the textarea)
      }
      switch (field.type) {
        case "checkbox":
          parameters[name] = parameters[name].toLowerCase() === "true";
          break;
        case "number":
          parameters[name] = Number(parameters[name]);
          break;
      }
    }
    xhr.overrideMimeType('text/json');
    return (JSON.stringify(parameters));
  }
});
Hi, I've got the same issue with unmarshalling numbers in Go, see my comment in this other issue: https://github.com/bigskysoftware/htmx/issues/2401#issuecomment-2044081349
Would love a solution to this!