dtsgenerator
error: The $ref targets root is not found:
dtsgen throws error for http://adaptivecards.io/schemas/adaptive-card.json
The $ref targets root is not found: #/definitions/AdaptiveCard
The command `npx dtsgenerator --url https://adaptivecards.io/schemas/adaptive-card.json` returns the correct result.
What did you do?
@horiuchi thanks for the reply. Yes, the above command works (I didn't know about that syntax). What I did was:
- create a new npm project
- copy the JSON schema into the folder
- run `npm install --save dtsgenerator`
- run `node_modules/.bin/dtsgen schema.json`
- I get the error
@horiuchi If you think this issue is invalid then please close it. Thanks again for your wonderful tool.
@horiuchi you should close this issue, it was invalid ;)
Actually this behavior is not invalid.
This is due to the slight difference between getting the schema using `--url` and getting the schema from a local file.
It's essentially caused by the schema file itself, but I want it to work in either case.
@horiuchi Just ran into this trying to process a local OpenAPI v3 YAML. Would love to use the tool.
I get this error when I have a response component with empty content (i.e. only headers).
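For reference, a hypothetical sketch of the shape in question (status code and header name are made up): a response that declares headers but no content block.

```yaml
# Hypothetical OpenAPI 3 response: headers only, no content section.
components:
  responses:
    Deleted:
      description: Resource deleted
      headers:
        X-Request-Id:          # made-up header name
          schema:
            type: string
```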
Just ran into this issue too! I was making changes to an existing API; dtsgen was running fine before.
I'm investigating... in my case, it seems linked to the content-type value of the response definition.
My new response is an `application/octet-stream` one. If I change it to `text/XXX` (literally), `text/octet-stream` or `application/json`, dtsgen doesn't complain anymore.
Some news: it was on my side! The response mapped onto a schema that contained a `oneOf`, branching between a text format and a binary format.
This "branching" is better done at the response definition level, as described here: https://swagger.io/docs/specification/describing-responses/#media-types Maybe the dtsgen output could be a bit more specific about the possible cause? (I don't know if that's feasible, and I'm already glad to have this tool, thanks!)
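For the record, a rough sketch of that response-level branching (the `Report` schema name is made up), along the lines of the linked guide:

```yaml
# Branch on media type in the response definition itself, rather than
# pointing one media type at a schema whose oneOf mixes text and binary.
responses:
  '200':
    description: Either a structured document or a raw binary download
    content:
      application/json:
        schema:
          $ref: '#/components/schemas/Report'   # made-up schema name
      application/octet-stream:
        schema:
          type: string
          format: binary
```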
Hi, I have a similar issue, but it looks related to a remote reference: the $ref points to a document located in another folder. My YAML $ref looks like this: `$ref: '../common/responses.yaml#/responses/resp500'`, in line with https://swagger.io/docs/specification/using-ref/
Any ideas? Using a URL instead of a path seems to work.
Thanks for any suggestion... I really like this generator, many thanks.
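For context, a hypothetical sketch of the layout (everything except `../common/responses.yaml#/responses/resp500` is made up):

```yaml
# api/openapi.yaml (hypothetical entry point, sibling folder of common/)
paths:
  /things:
    get:
      responses:
        '500':
          $ref: '../common/responses.yaml#/responses/resp500'
---
# common/responses.yaml (the referenced document)
responses:
  resp500:
    description: Internal server error
```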
Hello, I'm facing the same error when trying to parse the JSON Schemas of the serverless workflow specification.
I downloaded them locally and tried to process workflow.json (the "entry point") but ended up with these errors:
> dtsgen --out types.d.ts schema/**/*.json
Error: The $ref targets root is not found: #/definitions/eventdef
The $ref targets root is not found: #/definitions/function
The $ref targets root is not found: #/definitions/retrydef
(related to #414 #321 ?)
I also tried to process the schema remotely and I think there is a mixup between $id and $ref:
> dtsgen --url https://raw.githubusercontent.com/serverlessworkflow/specification/main/schema/workflow.json --out types.d.ts
Error: Fail to fetch the $ref target: https://serverlessworkflow.org/core/common.json#/definitions/metadata, Error: Error on fetch from url(https://serverlessworkflow.org/core/common.json#): 404, <!DOCTYPE html> ...
It looks like it uses a relative URL from the $id of the entry schema (? not even sure, because /0.6/ is lost) instead of a relative URL from the one provided by --url. But this is probably a different issue from the one discussed here.
@JBBianchi I looked into this matter: https://github.com/serverlessworkflow/specification/tree/main/schema
As a result, the file linked from `workflow.json` was not read correctly, because its format is not what is expected for JSON Schema. For example, in the root of `events.json` there is an unexpected property, `events`. `workflow.json` tries to reference this path, but the parser of `events.json` ignores it, resulting in the error.
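To make the failure mode concrete, here is a heavily simplified, hypothetical sketch (shown as YAML for brevity; the real files are JSON and structured differently):

```yaml
# events.json (hypothetical sketch): besides the usual schema keywords, the
# root carries an extra property "events", which the parser skips.
$id: 'https://serverlessworkflow.org/core/events.json'
events:
  type: array
  items:
    $ref: '#/definitions/eventdef'
definitions:
  eventdef:
    type: object
---
# workflow.json (hypothetical sketch): a $ref pointing into that skipped root
# property cannot be resolved, hence "The $ref targets root is not found".
$id: 'https://serverlessworkflow.org/core/workflow.json'
properties:
  events:
    $ref: 'events.json#/events'
```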
The reason for the `Fail to fetch the $ref target` error is that if you reference another file with `$ref` and it is a relative reference, it is treated as relative to the original file's `$id` URL.
Since the `$id` of `workflow.json` is `https://serverlessworkflow.org/core/workflow.json`, when resolving the reference to `common.json#/definitions/metadata` the base URL is `https://serverlessworkflow.org/core/`, and the result is `https://serverlessworkflow.org/core/common.json#/definitions/metadata`.
However, no file exists at this URL, so the fetch fails.
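A compact illustration of that resolution (the `properties`/`metadata` placement is a made-up fragment; the URLs are the ones above):

```yaml
# Fragment in the spirit of workflow.json:
$id: 'https://serverlessworkflow.org/core/workflow.json'
properties:
  metadata:
    $ref: 'common.json#/definitions/metadata'

# Relative-reference resolution against the $id:
#   base URL : https://serverlessworkflow.org/core/
#   relative : common.json#/definitions/metadata
#   resolved : https://serverlessworkflow.org/core/common.json#/definitions/metadata
# dtsgen then tries to fetch the resolved URL, which returns a 404.
```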
@horiuchi
First, thanks for your feedback. Here are my remarks:
> As a result, the file linked from `workflow.json` was not read correctly, because its format is not what is expected for JSON Schema. For example, in the root of `events.json` there is an unexpected property, `events`. `workflow.json` tries to reference this path, but the parser of `events.json` ignores it, resulting in the error.
It's not strictly compliant, but it's still valid JSON Schema: a schema can have any number of extra properties and still be valid, per se. As it stands, `events.json` cannot really be used as an entry point, but it can be referenced, as it is from `workflow.json`, by explicitly pointing to the property in the document.
> The reason for the `Fail to fetch the $ref target` error is that if you reference another file with `$ref` and it is a relative reference, it is treated as relative to the original file's `$id` URL. Since the `$id` of `workflow.json` is `https://serverlessworkflow.org/core/workflow.json`, when resolving the reference to `common.json#/definitions/metadata` the base URL is `https://serverlessworkflow.org/core/`, and the result is `https://serverlessworkflow.org/core/common.json#/definitions/metadata`. However, no file exists at this URL, so the fetch fails.
I did understand the cause, and it's indeed a valid way to see things. But the spec isn't very specific on that topic: the top-level `$id` MAY be used as a base URI for the document's `$ref`s, but that is implementation specific. Maybe it would be a good idea to try the relative URL against `$id` first and, if that fails, try it relative to the 'input URI'?
In the end, I found a way to achieve what I wanted: I used `json-schema-ref-parser` to merge all the schemas into one and generated the TS from there. It's probably not the most robust approach, but it works for that use case.
@JBBianchi Thank you for your useful comments!
> > As a result, the file linked from `workflow.json` was not read correctly, because its format is not what is expected for JSON Schema. For example, in the root of `events.json` there is an unexpected property, `events`. `workflow.json` tries to reference this path, but the parser of `events.json` ignores it, resulting in the error.
>
> It's not strictly compliant, but it's still valid JSON Schema: a schema can have any number of extra properties and still be valid, per se. As it stands, `events.json` cannot really be used as an entry point, but it can be referenced, as it is from `workflow.json`, by explicitly pointing to the property in the document.

Yes, I know. But in my implementation the extra properties are ignored, so if you try to reference one of them later, it is treated as if it doesn't exist.

> I did understand the cause, and it's indeed a valid way to see things. But the spec isn't very specific on that topic: the top-level `$id` MAY be used as a base URI for the document's `$ref`s, but that is implementation specific. Maybe it would be a good idea to try the relative URL against `$id` first and, if that fails, try it relative to the 'input URI'?

Yes, so if `$id` does not exist, the 'input URL' is treated as the base URI. As far as I'm concerned, I'm assuming that the URL in `$id` is the actual hosting URL.

> In the end, I found a way to achieve what I wanted: I used `json-schema-ref-parser` to merge all the schemas into one and generated the TS from there. It's probably not the most robust approach, but it works for that use case.

I thought I should consider using `json-schema-ref-parser` to replace the original implementation of `$ref` resolution, because `$ref` resolution is really complicated.