GSoC POC for API Explorer with automation pipeline
PR Description
Add your description
Related Issues
- Closes #619
Checklist
- [x] I have gone through the contributing guide
- [x] I have updated my branch and synced it with the project main branch before making this PR
- [x] I am using the latest Flutter stable branch (run `flutter upgrade` and verify)
- [x] I have run the tests (`flutter test`) and all tests are passing
Added/updated tests?
We encourage you to add relevant test cases.
- [ ] Yes
- [x] No, and this is why: please replace this line with details on why tests have not been included
OS on which you have developed and tested the feature?
- [x] Windows
- [x] macOS
- [x] Linux
I still have to refactor the code; there are some good optimization opportunities, and I will do that within a week.
Thank you for checking my proof of concept.
You can see the current draft where I have set up GitHub Actions and automation pipelines for the project. These pipelines help create new releases for the API based on the OpenAPI specifications.
For the HTML file parser, we can use the OpenAPI specs to parse the files. Then, for the HTML file type specifically, we can also make an LLM (Large Language Model) call to extract relevant data from the endpoints. We'll design a good prompt that asks the model to extract important information from the API response.
Since building our own API could lead to potential misuse by users, we will instead allow users to connect their own LLM APIs offline, and we'll securely store their API keys.
As usual, we'll add a dropdown menu in the api_dash where users can choose between different LLM providers like OpenAI, Gemini, Claude, and Llama.
Additionally, we will maintain a separate GitHub repository for the URL parsing logic. Here is the link to that repo: GITHUB REPO LINK.
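The prompt-design step described above could look like the following sketch. This is a hypothetical example of how the extraction prompt might be assembled before being sent to the user's chosen LLM provider; the `buildExtractionPrompt` helper and the prompt wording are assumptions, not a final implementation.

```dart
/// Hypothetical sketch: builds the extraction prompt sent to the
/// user's chosen LLM provider. The wording here is illustrative only.
String buildExtractionPrompt(String htmlSnippet) {
  const instructions =
      'Extract every API endpoint from the HTML below and return JSON '
      'with the keys: name, method, path, baseUrl, parameters.';
  return '$instructions\n\nHTML:\n$htmlSnippet';
}

void main() {
  final prompt = buildExtractionPrompt('<h2>GET /users</h2>');
  print(prompt);
}
```

Because the prompt asks for a fixed JSON shape, the response can be decoded directly into the data model regardless of which provider (OpenAI, Gemini, Claude, Llama) the user selected in the dropdown.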
Working video
Second working video
Architecture of the whole draft
Provider of API Explorer
Services layer (both apipackagezip_service.dart and apiproccessing.dart)
API Explorer data model
Data loading architecture
GitHub Action bundle for the API file (zip)
Error view shown when an error occurs while extracting the zip file using the archive package
```dart
import 'package:apidash_core/consts.dart';
import 'package:freezed_annotation/freezed_annotation.dart';

part 'api_explorer_models.freezed.dart';
part 'api_explorer_models.g.dart';

@freezed
class ApiCollection with _$ApiCollection {
  const factory ApiCollection({
    required String id,
    required String name,
    String? description,
    required List<ApiEndpoint> endpoints,
    String? sourceUrl,
  }) = _ApiCollection;

  factory ApiCollection.fromJson(Map<String, dynamic> json) =>
      _$ApiCollectionFromJson(json);
}

@freezed
class ApiEndpoint with _$ApiEndpoint {
  const factory ApiEndpoint({
    required String id,
    required String name,
    String? description,
    required String path,
    required HTTPVerb method,
    required String baseUrl,
    List<ApiParameter>? parameters,
    ApiRequestBody? requestBody,
    Map<String, ApiHeader>? headers,
    Map<String, ApiResponse>? responses,
  }) = _ApiEndpoint;

  factory ApiEndpoint.fromJson(Map<String, dynamic> json) =>
      _$ApiEndpointFromJson(json);
}

@freezed
class ApiParameter with _$ApiParameter {
  const factory ApiParameter({
    required String name,
    required String inLocation,
    String? description,
    required bool required,
    ApiSchema? schema,
    String? example,
  }) = _ApiParameter;

  factory ApiParameter.fromJson(Map<String, dynamic> json) =>
      _$ApiParameterFromJson(json);
}

@freezed
class ApiRequestBody with _$ApiRequestBody {
  const factory ApiRequestBody({
    String? description,
    required Map<String, ApiContent> content,
  }) = _ApiRequestBody;

  factory ApiRequestBody.fromJson(Map<String, dynamic> json) =>
      _$ApiRequestBodyFromJson(json);
}

@freezed
class ApiHeader with _$ApiHeader {
  const factory ApiHeader({
    String? description,
    required bool required,
    ApiSchema? schema,
    String? example,
  }) = _ApiHeader;

  factory ApiHeader.fromJson(Map<String, dynamic> json) =>
      _$ApiHeaderFromJson(json);
}

@freezed
class ApiResponse with _$ApiResponse {
  const factory ApiResponse({
    String? description,
    Map<String, ApiContent>? content,
  }) = _ApiResponse;

  factory ApiResponse.fromJson(Map<String, dynamic> json) =>
      _$ApiResponseFromJson(json);
}

@freezed
class ApiContent with _$ApiContent {
  const factory ApiContent({
    required ApiSchema schema,
  }) = _ApiContent;

  factory ApiContent.fromJson(Map<String, dynamic> json) =>
      _$ApiContentFromJson(json);
}

@freezed
class ApiSchema with _$ApiSchema {
  const factory ApiSchema({
    String? type,
    String? format,
    String? description,
    String? example,
    ApiSchema? items,
    Map<String, ApiSchema>? properties,
  }) = _ApiSchema;

  factory ApiSchema.fromJson(Map<String, dynamic> json) =>
      _$ApiSchemaFromJson(json);
}
```
I have implemented the whole UI.
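To show how these models line up with an actual OpenAPI document, here is a minimal hand-rolled sketch that walks the `paths` object and pulls out the same fields `ApiEndpoint` uses (method, path, name). Plain maps are used so the example runs without freezed code generation; the `extractEndpoints` helper itself is hypothetical and not part of the POC.

```dart
import 'dart:convert';

/// Hypothetical sketch: flattens an OpenAPI `paths` object into the
/// fields used by ApiEndpoint (method, path, name). Plain maps stand in
/// for the freezed classes so this runs without code generation.
List<Map<String, String>> extractEndpoints(String openApiJson) {
  final doc = jsonDecode(openApiJson) as Map<String, dynamic>;
  final paths = doc['paths'] as Map<String, dynamic>? ?? {};
  final endpoints = <Map<String, String>>[];
  paths.forEach((path, operations) {
    (operations as Map<String, dynamic>).forEach((method, op) {
      endpoints.add({
        'method': method.toUpperCase(),
        'path': path,
        // Fall back to the path when the spec has no summary.
        'name': (op as Map<String, dynamic>)['summary']?.toString() ?? path,
      });
    });
  });
  return endpoints;
}
```

Each extracted entry maps one-to-one onto an `ApiEndpoint`, which is why the model stays aligned with the OpenAPI 3.x structure.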
@animator @ashitaprasad please check this draft PR for the application; most of the work has been done.
👍
AI auto-tagging and pipelines diagram: how this will work
In this architecture, the user first sends a PR containing new OpenAPI specs in YAML. Our parser converts these into JSON, then a Dart LLM file runs in GitHub Actions and performs auto-tagging, after which a release is made.
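The auto-tagging step in this pipeline could be sketched as follows. This is a hypothetical example of one simple tagging rule (tag by first path segment) that the GitHub Action step might apply to each endpoint before cutting a release; the `deriveTag` helper is an assumption, not the actual pipeline code.

```dart
/// Hypothetical auto-tagging sketch: derives a tag from the first
/// concrete path segment of an endpoint, e.g. '/users/{id}' -> 'users'.
String deriveTag(String path) {
  final segments = path.split('/').where((s) => s.isNotEmpty).toList();
  if (segments.isEmpty) return 'general';
  final first = segments.first;
  // Skip templated segments like {id}; fall back to a default tag.
  return first.startsWith('{') ? 'general' : first;
}
```

In the real pipeline an LLM call could refine these tags, but a deterministic rule like this gives a cheap fallback when no API key is configured.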
For the HTML parser, we can send LLM API calls with the JSON using the user's API key, so the model parses the content according to the HTML markup structure. Our HTML parser, via an LLM call, will fetch and extract the relevant endpoint info, which we will store as JSON in Hive for the local DB.
Since I have already created the OpenAPI spec parser, this is the proposed architecture for the HTML parser.
AI HTML PARSER
@animator @ashitaprasad will this approach be efficient for apidash HTML parsing?
Hello @pratapsingh9 , It seems that your current POC models are not aligned with the APIDash request models. From what I can observe, there's a significant mismatch in structure and flow.
You might want to check the references I previously discussed with @ashitaprasad ma'am in my Initial Idea Submission, as well as the initial POC I developed. That should give you a clearer understanding of how the models are expected to function and align more closely with the actual requirements.
Hey @BalaSubramaniam12007 👋,
I appreciate you taking time to review my POC. Let me explain why my model architecture is better suited for both current needs and future extensions like HTML/AI parsing:
Core Model Comparison
| Feature | @BalaSubramaniam12007 Template Model | My OpenAPI Model |
|---|---|---|
| Structure | Flat request list | Nested collections with endpoints |
| Spec Compliance | Custom format | OpenAPI 3.x standard |
| Parameter Handling | Basic key-value pairs | Supports path/query/header/cookie locations |
| Validation | ❌ None | ✅ Full schema validation |
| Future Extensions | Difficult to extend | Ready for HTML/AI parsing |
Why My Model Works Better with HttpRequestModel
My ApiEndpoint converts perfectly to HttpRequestModel:
```dart
// Easy conversion from OpenAPI to executable request
HttpRequestModel toRequestModel(ApiEndpoint endpoint) {
  return HttpRequestModel(
    method: endpoint.method,
    url: endpoint.baseUrl + endpoint.path,
    headers: endpoint.headers?.entries
        .map((e) => NameValueModel(name: e.key, value: e.value.example ?? ''))
        .toList(),
    bodyContentType:
        endpoint.requestBody?.content.containsKey('application/json') == true
            ? ContentType.json
            : ContentType.text,
  );
}
```
Why My Model Is Superior

1. Standards Compliance
   - Direct OpenAPI spec compatibility
   - Works with existing API ecosystems
   - Example:

```dart
// OpenAPI-aligned parameter definition
ApiParameter(
  inLocation: 'query',
  schema: ApiSchema(type: 'integer', format: 'int64'),
)
```
@BalaSubramaniam12007's model uses a custom JSON schema, which can cause issues for users when contributing new APIs. That's why I proposed my solution.
Hey mentors @ashitaprasad ma'am, @animator sir, and @DenserMeerkat bhaiya, this current data model is properly aligned with the OpenAPI specification. If needed, we can parse it using an HTML AI parser or even add an entire endpoint collection via a URL (like this OpenAPI example: https://api.apidash.dev/docs). Please let me know if any further updates are required in the model.
@pratapsingh9 Since you have now worked with the OpenAPI spec, I recommend you resolve https://github.com/foss42/apidash/issues/121, which should be an easy task for you. We will come back to this PR post review of that PR. Looking forward to it.