
GSoC POC for API Explorer with automation pipeline

Open praaatap opened this issue 10 months ago • 4 comments

PR Description

Add your description

Related Issues


  • Closes #619

Checklist

  • [x] I have gone through the contributing guide
  • [x] I have updated my branch and synced it with project main branch before making this PR
  • [x] I am using the latest Flutter stable branch (run flutter upgrade and verify)
  • [x] I have run the tests (flutter test) and all tests are passing

Added/updated tests?

We encourage you to add relevant test cases.

  • [] Yes
  • [x] No, and this is why: please replace this line with details on why tests have not been included

OS on which you have developed and tested the feature?

  • [x] Windows
  • [x] macOS
  • [x] Linux

praaatap avatar Apr 07 '25 17:04 praaatap

I have to refactor the code and will polish it, as there are some good optimization opportunities. I will do it within a week.

Thank you for checking my proof of concept.

praaatap avatar Apr 07 '25 17:04 praaatap

You can see the current draft where I have set up GitHub Actions and automation pipelines for the project. These pipelines help create new releases for the API based on the OpenAPI specifications.

For the HTML file parser, we can use the OpenAPI specs to parse the files. Then, for the HTML file type specifically, we can also make an LLM (Large Language Model) call to extract relevant data from the endpoints. We'll design a good prompt that asks the model to extract important information from the API response.
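To make the prompt-design idea concrete, here is a minimal sketch of a prompt builder for the HTML extraction step. `buildExtractionPrompt` is a hypothetical helper and the prompt wording is illustrative, not the final apidash implementation:

```dart
// Sketch of a prompt builder for the HTML-to-endpoints LLM call.
// `buildExtractionPrompt` is a hypothetical name; the exact prompt
// wording would be tuned against real API documentation pages.
String buildExtractionPrompt(String htmlSnippet) {
  return '''
You are an API documentation parser. From the HTML below, extract every
endpoint and return ONLY a JSON array. Each item must have the keys:
"name", "method", "path", "baseUrl", "parameters", "description".
If a field is missing in the HTML, use null. Do not add commentary.

HTML:
$htmlSnippet
''';
}

void main() {
  final prompt = buildExtractionPrompt('<h2>GET /users</h2>');
  // Sanity check on the prompt shape before sending it to any provider.
  print(prompt.contains('"method"'));
}
```

Asking for "ONLY a JSON array" with a fixed key set makes the model's response machine-parseable, so the extracted endpoints can be fed straight into the data models below.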

Since building our own hosted LLM API could lead to potential misuse by users, we will instead allow users to connect their own LLM API keys, and we'll store those keys securely on-device.

As usual, we'll add a dropdown menu in apidash where users can choose between different LLM providers like OpenAI, Gemini, Claude, and Llama.
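The provider dropdown could be backed by a simple enum. This is a sketch only: the enum name and the default endpoint URLs are illustrative assumptions, not the final apidash configuration:

```dart
// Hypothetical enum backing the LLM-provider dropdown; the endpoint URLs
// here are illustrative defaults, not vetted apidash settings.
enum LlmProvider {
  openai('OpenAI', 'https://api.openai.com/v1/chat/completions'),
  gemini('Gemini', 'https://generativelanguage.googleapis.com/v1beta'),
  claude('Claude', 'https://api.anthropic.com/v1/messages'),
  llama('Llama', 'http://localhost:11434/api/chat'); // e.g. a local runtime

  const LlmProvider(this.label, this.defaultEndpoint);
  final String label;
  final String defaultEndpoint;
}

void main() {
  // The labels below are what the dropdown would render.
  for (final p in LlmProvider.values) {
    print('${p.label} -> ${p.defaultEndpoint}');
  }
}
```

Keeping provider metadata in one enum means the dropdown, the request dispatcher, and the secure key store can all key off the same value.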

Additionally, we will maintain a separate GitHub repository for the URL parsing logic. Here is the link to that repo: GITHUB REPO LINK.

Working Video

Demo Video Thumbnail

Second working video

Demo Video Thumbnail

Architecture of the whole draft

Provider of API Explorer

image

Services layer (both apipackagezip_service.dart and apiproccessing.dart)

image

Api Explorer Data Model

image

Data loading architecture

image

GitHub Action bundle for the API file (zip)

image

Error view shown if an error occurs while extracting the zip file using the archive package

image

```dart
import 'package:apidash_core/consts.dart';
import 'package:freezed_annotation/freezed_annotation.dart';

part 'api_explorer_models.freezed.dart';
part 'api_explorer_models.g.dart';

@freezed
class ApiCollection with _$ApiCollection {
  const factory ApiCollection({
    required String id,
    required String name,
    String? description,
    required List<ApiEndpoint> endpoints,
    String? sourceUrl,
  }) = _ApiCollection;

  factory ApiCollection.fromJson(Map<String, dynamic> json) =>
      _$ApiCollectionFromJson(json);
}

@freezed
class ApiEndpoint with _$ApiEndpoint {
  const factory ApiEndpoint({
    required String id,
    required String name,
    String? description,
    required String path,
    required HTTPVerb method,
    required String baseUrl,
    List<ApiParameter>? parameters,
    ApiRequestBody? requestBody,
    Map<String, ApiHeader>? headers,
    Map<String, ApiResponse>? responses,
  }) = _ApiEndpoint;

  factory ApiEndpoint.fromJson(Map<String, dynamic> json) =>
      _$ApiEndpointFromJson(json);
}

@freezed
class ApiParameter with _$ApiParameter {
  const factory ApiParameter({
    required String name,
    required String inLocation,
    String? description,
    required bool required,
    ApiSchema? schema,
    String? example,
  }) = _ApiParameter;

  factory ApiParameter.fromJson(Map<String, dynamic> json) =>
      _$ApiParameterFromJson(json);
}

@freezed
class ApiRequestBody with _$ApiRequestBody {
  const factory ApiRequestBody({
    String? description,
    required Map<String, ApiContent> content,
  }) = _ApiRequestBody;

  factory ApiRequestBody.fromJson(Map<String, dynamic> json) =>
      _$ApiRequestBodyFromJson(json);
}

@freezed
class ApiHeader with _$ApiHeader {
  const factory ApiHeader({
    String? description,
    required bool required,
    ApiSchema? schema,
    String? example,
  }) = _ApiHeader;

  factory ApiHeader.fromJson(Map<String, dynamic> json) =>
      _$ApiHeaderFromJson(json);
}

@freezed
class ApiResponse with _$ApiResponse {
  const factory ApiResponse({
    String? description,
    Map<String, ApiContent>? content,
  }) = _ApiResponse;

  factory ApiResponse.fromJson(Map<String, dynamic> json) =>
      _$ApiResponseFromJson(json);
}

@freezed
class ApiContent with _$ApiContent {
  const factory ApiContent({
    required ApiSchema schema,
  }) = _ApiContent;

  factory ApiContent.fromJson(Map<String, dynamic> json) =>
      _$ApiContentFromJson(json);
}

@freezed
class ApiSchema with _$ApiSchema {
  const factory ApiSchema({
    String? type,
    String? format,
    String? description,
    String? example,
    ApiSchema? items,
    Map<String, ApiSchema>? properties,
  }) = _ApiSchema;

  factory ApiSchema.fromJson(Map<String, dynamic> json) =>
      _$ApiSchemaFromJson(json);
}
```

I have implemented the whole UI.

image

image

image

image

image

praaatap avatar Apr 26 '25 19:04 praaatap

@animator @ashitaprasad please check this draft PR for the application; most of the work has now been done.

praaatap avatar Apr 26 '25 19:04 praaatap

👍

animator avatar Apr 26 '25 23:04 animator

AI autotagging and pipelines diagram: how will this work?

In this architecture, a user first opens a PR adding new OpenAPI specs in YAML. Our parser converts these into JSON, and then a Dart LLM script runs in GitHub Actions and performs the autotagging; after that, it makes a release.

For the HTML parser, we can send LLM API calls containing the JSON, using the user's own API key, so the model parses the content according to its HTML markup structure. Our HTML parser, with an LLM call, will fetch and extract the relevant info we need and store it as JSON in Hive, the local DB.
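The autotagging step can be sketched in plain Dart. `autoTag` is a hypothetical helper (not the actual pipeline code) showing one simple policy the CI script could use: derive a tag from the first path segment when the spec declares no tags of its own:

```dart
// Simplified autotagging sketch: group endpoint paths by their first
// path segment. `autoTag` is a hypothetical name, not pipeline code.
Map<String, List<String>> autoTag(Map<String, dynamic> openApiPaths) {
  final tags = <String, List<String>>{};
  for (final path in openApiPaths.keys) {
    final segments = path.split('/').where((s) => s.isNotEmpty).toList();
    final tag = segments.isEmpty ? 'root' : segments.first;
    (tags[tag] ??= []).add(path);
  }
  return tags;
}

void main() {
  final tags = autoTag({
    '/users': {},
    '/users/{id}': {},
    '/orders': {},
  });
  print(tags); // {users: [/users, /users/{id}], orders: [/orders]}
}
```

A real pipeline would prefer the spec's own `tags` field and fall back to a heuristic like this only when it is absent.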

image

As I have already created the OpenAPI spec parser, this is the proposed architecture for the HTML parser.

AI HTML PARSER

image

praaatap avatar Apr 28 '25 18:04 praaatap

@animator @ashitaprasad will this approach be efficient for apidash HTML parsing?

praaatap avatar Apr 28 '25 18:04 praaatap

Hello @pratapsingh9, it seems that your current POC models are not aligned with the APIDash request models. From what I can observe, there's a significant mismatch in structure and flow.

You might want to check the references I previously discussed with @ashitaprasad ma'am in my Initial Idea Submission, as well as the initial POC I developed. That should give you a clearer understanding of how the models are expected to function and align more closely with the actual requirements.

BalaSubramaniam12007 avatar Apr 30 '25 12:04 BalaSubramaniam12007

Hey @BalaSubramaniam12007 👋,

I appreciate you taking time to review my POC. Let me explain why my model architecture is better suited for both current needs and future extensions like HTML/AI parsing:

Core Model Comparison

| Feature | @BalaSubramaniam12007's Template Model | My OpenAPI Model |
| --- | --- | --- |
| Structure | Flat request list | Nested collections with endpoints |
| Spec Compliance | Custom format | OpenAPI 3.x standard |
| Parameter Handling | Basic key-value pairs | Supports path/query/header/cookie locations |
| Validation | ❌ None | ✅ Full schema validation |
| Future Extensions | Difficult to extend | Ready for HTML/AI parsing |

Why My Model Works Better with HttpRequestModel

My ApiEndpoint converts perfectly to HttpRequestModel:

```dart
// Easy conversion from an OpenAPI endpoint to an executable request
HttpRequestModel toRequestModel(ApiEndpoint endpoint) {
  return HttpRequestModel(
    method: endpoint.method,
    url: endpoint.baseUrl + endpoint.path,
    headers: endpoint.headers?.entries
        .map((e) => NameValueModel(name: e.key, value: e.value.example ?? ''))
        .toList(),
    bodyContentType:
        (endpoint.requestBody?.content.keys.contains('application/json') ??
                false)
            ? ContentType.json
            : ContentType.text,
  );
}
```

Why My Model Is Superior

  1. Standards Compliance
    • Direct OpenAPI spec compatibility
    • Works with existing API ecosystems
    • Example:

      // OpenAPI-aligned parameter definition (name and required are
      // illustrative values, added to satisfy the model's required fields)
      ApiParameter(
        name: 'limit',
        inLocation: 'query',
        required: true,
        schema: ApiSchema(type: 'integer', format: 'int64'),
      )


@BalaSubramaniam12007's model uses a custom JSON schema, which can cause issues for users when contributing new APIs. That's why I proposed my solution.

praaatap avatar Apr 30 '25 14:04 praaatap


Hey mentors @ashitaprasad ma'am, @animator sir, and @DenserMeerkat bhaiya: this current data model is properly aligned with the OpenAPI specification. If needed, we can parse it using an HTML AI parser or even add the entire endpoint collection via a URL (like this OpenAPI example: https://api.apidash.dev/docs). Please let me know if any further updates are required in the model.

praaatap avatar May 01 '25 03:05 praaatap

@pratapsingh9 Since you have now worked with the OpenAPI spec, I recommend you resolve https://github.com/foss42/apidash/issues/121, which should be an easy task for you. We will come back to this PR post review of that PR. Looking forward to it.

animator avatar May 03 '25 04:05 animator