# Event generator and mock server
## Description
We need to generate sample documents for Elasticsearch using tools that are CI/CD-compatible and compatible with our testing framework, Cypress. We also need to create a tool to generate events based on the templates built into the functional inventory.
- [x] Search for a mock server to use when developing the app. We need to mock the Wazuh API and Elasticsearch, though this should not be mandatory.
```mermaid
graph BT
    A(Inventory) --> B[Event Generator]
    B --> |Insert events using the API| C[Elastic / Splunk]
    C --> |Get indices using the API| D[UI]
    E[Wazuh API] --> |Mock Server| D
```
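As a rough illustration of the Event Generator step, the sketch below builds fake events and bulk-inserts them into Elasticsearch. The index name, the document shape, and the Elasticsearch URL are assumptions for illustration only; the real tool will derive documents from the inventory templates.

```typescript
// Minimal event generator sketch. Index name, document shape and the
// Elasticsearch URL are placeholders, not the final design.
const ES = "http://localhost:9200";
const INDEX = "wazuh-alerts-sample";

function randomEvent(): Record<string, unknown> {
  return {
    timestamp: new Date().toISOString(),
    agent: { id: "001", name: "agent-001" },
    rule: { level: Math.floor(Math.random() * 15), description: "Sample rule" },
  };
}

async function insertEvents(count: number): Promise<void> {
  // The _bulk API expects NDJSON: one action line followed by one document line.
  const ndjson =
    Array.from({ length: count }, randomEvent)
      .flatMap((doc) => [JSON.stringify({ index: { _index: INDEX } }), JSON.stringify(doc)])
      .join("\n") + "\n";

  const res = await fetch(`${ES}/_bulk`, {
    method: "POST",
    headers: { "Content-Type": "application/x-ndjson" },
    body: ndjson,
  });
  if (!res.ok) throw new Error(`Bulk insert failed: ${res.status}`);
}

insertEvents(100).then(() => console.log("Inserted 100 sample events"));
```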
Splunk provides an add-on named Splunk Event Generator Utility (EventGen):
> Provide sample data with your app or add-on using the Splunk Event Generator utility (Eventgen), along with an eventgen.conf file. Sample data helps demonstrate how your app or add-on functions during the code review. For more information, see the Eventgen app on Splunkbase. Source: Best practices for optimizing apps and add-ons for Splunk Cloud Platform
## EventGen
Some highlights:
- Can be executed inside of Splunk (relying on a common event generation framework) as well as outside of Splunk.
- Event output can easily be directed to a Splunk input (modular inputs, HEC, etc.), a text file, or any REST endpoint in an extensible way (see the sketch below).
- Easily configurable to make fake data look as real as possible, either by ordering events and token replacements by time of day, or by allowing generators to replay real data, replacing the original timestamps while keeping exactly the same time intervals as the original data.
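For context on the HEC output mentioned above, this is roughly the kind of request EventGen issues when its output is directed to Splunk's HTTP Event Collector; the collector URL and token below are placeholders:

```typescript
// Illustrative only: posting one event to Splunk's HTTP Event Collector (HEC).
// The collector URL and the token are placeholders.
async function sendToHec(event: Record<string, unknown>): Promise<void> {
  const res = await fetch("https://localhost:8088/services/collector/event", {
    method: "POST",
    headers: { Authorization: "Splunk <hec-token>" },
    body: JSON.stringify({ event, sourcetype: "_json" }),
  });
  if (!res.ok) throw new Error(`HEC returned ${res.status}`);
}
```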
### EventGen Training
Reading the documentation and trying out the examples.
## Mock Server
We are looking for a mocking server with the following characteristics:
- Easily integrable with Cypress.
- Easily configurable / programmable, allowing us to set up different environments for testing purposes.
- Free to use.
- Open source preferred.
- Ability to import an OpenAPI specification.
| | MockServer | MirageJS | Postman | Mocks Server | Prism | Cypress-mock-OpenAPI |
|---|---|---|---|---|---|---|
| Cypress integration | No | Yes | No | Yes | No | Yes |
| Multi mock | Yes | Yes | - | Yes | Yes | - |
| Open Source | Yes | Yes | Yes | Yes | Yes | Yes |
| Pricing | Free | Free | Limited | Free | Free | Free |
| OpenAPI support | Yes | No* | Yes | No | Yes | Yes |

\* This tool (miragejs-open-api) generates MirageJS code from an OpenAPI Specification. The stability of this tool is unknown, so this is an important factor to take into account during MirageJS' evaluation.
After an evaluation with the team, we chose MockServer as our first option to mock the Wazuh API services in our development environments, due to the large community that supports the project on GitHub, its active development, and the wide range of configuration options it provides.
As a result, a PoC was performed: we deployed MockServer in a Docker environment, tested it with some simple requests, and then imported the OpenAPI specification for the Wazuh API.
```yaml
# docker-compose.yml
version: "3.9"
services:
  mock-server:
    image: mockserver/mockserver:mockserver-5.12.0
    command: -logLevel DEBUG -serverPort 1080
    ports:
      - 55000:1080
```
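With the stack above running, the spec import can be scripted against MockServer's REST API, which creates one expectation per operation in the spec. A minimal sketch, assuming the mock is reachable on the mapped host port 55000; the spec URL is illustrative:

```typescript
// Imports an OpenAPI spec into MockServer through its REST API.
// The spec URL below is illustrative.
const MOCK = "http://localhost:55000";

async function importSpec(specUrl: string): Promise<void> {
  const res = await fetch(`${MOCK}/mockserver/openapi`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ specUrlOrPayload: specUrl }),
  });
  if (!res.ok) throw new Error(`Spec import failed: ${res.status}`);
}

importSpec("https://raw.githubusercontent.com/wazuh/wazuh/master/api/api/spec/spec.yaml")
  .then(() => console.log("Spec imported"));
```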
Although the services were imported correctly, there is a problem with the responses given by the MockServer: it fails to detect and load the examples in the spec file and provides autogenerated sample data instead. This drawback has minor relevance, as we plan to override the responses of the API and adjust them to our development needs, but it would be great to have this working in the future, as it would mean much less configuration on our side. For this reason, a Feature Request has been opened on the GitHub repository of the project.
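In the meantime, overriding a response amounts to creating a more specific expectation through MockServer's REST API. A minimal sketch, with a hypothetical payload for `GET /agents` (not the real API response shape):

```typescript
// Creates an expectation that overrides the mocked response of GET /agents.
// The response payload is a hypothetical example for illustration.
async function overrideAgentsResponse(): Promise<void> {
  const res = await fetch("http://localhost:55000/mockserver/expectation", {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      httpRequest: { method: "GET", path: "/agents" },
      httpResponse: {
        statusCode: 200,
        headers: { "Content-Type": ["application/json"] },
        body: JSON.stringify({ data: { affected_items: [], total_affected_items: 0 } }),
      },
    }),
  });
  if (!res.ok) throw new Error(`MockServer returned ${res.status}`);
}
```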
With the PoC successfully completed, we move on to optimizing the configuration of this tool to fit our needs and integrating it into our development environments (#3872).
- [x] Automate the import of the Wazuh API spec using the `docker-compose.yml` and the environment variables.
- [x] Configure the Expectation Initializers and the Expectations Persistence.
- [x] Find a way to make the MockServer use the examples provided in the spec file.
- [x] Design and document a workflow for the use of this tool, as well as the procedure to easily modify the MockServer's behavior in an always-changing development environment.
### Task: Automate the import of the Wazuh API Spec
Requirements: build a script or program that performs the following tasks, in order to automate the configuration of the Mock Server, so we can set up a mocked service in no time and quickly adapt to changes in the API specification.
- [x] 1. Download the spec from GitHub using cURL.
- [x] 2. Provide +rw permissions to the Docker container on the mounted volumes.
- [x] 3. Import the spec into the Mock Server.
  - Save the generated expectations to a file: `expectations.json`.
- [x] 4. Make the expectations return the examples defined in the spec file (see the sketch after this list):
  - Read the `expectations.json`* file and sort the expectations by their operation ID (in memory).
  - Read the `spec.yaml` file, get the endpoints chunk, and sort the endpoints by their operation ID (in memory).
  - Match expectations and endpoints by key (operation ID) and, for each pair, update the expectation's response with the example in the spec.
  - Save the modified expectations to a file: `initializer.json`.

\* The `expectations.json` file is a collection (array|list) of expectation objects, represented in JSON format.
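A rough sketch of step 4. The field layout is an assumption: where the operation ID lives in each expectation, and where the example lives in the spec, must be checked against the real generated files.

```typescript
import { readFileSync, writeFileSync } from "fs";
import { load } from "js-yaml";

// Hypothetical shape: assumes each expectation exposes its operation ID in `id`
// and its response in `httpResponse`.
interface Expectation {
  id: string;
  httpResponse: { body?: unknown };
}

const expectations: Expectation[] = JSON.parse(readFileSync("expectations.json", "utf8"));
const spec = load(readFileSync("spec.yaml", "utf8")) as any;

// Index every operation in the spec by its operationId.
const operations = new Map<string, any>();
for (const pathItem of Object.values(spec.paths ?? {})) {
  for (const op of Object.values(pathItem as Record<string, any>)) {
    if (op?.operationId) operations.set(op.operationId, op);
  }
}

// Replace each expectation's response body with the matching spec example,
// assuming examples live under responses -> 200 -> content -> application/json.
for (const expectation of expectations) {
  const op = operations.get(expectation.id);
  const examples = op?.responses?.["200"]?.content?.["application/json"]?.examples;
  const example = examples ? (Object.values(examples)[0] as any)?.value : undefined;
  if (example !== undefined) expectation.httpResponse.body = example;
}

writeFileSync("initializer.json", JSON.stringify(expectations, null, 2));
```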
## Status
- 28/03/2022
  - Currently writing the documentation / user manual.
- 01/04/2022
  - The environment is ready and so is the user manual.
    - wazuh-mock-server.zip
  - Some improvements have been applied.
  - This issue is stalled, as our App cannot use the MockServer due to an issue that prevents requests using lists in their query string parameters from matching, returning a `404 Not Found` status code. This affects essential functionality, as requests of the kind of `GET /agents?agents_list=000` fail.
  - For the time being, I move on to integrating the MockServer with our environments redesign.
- 05/04/2022
  - Extended the User Manual with a few more sections.
  - Follow-up on this issue. I've tried the new version (5.13.1) and different parameter deserialization styles, without success so far.
    - https://swagger.io/docs/specification/serialization/
    - https://spec.openapis.org/oas/v3.1.0#style-values
### Status Update - 19/04/2022
Pulled the latest version of the MockServer Docker image, which adds more verbose logging about the cause of unmatched expectations.
The endpoints accepting arrays as query string parameters fail to match due to the following error:
```
errors:
  1 error:
    - $: integer found, array expected
```
To determine whether this is an issue in our OpenAPI spec or a bug in the MockServer, I tried a different OpenAPI spec that takes arrays in its query string parameters: the well-known Pet Store spec. This spec defines the following endpoint:
```yaml
/pet/findByStatus:
  get:
    tags:
      - pet
    summary: Finds Pets by status
    description: Multiple status values can be provided with comma separated strings
    operationId: findPetsByStatus
    parameters:
      - name: status
        in: query
        description: Status values that need to be considered for filter
        required: false
        explode: true
        schema:
          type: string
          enum:
            - available
            - pending
            - sold
          default: available
    responses:
      '200':
        description: successful operation
        content:
          application/json:
            schema:
              type: array
              items:
                $ref: '#/components/schemas/Pet'
```
With this definition, the following request is accepted: `/pet/findByStatus?status=available&status=pending`, but this one is not: `/pet/findByStatus?status=available,pending`.
As seen in the documentation about query parameter serialization in the OpenAPI standard, linked in the previous comment, we follow the unexploded form style. I modified the Pet Store spec to follow this style and, as a result, the request `/pet/findByStatus?status=available,pending` is now accepted.
This confirms that the MockServer is able to work with arrays as query string parameters, so we'll need to adjust our OpenAPI spec.
We still can't make requests with arrays as query parameters match when using an OpenAPI spec.
To simplify the scenario and conclude whether there is an error with our OpenAPI schema or with the MockServer itself, I did some tests with the well-known Pet Store schema. This schema defines the endpoint `/pet/findByTags`, which accepts the `tags` query parameter. This endpoint can be tried online at https://petstore3.swagger.io/ and a request can look like this:
https://petstore3.swagger.io/api/v3/pet/findByTags?tags=tag1&tags=tag2&tags=tag3
```yaml
/pet/findByTags:
  get:
    tags:
      - pet
    summary: Finds Pets by tags
    description: >-
      Multiple tags can be provided with comma separated strings. Use tag1,
      tag2, tag3 for testing.
    operationId: findPetsByTags
    parameters:
      - name: tags
        in: query
        description: Tags to filter by
        required: false
        explode: false
        schema:
          type: array
          items:
            type: string
    responses:
      '200':
        description: successful operation
        content:
          application/xml:
            schema:
              type: array
              items:
                $ref: '#/components/schemas/Pet'
          application/json:
            schema:
              type: array
              items:
                $ref: '#/components/schemas/Pet'
      '400':
        description: Invalid tag value
```
However, if I import this schema and operation and try it out on the MockServer, it returns a 404 response, as it decides that the request does not match any expectation, when IMO it should.
Here's the import:
```json
{
  "specUrlOrPayload": "https://raw.githubusercontent.com/swagger-api/swagger-petstore/master/src/main/resources/openapi.yaml",
  "operationsAndResponses": {
    "findPetsByTags": "200"
  }
}
```
Here's the request (as logged by the MockServer):
```json
{
  "method": "GET",
  "path": "/pet/findByTags",
  "queryStringParameters": {
    "tags": [
      "tag1",
      "tag2",
      "tag3"
    ]
  },
}
```
Here's the expectation (as logged by the MockServer):
```json
{
  "method": "GET",
  "path": "/pet/findByTags",
  "queryStringParameters": {
    "keyMatchStyle": "MATCHING_KEY",
    "?tags": {
      "parameterStyle": "FORM_EXPLODED",
      "values": [
        {
          "schema": {
            "items": {
              "type": "string"
            },
            "type": "array"
          }
        }
      ]
    }
  }
}
```
And here's the cause:
```
method matched
path matched
body matched
headers matched
cookies matched
pathParameters matched
queryParameters didn't match:
  schema match failed expect:
    {
      "items" : {
        "type" : "string"
      },
      "type" : "array"
    }
  found:
    tag1
  errors:
    JsonParseException - Unrecognized token 'tag1': was expecting (JSON String, Number (or 'NaN'/'INF'/'+INF'), Array, Object or token 'null', 'true' or 'false')
      at [Source: (String)"tag1"; line: 1, column: 5]
  multimap matching key match failed for key:
    ?tags
  multimap match failed expected:
    {
      "keyMatchStyle" : "MATCHING_KEY",
      "?tags" : {
        "parameterStyle" : "FORM_EXPLODED",
        "values" : [ {
          "schema" : {
            "items" : {
              "type" : "string"
            },
            "type" : "array"
          }
        } ]
      }
    }
  found:
    {
      "tags" : [ "tag1", "tag2", "tag3" ]
    }
  failed because:
    multimap values don't match
```
Finally, we dropped MockServer due to bugs that won't allow us to import and use our OpenAPI spec correctly. Check the previous comments tagged as outdated for more information about this problem.
We ended up looking for a new mock server that would provide the functionality we need: OpenAPI-ready, highly configurable / scriptable, easy to install and use, and with good enough documentation.
After several tries, we have decided to use Imposter, and we have been able to include it in our new environments, which are hosted in this branch: feature/3872-environments-docker
We are working on configuring and improving Imposter, the new Docker environments, and an event generator that will allow us to populate Elastic / OpenSearch indices.
We're using Imposter as the mock server, and it is serving us well for now. We will implement the event generator after the development of the new engine finishes, because the engine will change the event format in the coming releases.
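As an illustration of how the mocked API is meant to be consumed from our tests, a Cypress spec can simply request the mock; the base URL and endpoint below are placeholders:

```typescript
// Hypothetical Cypress test against the mocked Wazuh API served by Imposter.
// The base URL and the endpoint are placeholders for illustration.
describe("Wazuh API (mocked by Imposter)", () => {
  it("matches requests with list query string parameters", () => {
    cy.request("GET", "http://localhost:55000/agents?agents_list=000")
      .its("status")
      .should("eq", 200);
  });
});
```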