feat: add POC for measuring adoption
Description
> [!IMPORTANT]
> Not ready yet, just marking as ready for review to let CI run tests
This PR is meant as a simple POC demonstrating how we could collect metrics to measure adoption. The code uses the following shared library: https://github.com/smoya/asyncapi-adoption-metrics. It can be shared between the CLI, Studio, and any other tool that wants to keep track of those metrics.
In this example, we are only measuring the happy path of the `validate` CLI command. The sink (aka the backend the metrics are sent to) in this POC simply logs to stdout (`console.log`).
At the moment, validating a small, simple doc will output the collected metrics, e.g.:
asyncapi-adoption.action.executed COUNTER [
_asyncapi_version: '2.6.0',
_asyncapi_servers: 1,
_asyncapi_channels: 1,
_asyncapi_messages: 1,
_asyncapi_operations_send: 1,
_asyncapi_operations_receive: 0,
_asyncapi_schemas: 1,
success: true,
validation_result: 'valid',
action: 'validate'
] 1
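For illustration, the sink abstraction used in this POC could look roughly like the following sketch. This is NOT the actual API of asyncapi-adoption-metrics; the `Metric`, `Sink`, and `StdOutSink` names and shapes are assumptions made purely to illustrate the idea.

```typescript
// Hypothetical sketch of the sink abstraction in this POC. Not the real
// asyncapi-adoption-metrics API; all names and shapes here are assumptions.

interface Metric {
  name: string;                                        // e.g. 'asyncapi-adoption.action.executed'
  type: 'COUNTER';
  metadata: Record<string, string | number | boolean>; // the fields shown in the output above
  value: number;
}

// A sink is just a destination for metrics; the POC one logs to stdout.
interface Sink {
  send(metric: Metric): void;
}

class StdOutSink implements Sink {
  send(metric: Metric): void {
    console.log(`${metric.name} ${metric.type}`, metric.metadata, metric.value);
  }
}
```

Swapping the sink later (e.g. for a New Relic-backed one) would then not require touching the command code.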
Of course, a lot of things are still missing around this, like the important warning telling the user we are collecting metrics, a way to disable collection, etc. We might also want to encapsulate this logic in a single file or module we can reuse for other CLI commands.
cc @Amzani @peter-rr @fmvilas @derberg
Related issue(s) https://github.com/asyncapi/cli/issues/841
Something we might want to move toward is using Oclif hooks for lifecycle events, so we can create the metric metadata in the `prerun` hook, and send the metric with `success: true` in the `postrun` hook, or with `success: false` in the catch block.
That way we would always send a metric, no matter whether the command fails or not, and log the command result (OK|KO).
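The prerun/postrun flow described above could be sketched like this. The real implementation would live in Oclif hook files registered under `oclif.hooks` in `package.json`; this class only illustrates the lifecycle, and the `CommandMetricLifecycle` name and its methods are assumptions, not existing code.

```typescript
// Sketch of the hook-based flow: create metric metadata on prerun, send it
// with success=true on postrun or success=false from the error handler.
// Names are illustrative assumptions, not the actual CLI implementation.

type Metadata = Record<string, string | number | boolean>;

class CommandMetricLifecycle {
  private metadata: Metadata = {};

  // Would be called from the Oclif `prerun` hook, before the command runs.
  start(command: string): void {
    this.metadata = { action: command };
  }

  // Would be called from the `postrun` hook (success: true) or from the
  // catch handler (success: false), so a metric is always emitted.
  finish(success: boolean): Metadata {
    return { ...this.metadata, success };
  }
}
```

With this shape, a metric is sent whether the command succeeds or fails, which is exactly the "always send, log OK|KO" behavior described above.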
@smoya Let me work on the metrics recording implementation for all the actions/commands as you did for validate.ts
> [!IMPORTANT]
> Ready for review 👍 This message overrides the previous comment by @smoya
cc @Souvikns @derberg @fmvilas
@derberg Yup! That's already considered as one of the next actions to be carried out 💪
> to have in docs a new markdown document dedicated to this topic

Do you mean a separate markdown document just related to metrics collection? @smoya and I were thinking of adding a new section to the CLI's README with all the info you mentioned regarding metrics.
> - where we collect

Should we mention that we are using the New Relic Metrics API in our case? Or is it not necessary to include this info?
> - we should stop metrics when `process.env.CI === true`?

Isn't that case covered here? Or should we add another specific case or condition for CI?
> Isn't that case covered here? Or should we add another specific case or condition for CI?
Nope. That check is there so we don't pollute metrics when we run `npm test`. We should also stop sending metrics if `process.env.CI === 'true'`, because that would mean a CLI command is being executed in a CI/CD pipeline, and that's a tricky case when it comes to privacy and security that people surely won't like. Plus, it's not our target: we want to see how people use the CLI, not how scripts/machines use the CLI.
> Do you mean a separate markdown document just related to metrics collection? @smoya and I were thinking of adding a new section to the CLI's README with all the info you mentioned regarding metrics.

I think we should be as transparent as possible here. If someone wants to find out what we are measuring and where we are putting this information, they should be able to easily find it and disable it if needed. IMHO, this is too important to be a carry-over task; it should be a must for merging this PR. Otherwise, we risk this task never getting done.
> Should we mention that we are using the New Relic Metrics API in our case? Or is it not necessary to include this info?

IMHO, as much information as possible. We have nothing to hide, so we shouldn't hide it.
> I think we should be as transparent as possible here. If someone wants to find out what we are measuring and where we are putting this information, they should be able to easily find it and disable it if needed. IMHO, this is too important to be a carry-over task; it should be a must for merging this PR.

Fair enough 👍 Let me work on a separate document for that purpose then.
@smoya @derberg @fmvilas
> to have in docs a new markdown document dedicated to this topic

The markdown document dedicated to metrics collection is ready for review here.
> We should also stop sending metrics if `process.env.CI === 'true'`, because that would mean that a CLI command is being executed on a CI/CD pipeline

This scenario is also already implemented and ready for review here.
@peter-rr I don't think the standard is `process.env.NODE_ENV !== 'CI'`.
I think it is an env var called `CI` (`process.env.CI === true`) that, once set to `true`, means metrics should not be collected. But yeah, I googled https://adamj.eu/tech/2020/03/09/detect-if-your-tests-are-running-on-ci/ and I also found https://stackoverflow.com/questions/73973332/check-if-were-in-a-github-action-travis-ci-circle-ci-etc-testing-environme, which might mean we need to support several different variables 🤷🏼; you basically need to research it, but I'm pretty sure it is not `NODE_ENV` set to `CI`.
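Supporting several provider variables, as the links above suggest, could look like this sketch. The exact variable list is an assumption and would need to be verified per CI provider; `runningInCI` is an illustrative name, not existing CLI code.

```typescript
// Sketch of a CI check covering several common providers. The variable set
// is an assumption based on the links above, not a verified complete list.
function runningInCI(env: Record<string, string | undefined>): boolean {
  // Most providers (GitHub Actions, Travis, CircleCI, ...) set CI=true;
  // some also set a provider-specific variable.
  return env.CI === 'true' ||
    env.GITHUB_ACTIONS === 'true' ||
    env.TRAVIS === 'true' ||
    env.CIRCLECI === 'true';
}
```

The CLI would then skip sending metrics whenever `runningInCI(process.env)` returns `true`.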
> @peter-rr I don't think the standard is `process.env.NODE_ENV !== 'CI'`. I think it is an env var called `CI` (`process.env.CI === true`) that, once set to `true`, means metrics should not be collected. But yeah, I googled https://adamj.eu/tech/2020/03/09/detect-if-your-tests-are-running-on-ci/ and I also found https://stackoverflow.com/questions/73973332/check-if-were-in-a-github-action-travis-ci-circle-ci-etc-testing-environme, which might mean we need to support several different variables 🤷🏼; I'm pretty sure it is not `NODE_ENV` set to `CI`.
Indeed, after doing some research I've found the env name is process.env.CI. See:
- https://dev.to/kapi1/solved-treating-warnings-as-errors-because-of-process-env-ci-true-bk5
- https://docs.github.com/en/actions/learn-github-actions/variables#default-environment-variables
I've tested this change with `CI=true ./bin/run validate <spec-file-path>` and also `CI=false ./bin/run validate <spec-file-path>`, and everything is working as expected.
Change requested is ready for review now 👍 https://github.com/smoya/cli/pull/10
> - Ask users to `export ASYNCAPI_METRICS="false"`

This is the easiest solution, I think, and can be done easily by editing the informational message.
> - Add a new command `asyncapi analytics off|on`

This one would be more DX-friendly. I guess it could be considered a subcommand of `asyncapi config`, something like `asyncapi config analytics off|on`.
> - This message is polluting every command execution; we could show it only once, during installation/update or on the first command execution.

Agree 👍 We could also consider showing the message only at the beginning of every user session, but I'm not sure whether that option might be too invasive in case the user starts many different sessions during the same day.
> - Remove `Warning`, as this is informational to allow users to disable analytics. After this informational message, we start collecting metrics.

Should we add some message title like `Information:`, or just remove `Warning`?
> This one would be more DX-friendly. I guess it could be considered a subcommand of `asyncapi config`, something like `asyncapi config analytics off|on`

Sounds good 👍
> Agree 👍 We could also consider showing the message only at the beginning of every user session, but I'm not sure whether that option might be too invasive in case the user starts many different sessions during the same day.

It's still too invasive. Why can't we just show it once, on any command execution, and then record that we have shown this message in a file like `.asyncapi-analytics`? BTW, we can use the same file to store all related analytics information:
1. Analytics disabled/enabled (`asyncapi config analytics off|on`)
2. Analytics information shown (yes|no)
3. User ID or correlation ID

As this PR is only focused on adoption metrics, we can implement 1) and 2).
> Should we add some message title like `Information:`, or just remove `Warning`?

I'll just remove it for now; the CLI UI will be improved later in https://github.com/asyncapi/cli/issues/872
@Amzani At the moment I've changed the previous warning message to an informational message, and the "How to disable tracking" section of the markdown document has been updated in https://github.com/smoya/cli/pull/9
Meanwhile, I'm working on a fancier and more DX-friendly solution like `asyncapi config analytics off|on`. I agree with the idea of storing all the analytics info in the same file 👍
@Amzani Both the `asyncapi config analytics` command and the `.asyncapi-analytics` config file are already implemented and ready for review: https://github.com/asyncapi/cli/pull/859/commits/e3fc3c79476703c9f78c52ee1c585510df0bfebd
@Amzani Integration tests already added. Let me know if anything else is missing :)
@peter-rr there are some linting issues to fix: https://github.com/asyncapi/cli/actions/runs/8079494096/job/22074009875?pr=859
@Amzani Should we use safer alternatives, such as SHA-256, as recommended by Sonarcloud? Or should we ignore it?
@peter-rr we have discussed it privately! let's go ahead with SHA-256
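Deriving the anonymous ID with SHA-256, as agreed above, can be done with Node's built-in `crypto` module. What exactly gets hashed (hostname, a persisted random seed, etc.) is an open assumption here, and `anonymousUserId` is an illustrative name, not the merged code.

```typescript
import { createHash } from 'crypto';

// Sketch: derive a stable, anonymous user ID with SHA-256 using Node's
// built-in crypto module. The input seed is an assumption for illustration.
function anonymousUserId(seed: string): string {
  return createHash('sha256').update(seed).digest('hex');
}
```

Unlike MD5 or SHA-1 (the weaker hashes SonarCloud flags), SHA-256 has no known practical collision attacks, which is why the scanner recommends it.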
Hey @Souvikns, could you take a look so we can get the final review on this PR? 💪 I'm working now on fixing the current conflict regarding the package-lock.json.
@Souvikns The CLI build process seems to be failing since the latest version, v1.6.1, was released. Could you please have a look? https://github.com/asyncapi/cli/actions/runs/8201865027/job/22431460372?pr=859
@Souvikns The conflict is already solved 💪 I think I need your approval again, since your review was marked stale by the last merge.
/rtm
@peter-rr Anything blocking this PR from merging? We would like to get some adoption insights :)
> @peter-rr anything blocking this PR from merging? we would like to get some adoption insights :)

Now I just need to update the dependency on the asyncapi-adoption-metrics repo and also resolve the current conflicts. I'll let you know in case I find any blocker.
@Amzani Conflicts already solved 👍 Just waiting for approval to be merged.
Yo folks @derberg @Souvikns @magicmatatjahu, could you have a look at this PR for a final review?
Quality Gate passed

Issues
- 0 new issues
- 0 accepted issues

Measures
- 0 security hotspots
- No data about coverage
- 0.0% duplication on new code
/rtm
@Souvikns please have a look at CI. Since this PR was merged, the release pipeline fails.
Subsequent PRs, like https://github.com/asyncapi/cli/pull/1297, are also not releasable for now.
https://github.com/asyncapi/cli/actions/workflows/if-nodejs-release.yml