
Per-day call logs and live updating Web page

Open rosecitytransit opened this issue 3 years ago • 7 comments

Completes #153

As requested in https://github.com/robotastic/trunk-recorder/pull/470#issuecomment-841818003, I have separated out the daily log function and the Web page that depends on it.

CALL LOG NOTES:

  • I know the log file format may not be ideal (for one, it does not save the offsets for individual transmissions, though that could easily be changed), and others are welcome to modify it as seen fit.
  • If I were creating this project, I'd do something like this instead of individual JSON files, and for uploading, include a call's entry as a header in the upload command. I know the decision is very firmly set.
  • I'm not sure I'm appending to the log file in the ideal way. I was thinking of maybe something like a second Boost.Log sink (call logging to stdout while program logging goes to stderr)
  • I'd really like to have a daily counter in Call::create_filename() that can be put in the daily log file/on the live Web page; it would allow calls to have a fixed row number based on start time (even if an earlier-started call ends/arrives later) and identify when calls are not listed on the Web page
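A daily counter like the one described could sit alongside the filename logic; the real Call::create_filename() is C++, so this Python sketch only illustrates the idea (all names here are hypothetical):

```python
import datetime

class DailyCounter:
    """Per-day call counter that resets at midnight (sketch only)."""

    def __init__(self):
        self.day = None
        self.count = 0

    def next(self, today=None):
        today = today or datetime.date.today()
        if today != self.day:      # new day: restart numbering at 1
            self.day, self.count = today, 0
        self.count += 1
        return self.count
```

Each call started on a given day would then get a stable row number, even if an earlier-started call finishes later.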

WEB PAGE NOTES:

  • It can be seen in use at http://www.rosecitytransit.org/radio/
  • examples/live.config.php is used to configure it
  • It is what I'd like to see OpenMHz look like (though I agree some of the finer details could just be shown in the info panel)
  • While it could use the JSON files, I'd really rather not have it open hundreds of them to get all the data.
  • Highlighting of played calls inspired by http://scanner.rachelbythebay.com/main
  • I originally had a version done back in 2017 at https://github.com/rosecitytransit/trunk-recorder/commits/kcmscanner

rosecitytransit avatar May 24 '21 02:05 rosecitytransit

I've always been confused by the php files in this project. Are they a functional requirement for the core basis of trunk-recorder? If not, can they be moved somewhere else as a new repository?

I know there's really no concrete policy, but as a matter of form, I think utility-type tools and scripts are fine for inclusion within the project (such as build scripts, or theoretical examples like tuning utilities or channel/source calculators), but anything beyond that (for example, playback systems like the audio player, or OpenMHz!) deserves its own repository.

leee avatar May 24 '21 09:05 leee

@rosecitytransit Thanks for the PR! I agree having some method for aggregating activity would be helpful. Instead of baking in a single specific method for aggregating, I would like to have a more flexible option. I can see this creeping into having to add support for hourly logs, and sending logs to Splunk.

Things are looking pretty good with the plugin system. Instead of merging this into the core code, let me see if I can build this out as a plugin. It would be a good test to make sure the plugins can be useful.

Alternatively, could this just be done using the websocket interface? Something like this ( https://github.com/robotastic/trunk-recorder/issues/462#issuecomment-835384259 ) would allow for aggregation and saving data in whatever level/method is needed.

robotastic avatar May 26 '21 11:05 robotastic

I know you said you have ideas about it, and feel free to do what you want with it.

As for logging, the unitScript process may be good for that (though since it doesn't run at the end of a call it doesn't have the full details, and it is currently set to ignore messages with radio IDs of 0)

rosecitytransit avatar May 26 '21 17:05 rosecitytransit

I've always been confused by the php files in this project. Are they a functional requirement for the core basis of trunk-recorder? If not, can they be moved somewhere else as a new repository?

I know there's really no concrete policy, but as a matter of form, I think utility-type tools and scripts are fine for inclusion within the project (such as build scripts, or theoretical examples like tuning utilities or channel/source calculators), but anything beyond that (for example, playback systems like the audio player, or OpenMHz!) deserves its own repository.

They could be, and probably should be.

@rosecitytransit Thanks for the PR! I agree having some method for aggregating activity would be helpful. Instead of baking in a single specific method for aggregating, I would like to have a more flexible option. I can see this creeping into having to add support for hourly logs, and sending logs to Splunk.

Something like a SQLite database that sits alongside this might be a good idea. I'm currently saving all of the file information into an SQLite database to keep track of everything. It generally makes the query time for a page load much, much smaller.
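As a rough illustration of that approach (the schema and field names here are assumptions, not anything trunk-recorder defines), an indexing script could be as small as:

```python
import sqlite3

def open_call_db(path=":memory:"):
    """Create (if needed) and return a call-log database; schema is hypothetical."""
    con = sqlite3.connect(path)
    con.execute(
        """CREATE TABLE IF NOT EXISTS calls (
               start_time INTEGER,   -- unix timestamp of call start
               talkgroup  INTEGER,
               freq       INTEGER,   -- Hz
               length     REAL,      -- seconds
               filename   TEXT
           )"""
    )
    return con

def log_call(con, start_time, talkgroup, freq, length, filename):
    """Insert one finished call into the index."""
    con.execute("INSERT INTO calls VALUES (?, ?, ?, ?, ?)",
                (start_time, talkgroup, freq, length, filename))
    con.commit()
```

A page load then becomes one indexed SELECT instead of hundreds of file opens.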

-- SNIP --

I move the rest of this to the issue linked below.

Dygear avatar May 27 '21 06:05 Dygear

I've always been confused by the php files in this project. Are they a functional requirement for the core basis of trunk-recorder? If not, can they be moved somewhere else as a new repository?

I know there's really no concrete policy, but as a matter of form, I think utility-type tools and scripts are fine for inclusion within the project (such as build scripts, or theoretical examples like tuning utilities or channel/source calculators), but anything beyond that (for example, playback systems like the audio player, or OpenMHz!) deserves its own repository.

They could be, and probably should be.

I thought I mentioned this, but my thought was the Web page shouldn't be a problem since it's just a couple of files that don't require compiling or anything besides PHP. I put them in the examples/ directory to show that they are optional.

@rosecitytransit Thanks for the PR! I agree having some method for aggregating activity would be helpful. Instead of baking in a single specific method for aggregating, I would like to have a more flexible option. I can see this creeping into having to add support for hourly logs, and sending logs to Splunk.

Something like a SQLite database that sits alongside this might be a good idea. I'm currently saving all of the file information into an SQLite database to keep track of everything. It generally makes the query time for a page load much, much smaller.

How about having the information passed on as an extra parameter (or parameters?) to the uploadScript? This way, you could do whatever you want with the data, from appending it to a log file to loading it into a database. As a plus, you could make sure the data is written once audio encoding is complete, and avoid cases of entries being consumed before the file is ready (the Web page won't link to it if it's not there at the moment).

As I said in my original comment, if I were creating the project, I'd have all data writing be external, and the default uploadScript be something like (<encode $1> && <write $2 ($3, $4, etc?) to daily log or JSON> && <curl -header=$2(,$3,$4,etc) $1 uploadServer>) || <write to error log>
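That chain could be sketched as a wrapper script roughly like the one below. The encoder command, curl header name, and upload URL are all placeholders for illustration, not trunk-recorder conventions:

```python
import datetime
import json
import pathlib
import subprocess

def append_daily_log(meta, log_dir="."):
    """Append one call's metadata as a JSON line to today's log file."""
    day = datetime.date.today().isoformat()
    path = pathlib.Path(log_dir) / f"{day}.log"
    with path.open("a") as f:
        f.write(json.dumps(meta) + "\n")
    return path

def encode_log_upload(wav_path, meta, upload_url):
    """The (<encode> && <log> && <upload>) || <error log> chain from above."""
    try:
        # 1. encode the raw audio (encoder choice is an assumption)
        m4a = wav_path.replace(".wav", ".m4a")
        subprocess.run(["ffmpeg", "-y", "-i", wav_path, m4a], check=True)
        # 2. write the call's entry to the daily log
        append_daily_log(meta)
        # 3. upload, passing the metadata as a header (header name hypothetical)
        subprocess.run(["curl", "-sf", "-H", "X-Call-Meta: " + json.dumps(meta),
                        "-T", m4a, upload_url], check=True)
    except (subprocess.CalledProcessError, OSError) as exc:
        with open("error.log", "a") as f:
            f.write(f"{wav_path}: {exc}\n")
```

Because the log write only happens after the encode succeeds, an entry never points at a file that isn't ready yet.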

This is super useful for me when I'm dispatching as I can quickly go back and see what a unit has said when / if I miss it.

You don't have an official playback function?

BTW, as set in var loadcalls=setInterval(getcalls, 20000); the Web page only requests new calls every 20 seconds, but that line is easily changed.

rosecitytransit avatar May 27 '21 20:05 rosecitytransit

@rosecitytransit agreed - I want to try and move the writing external to the core program. I am hoping that the plugins will let that happen, both for completed calls, but also for system events. An upload script could probably get the job done. Things started out simple though, and then the complexity trickled in :) I originally just used scp in the upload script to transfer the file to openmhz back when it was just for the DC system.

The upload script should get all the details now, via the JSON file. Python or Node have good tools for parsing the JSON information. It would be a little tough to correctly encode some of the lists, like Freq or Source as command line arguments.
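For example, a post-call script could read everything from the JSON file itself rather than from positional arguments. This is only a sketch; the exact key names (talkgroup, srcList, freqList) are assumptions about the per-call file layout:

```python
import json

def summarize_call(json_path):
    """Pull a few fields out of a per-call JSON file (key names assumed)."""
    with open(json_path) as f:
        call = json.load(f)
    # lists like freqList/srcList are awkward as argv but trivial as JSON
    return {
        "talkgroup": call.get("talkgroup"),
        "sources": [s.get("src") for s in call.get("srcList", [])],
        "freqs": [f.get("freq") for f in call.get("freqList", [])],
    }
```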

I am reworking the end_call() function to take care of things more sequentially. Right now, it is safe to call the upload script or use one of the built-in uploaders... but not both.

robotastic avatar May 28 '21 02:05 robotastic

It would be a little tough to correctly encode some of the lists, like Freq or Source as command line arguments.

I don't think so, as long as a quoted/escaped string is used and the format is defined. The simplest idea would be to just include the entire JSON now being created as a single (second) argument, and expect the encodeUpload script to write it to a file.
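A quick way to see that this round-trips cleanly (the field names here are illustrative, not the exact trunk-recorder schema):

```python
import json

# A call's metadata, lists included, survives the trip through a single
# quoted argv string: the shell passes it untouched and json.loads()
# restores the original structure on the script side.
meta = {"talkgroup": 101,
        "freqList": [{"freq": 851000000, "len": 4.2}],
        "srcList": [{"src": 7001}, {"src": 7002}]}

argv_string = json.dumps(meta)        # what the caller would pass as "$2"
restored = json.loads(argv_string)    # what the encodeUpload script would do
assert restored == meta
```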

rosecitytransit avatar May 28 '21 05:05 rosecitytransit

Planning on re-doing this to either output the existing JSON creation to a daily file, or create a plugin

rosecitytransit avatar Apr 16 '23 03:04 rosecitytransit

Can you create a plugin? It would be a good example for others, and they could customize it for specific formats and output time periods. I am out on spring break but will be checking on things. Let me know if the plugin system needs any additions!


robotastic avatar Apr 16 '23 06:04 robotastic

I have it now so it can output the existing JSON to a daily file, either in addition to or instead of the per-call files. I can look into moving things into a plugin.

rosecitytransit avatar Apr 23 '23 03:04 rosecitytransit

Here's what I have so far. If you don't mind, I moved the existing JSON creation to the plugin, so it could also be put in a per-day file if desired. Also, note that the CSV creation as-is does not store the position or length of each transmission.

Issues:

  • Is there a better way to get the directory where the call is stored (keeping in mind it won't be the current one for calls that span midnight)?
  • It doesn't have to be a part of this, but how about removing the frequency section from the JSON like you did for OpenMHz uploading, and just keeping track of spikes and errors per call instead of per transmission, adding them together on call conclusion?
  • One issue I've experienced is that sometimes my Web page will fail to include (a link to) the actual call when someone is listening "live" and the page fetches new calls while the audio file is being converted. I think one option would be to put a non-blocking delay in the log file creation, or to move the per-day logging to use plugman_call_end, presuming that the per-day log is likely to be used locally while the per-call files are likely to be sent elsewhere.
  • When the logging format is settled, I can edit the live Web page to match and include it here.

rosecitytransit avatar Apr 24 '23 08:04 rosecitytransit

@robotastic When you have time, can you give your thoughts on this and the issues I raised in my last comment?

rosecitytransit avatar May 07 '23 15:05 rosecitytransit

Also, one idea is to have creating the JSON files be a config option, then I could put the daily log creation in an external plugin. It would just be nice to not have to duplicate the JSON creation code, to be able to write the same info to two different places (single file and daily log). Unless there's a desire to have daily log entries to be different than the single files.

rosecitytransit avatar May 21 '23 16:05 rosecitytransit

Trying to catch up on where things are with this. I think all of the daily call logging should be in an external plugin, in its own repo. Plugins do support reading from the Config file, so a user can configure aspects of the plugin.

What are the "duplex, mode, priority" fields?

If you can carve this into a minimal set of PRs, that would help track what is changing, and why.

robotastic avatar May 21 '23 17:05 robotastic

I think all of the daily call logging should be in an external plugin

The only issue I have is that I'd like to optionally disable creation of the individual JSON files, and possibly be able to put the same JSON output in a daily log file. Should I create a config option that does this (having people set up the daily log separately and maybe duplicate the code into the plugin)?

What are the "duplex, mode, priority" fields?

They come from the Service Options field, which appears in various P25 messages. See section 2.3.24 of TIA-102.AABC.

rosecitytransit avatar May 21 '23 18:05 rosecitytransit

The only issue I have is that I'd like to optionally disable creation of the individual JSON files, and possibly be able to put the same JSON output in a daily log file. Should I create a config option that does this (having people set up the daily log separately and maybe duplicate the code into the plugin)?

My initial read on this is that just about all of this can be done without much change to t-r. All a logging plugin would need to do is wait for a call_end() and concatenate the resulting .json file from call_info.status_filename to a daily file of your choosing.
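In Python pseudocode (the actual plugin hooks are C++; call_info.status_filename is the field mentioned above), that concatenation step amounts to:

```python
import datetime
import pathlib

def on_call_end(status_filename, log_dir="."):
    """Append a finished call's JSON file to today's daily log as one line.

    Sketch only: the real plugin would do this in C++ inside its
    call_end handler.
    """
    entry = pathlib.Path(status_filename).read_text()
    daily = pathlib.Path(log_dir) / (datetime.date.today().isoformat() + ".log")
    with daily.open("a") as f:
        f.write(" ".join(entry.split()) + "\n")   # flatten to one line
    return daily
```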

I get that there's an overhead inherent in creating a file and then deleting it, but compared to the intermediate .wav files that are being generated an extra few hundred bytes of json isn't much. So long as callLog is set to false in the config, the files should get cleaned up after the plugins are done and leave only the daily logs, right?

taclane avatar May 21 '23 19:05 taclane

I get that there's an overhead inherent in creating a file and then deleting it, but compared to the intermediate .wav files that are being generated an extra few hundred bytes of json isn't much. So long as callLog is set to false in the config, the files should get cleaned up after the plugins are done and leave only the daily logs, right?

This is correct. If there's not a desire to eliminate that overhead, and there's not a desire to have a daily log (and better Web page) be a part of the core Trunk Recorder code, I see 3 options:

  • create an external plugin that creates a custom per-day CSV file (less verbose and much smaller than JSON) and includes the Web page, plus an optional patch to disable JSON file creation, and use this PR to provide a link to it
  • include the Web page in this PR and provide a cat command for the encode/upload script
  • put the Web page somewhere else and use this PR to provide a link to it

Also, as I mentioned a good thing about the plugin hook is that it's called after the upload script, so if one is using that to convert files they're guaranteed to be ready by then.

rosecitytransit avatar May 21 '23 21:05 rosecitytransit

@robotastic do you have thoughts on this? Should I submit a config option that disables the creation of JSON files or just go with one of the 3 options above?

rosecitytransit avatar May 26 '23 14:05 rosecitytransit

Let's leave the JSON file alone for now. It would probably be worth revisiting the JSON files and the upload script as a whole sometime, but let's leave that as a new, clean PR because there is so much tangled up in this one.

It would be best to have the Web Page be external to Trunk Recorder and just have a link to it. That will make it easier to keep it up to date. Options 1 or 3 seem like good approaches to me.

robotastic avatar Jun 03 '23 20:06 robotastic

OK, I created an external plugin that does option 1 and have added a link to README.md. If someone wants to test it, that would be desirable.

rosecitytransit avatar Jun 04 '23 05:06 rosecitytransit