maxlath

214 issues by maxlath

(just putting it here for later, but probably doesn't need to be solved right now) In the current data model (see the [entities map](https://inventaire.github.io/entities-map/)), a series is composed of works, which...

entities

In https://inventaire.io/api/activitypub?action=outbox&name=wd-Q60786968 `totalItems=5`, but there are no items to be seen in https://inventaire.io/api/activitypub?action=outbox&name=wd-Q60786968&offset=0

bug
activitypub
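The mismatch above can be checked mechanically; a minimal sketch (the helper and the sample payloads are illustrative stand-ins, not Inventaire code), assuming the outbox follows the ActivityStreams `OrderedCollection`/`OrderedCollectionPage` shape:

```python
# Minimal sketch: verify that an ActivityPub outbox's advertised
# `totalItems` matches the items actually returned by its pages.

def count_page_items(pages):
    """Sum the `orderedItems` across OrderedCollectionPage payloads."""
    return sum(len(page.get("orderedItems", [])) for page in pages)

def check_outbox(collection, pages):
    """Return (advertised, actual) so a mismatch like the bug above is visible."""
    return collection.get("totalItems", 0), count_page_items(pages)

# Stand-in for the outbox collection response (?action=outbox&name=wd-Q60786968)
collection = {"type": "OrderedCollection", "totalItems": 5}
# Stand-in for the first page (...&offset=0), which comes back empty
pages = [{"type": "OrderedCollectionPage", "orderedItems": []}]

advertised, actual = check_outbox(collection, pages)
print(advertised, actual)  # 5 0 — reproduces the reported inconsistency
```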

Something to see which entities got a lot of activity lately, possibly filtered by language, to encourage discovery and serendipity. Some ideas: * Top author/publisher/subject this [time period],...

feature request
entities

It's currently hard to know the level of activity of a group; having some stats could help. For instance: - date the group was created - number of exchanges between...

feature request
groups

Just documenting this technical possibility, as I have been considering it in the past for various practical reasons, while not necessarily recommending it now. Inspired by https://stackoverflow.com/a/10548919/3324977 ```sh git clone...

code maintenance

ex: * [Le Château des étoiles](https://inventaire.io/entity/wd:Q19955876): each tome is made of 3 volumes in the pre-publication * manga series chapters split into volumes differently in Japan, the US, France, etc. (ex:...

entities

I just did a bit of cleanup, and deleted some 150 spam accounts. The previous attempt to discourage SEO spam, https://github.com/inventaire/inventaire-client/pull/389, went online today, so let's see how it...

spam accounts

avatars.io was used to find avatars for authors from different social media platforms. It was a very convenient abstraction over those platforms' closed or undocumented APIs, but it's now unfortunately...

entities

So far, only spam accounts have been reported, but even though we have no trace of it for now, some users might have tried to block or report other users,...

spam accounts
moderation
users

> maxlath: hello! Any recommendation on how to import a data dump of 40GB+ of newline-delimited JSON into CouchDB? I assume I should go with the bulk import...
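One common approach for such an import is to stream the NDJSON file and POST it in batches to CouchDB's `/{db}/_bulk_docs` endpoint. A minimal sketch of the batching side (the batch size, database URL, and file are placeholders, not a tested recipe for this particular dump):

```python
import json
from itertools import islice

def batches(lines, size):
    """Group an iterable of NDJSON lines into lists of parsed docs."""
    it = iter(lines)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield [json.loads(line) for line in chunk if line.strip()]

def bulk_docs_payload(docs):
    """Request body for POST /{db}/_bulk_docs."""
    return {"docs": docs}

# Illustration with in-memory lines; for a 40GB dump, iterate over the open
# file instead, and POST each payload, e.g. with `requests`:
#   requests.post("http://localhost:5984/mydb/_bulk_docs", json=payload)
lines = ['{"_id": "a"}', '{"_id": "b"}', '{"_id": "c"}']
payloads = [bulk_docs_payload(docs) for docs in batches(lines, 2)]
print([len(p["docs"]) for p in payloads])  # [2, 1]
```

Streaming in fixed-size batches keeps memory bounded regardless of dump size; the right batch size depends on document size and server settings, so it is worth benchmarking a few values.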