TvTimeToTrakt
FileNotFoundError: No such file or directory: 'tv-time-personal-data/seen_episode.csv'
2024-04-16 09:00:17 [INFO] :: Processing both watched shows and movies.
Traceback (most recent call last):
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 166, in <module>
start()
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 159, in start
process_watched_shows()
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 87, in process_watched_shows
with open(WATCHED_SHOWS_PATH, newline="", encoding="UTF-8") as csvfile:
FileNotFoundError: [Errno 2] No such file or directory: '../tv-time-personal-data/seen_episode.csv'
I don't have that file, but user_tv_show_data is there, which contains similar data. Is there any chance they changed the file name?
I thought so and replaced the file name. Then this happened:
2024-04-16 09:02:28 [INFO] :: Processing watched shows.
Traceback (most recent call last):
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 166, in <module>
start()
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 152, in start
process_watched_shows()
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/TimeToTrakt.py", line 95, in process_watched_shows
tv_time_show = TVTimeTVShow(row)
File "/mnt/c/users/sanjeev/desktop/trakt/TvTimeToTrakt/searcher.py", line 108, in __init__
super().__init__(row["tv_show_name"], row["updated_at"])
KeyError: 'updated_at'
This is already tracked as open issue #48.
So, I guess they dropped seen_episode.csv and instead dumped everything into tracking-prod-records.csv and tracking-prod-records-v2.csv. The data is now fragmented: the list of shows and their updated_at records are in tracking-prod-records.csv, while the season/episode progress and episode IDs are in tracking-prod-records-v2.csv.
The issue here is that the updated_at record in tracking-prod-records.csv is just the last updated date, which is useless for per-episode history.
We could skip the original dates and mark everything as a new watch with the current date and time, so that we can at least migrate each show's progress. I'm gonna work on that.
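A minimal sketch of that merge idea, joining the two export files and stamping every episode with the current time. Note this is an assumption-heavy sketch: the column names `series_id`, `series_name`, `season_number`, and `episode_number` are hypothetical placeholders, since the real export headers may differ.

```python
import csv
from datetime import datetime, timezone

def merge_exports(shows_path, episodes_path):
    """Join episode-level rows (the v2 file) with show-level rows,
    marking every episode as watched 'now' since the export's
    per-show updated_at is only a last-updated date."""
    # Index shows by ID; 'series_id'/'series_name' are assumed headers.
    with open(shows_path, newline="", encoding="UTF-8") as f:
        shows = {row["series_id"]: row for row in csv.DictReader(f)}

    merged = []
    with open(episodes_path, newline="", encoding="UTF-8") as f:
        for row in csv.DictReader(f):
            show = shows.get(row.get("series_id"), {})
            merged.append({
                "tv_show_name": show.get("series_name", ""),
                "season": row.get("season_number", ""),
                "episode": row.get("episode_number", ""),
                # Real dates are unusable, so stamp with the current time.
                "updated_at": datetime.now(timezone.utc).isoformat(),
            })
    return merged
```

The trade-off is that the watch history on Trakt ends up dated "today" rather than on the real watch dates, but the show progress itself survives the migration.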
I just received my data from TVTime, and can confirm that they changed the format of the export.
I do have a seen_episode.csv on my side, but it only contains 3 lines.
tracking-prod-records.csv and tracking-prod-records-v2.csv seem to contain all of my history.
@lukearran Has anybody already started the work of adapting this project to the new GDPR export format?
Does anyone know what date format this is ("51342700-04-23 16:33:20")? It's found in "tracking-prod-records-v2.csv" under "updated-at".
Unsure if anyone else has exported their data recently, but it seems they've exported all of the date-time data incorrectly in "tracking-prod-records-v2.csv", at least in my case.
The correct date can be recovered by converting the bogus date to a unix timestamp, dividing by 10^6, then converting back.
I guess so, but many entries in my export didn't have a date at all, which made working with the script more difficult. Also, some entries have no episode or season number, just the series name. We have to rewrite the script from scratch to accommodate all these changes.
Not that I'm aware of, @thelouisvivier; however, I'm no longer in the loop with this project. If there is a suitable fork in the future, or a new script adapted for the updated export files, please let me know and I'll reference it in the README.
We have to rewrite the script from scratch to accommodate all these changes
@sanjeevstunner I created a script to convert the dates in v2 into the correct dates; however, it's in Java. When I made it in Python it was off by a couple of days, but I don't know if there's a better way to do it in Python.
I found another issue, however: some data is exclusive to one of the files and some is shared between both. I don't seem to have any missing episode or season numbers, but I did have some entries missing a season name.
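Since some rows appear in only one file and some in both, a union keyed on episode ID would avoid importing the shared rows twice. A sketch, where the `episode_id` column name is an assumption and would need adjusting to the real header:

```python
import csv

def union_by_episode_id(path_a, path_b, key="episode_id"):
    """Combine rows from both export files, keeping exactly one row
    per episode (the first occurrence wins)."""
    seen = {}
    for path in (path_a, path_b):
        with open(path, newline="", encoding="UTF-8") as f:
            for row in csv.DictReader(f):
                # setdefault keeps the first file's row on duplicates.
                seen.setdefault(row[key], row)
    return list(seen.values())
```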
Share the script and I'll give it a shot this weekend. Also, share the data with me if that's okay, so I can have a better sample size.
You can ping me here: https://t.me/chandlerbyng
Hey guys,
I didn't have time to work on this recently...
Has anyone made progress on the new script?
I tried, but no progress. I’m just hoping you’ll fix this some time lol
I told TVTime about the issue when I first encountered it, and I received an email a couple of days ago saying my data is being resent. I assume this means they've fixed the issue, and I'll update here if so.
@AdClarky Ok, that's good to hear. I'll try the same then, and update this issue when I receive an answer.
Send the email you sent plox
Received my data. I still had some issues, but it's actually usable now. I updated the repo so it should work (it worked for me, anyway); it just needs @lukearran to merge.
That's great, I'll review it by tomorrow.
@AdClarky You're the best! I am still waiting for my data from support, and will try as soon as I receive it.
Thanks! 🤙🏻