
TTS: highlight current word/paragraph

Open foxmask opened this issue 7 years ago • 8 comments

Hi,

I discovered the "text to speech" feature, which works very nicely.

Would it be possible to highlight each word as it is pronounced (or at least the current paragraph)?

That would be useful, for example, for people with dyslexia.

foxmask avatar Feb 16 '17 14:02 foxmask

It may be a problem since we use WebView to display article content.

Off the top of my head, it might be possible with some kind of JavaScript hack :/
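
To illustrate what such a hack could look like (a minimal sketch, not anything in the app today): if the article HTML were pre-processed so that each sentence sits in its own <span id="sent-N"> and a .tts-current CSS rule were injected, the reader could highlight the sentence being spoken with something like this. All names here are illustrative assumptions, not wallabag's actual code.

```kotlin
import android.webkit.WebView

// Hypothetical helper: highlight the sentence currently being spoken inside the
// article WebView. Assumes the article HTML was pre-processed so each sentence is
// wrapped in <span id="sent-N">...</span> and a ".tts-current" CSS rule exists.
fun highlightSentence(webView: WebView, index: Int) {
    val js = """
        (function() {
            var prev = document.querySelector('.tts-current');
            if (prev) prev.classList.remove('tts-current');
            var current = document.getElementById('sent-' + $index);
            if (current) {
                current.classList.add('tts-current');
                current.scrollIntoView({ block: 'center', behavior: 'smooth' });
            }
        })();
    """.trimIndent()
    // evaluateJavascript is available on API 19+ and must be called on the UI thread.
    webView.post { webView.evaluateJavascript(js, null) }
}
```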

di72nn avatar Feb 16 '17 14:02 di72nn

If you want to have a look at one app that does this very well, it's "@Voice Aloud Reader" (https://play.google.com/store/apps/details?id=com.hyperionics.avar&hl=en). You can even sync your Pocket account to get the articles into the app and listen to them :)

foxmask avatar Feb 16 '17 14:02 foxmask

@foxmask Did you find a solution to this problem?

yadavabhinav16 avatar Jun 15 '19 07:06 yadavabhinav16

I don't use wallabag anymore. You can close the issue.

foxmask avatar Jun 15 '19 07:06 foxmask

@foxmask I am not a member of the wallabag team. I was asking whether you had programmatically solved highlighting each word/sentence from TTS. Please ping me at: [email protected]

yadavabhinav16 avatar Jun 15 '19 08:06 yadavabhinav16

No

foxmask avatar Jun 15 '19 12:06 foxmask

Hi, I hope this is not double-posting, but I wanted to reinvigorate this issue. Wallabag does almost everything I need. I mainly use it on Android, and the text-to-speech is essential for me. However, when it is reading, it is hard to tell which part it is reading, e.g. if I want to highlight that spot with an annotation. Other TTS apps highlight the sentence being read. @Voice Aloud is a fantastic example:

  • it highlights the sentence being read,
  • you can double-click any sentence to jump to that sentence and continue reading from there,
  • it uses the current sentence/total sentences to calculate the % complete,
  • if you close the app and come back, it resumes from that sentence (Wallabag does that with text reading, but stop and play from the nav bar does not - see #939),
  • it keeps track of the sentence number, even when the app is not in focus.

I feel like a lot of the TTS issues are related to the same core issue: if Wallabag TTS could keep track of the sentence number (e.g. 13 of 45), that number could be used to tell the WebView to update correctly when focus returns, and to highlight the current sentence.

So essentially this foundational work may be required before the JavaScript magic of highlighting the correct sentence is possible?
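
Purely as a sketch of the bookkeeping described above (and not anything wallabag currently does), the article text could be split into sentences and queued one utterance per sentence, with the sentence index carried in the utterance ID. All class and callback names below are illustrative assumptions.

```kotlin
import android.os.Bundle
import android.speech.tts.TextToSpeech
import android.speech.tts.UtteranceProgressListener

// Hypothetical sketch: speak an article one sentence at a time and track which
// sentence is currently playing, so "13 of 45" style progress survives focus
// changes and can drive a WebView highlight.
class SentenceTracker(
    private val tts: TextToSpeech,
    private val sentences: List<String>,
    private val onSentenceStarted: (index: Int, total: Int) -> Unit
) {
    var currentIndex: Int = 0
        private set

    fun speakFrom(startIndex: Int) {
        currentIndex = startIndex
        tts.setOnUtteranceProgressListener(object : UtteranceProgressListener() {
            override fun onStart(utteranceId: String?) {
                // The utterance ID is the sentence index we queued below.
                // Note: this callback may arrive on a background thread, so the
                // caller should post any UI work to the main thread.
                currentIndex = utteranceId?.toIntOrNull() ?: return
                onSentenceStarted(currentIndex, sentences.size)
            }
            override fun onDone(utteranceId: String?) {}
            @Deprecated("Deprecated in Java")
            override fun onError(utteranceId: String?) {}
        })
        for (i in startIndex until sentences.size) {
            val queueMode = if (i == startIndex) TextToSpeech.QUEUE_FLUSH else TextToSpeech.QUEUE_ADD
            tts.speak(sentences[i], queueMode, Bundle(), i.toString())
        }
    }
}
```

The same currentIndex could then feed the "N of M" progress display, the resume position, and a WebView highlight call like the one sketched earlier in this thread.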

nosignal101 avatar Oct 09 '22 13:10 nosignal101

I feel like a lot of the TTS issues are related to the same core issue: if Wallabag TTS could keep track of the sentence number (e.g. 13 of 45), that number could be used to tell the WebView to update correctly when focus returns, and to highlight the current sentence.

I feel I should clarify: I would prefer to use Wallabag over @Voice if I could get the TTS working smoothly. This is because @Voice is not really a store of articles: articles do not sync between devices properly, there is no full-text search, it is not as user-friendly, etc. Essentially, @Voice needs to be combined with another service like Instapaper or Pocket for saving articles, and I have not had luck making that work; even then it adds duplicate work, as I don't think it would sync folders, annotations, etc.

nosignal101 avatar Oct 09 '22 13:10 nosignal101