Ziinc
Sorry @ziyouchutuwenwu, I only just saw this; must have missed the ping. Parsers are meant for commonly used logic that you want to reuse across spiders. A parser is simply...
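To illustrate the idea of reusable parsing logic, here is a minimal sketch of what such a parser module might look like. The module name, function, and selectors are all hypothetical, and Floki is assumed for the actual extraction:

```elixir
# Hypothetical reusable parser: extracts article links from a parsed document.
# Any spider can call MyApp.Parsers.ArticleParser.parse/1 instead of
# repeating the same selector logic in each spider's parse_item/1.
defmodule MyApp.Parsers.ArticleParser do
  # `document` is the result of Floki.parse_document!/1
  def parse(document) do
    document
    |> Floki.find("article .title a")
    |> Enum.map(fn link ->
      %{
        title: Floki.text(link),
        url: link |> Floki.attribute("href") |> List.first()
      }
    end)
  end
end
```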
I think rendering as a separate package like crawly_live_metrics, like how Phoenix does it, would be good. But instead of a full-fledged dashboard, I was thinking of widgets that can...
We can use the dependency injection pattern to avoid adding a specific HTML parser as a dep. On the dev side, we set Floki/Meeseeks as a dev dependency, and on...
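As a sketch of the dependency-injection idea (the `:html_parser` config key is a hypothetical name): Floki/Meeseeks would sit in `mix.exs` under `only: [:dev, :test]` on the library side, and the library would resolve the parser module at runtime from the end user's config rather than hard-coding one:

```elixir
# Hypothetical runtime lookup: the library never references Floki or
# Meeseeks directly; it reads whichever module the end user configured
# (defaulting to Floki here purely for illustration).
parser = Application.get_env(:crawly, :html_parser, Floki)
{:ok, document} = parser.parse_document(html)
```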
The onus for managing the HTML parsing dep should be on the end user, as managing adaptors for both libraries would be too much work on our side and too...
I see three possible ways to implement such helpers:

### 1. Through a user-defined parsing interface that implements required parsing callbacks

```elixir
# User's config
config :crawly, parser: MyHtmlParser
# ...
```
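To sketch how option 1 could look in practice (the behaviour module and callback names below are hypothetical, not the actual Crawly API), the library would define a behaviour with the required parsing callbacks, and the user's configured module would implement it:

```elixir
# Hypothetical behaviour the library could define for its parsing interface.
defmodule Crawly.HtmlParser do
  @callback parse_document(binary()) :: {:ok, term()} | {:error, term()}
  @callback find(term(), binary()) :: [term()]
  @callback text(term()) :: binary()
end

# The user's implementation, delegating each callback to Floki.
defmodule MyHtmlParser do
  @behaviour Crawly.HtmlParser

  @impl true
  def parse_document(html), do: Floki.parse_document(html)

  @impl true
  def find(document, selector), do: Floki.find(document, selector)

  @impl true
  def text(node), do: Floki.text(node)
end
```

A Meeseeks-backed module would implement the same callbacks, so the library's helpers only ever talk to the behaviour, never to a concrete parser.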
In my replies, I was explaining why option 1 is preferable to options 2 and 3.
No issues with the hybrid approach; it is what quite a few frameworks use for handling JSON parsing (Phoenix, for example, off the top of my head). I only worry...
Made all tests pass; the `stop_all_spiders` function just delegates to the existing `stop_spider/2` function. I also moved the `on_spider_closed_callback` execution down to within the GenServer call handler, so...
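Roughly, the delegation described above could look like the following sketch. The `list_started_spiders/0` helper and the `:stop_all` stop reason are illustrative assumptions, not the actual names in the PR:

```elixir
# Sketch of stop_all_spiders/0: enumerate the running spiders and delegate
# each stop to the existing stop_spider/2, so all stop logic (including the
# closed callback) stays in one code path.
def stop_all_spiders do
  Enum.each(list_started_spiders(), fn spider_name ->
    stop_spider(spider_name, :stop_all)
  end)
end
```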
@oltarasenko I can't replicate the CircleCI test failures on my local machine, for some reason. All tests are passing on my side. In any case, do have a look...
going to let @oltarasenko decide on the new release :slightly_smiling_face: