govuk_crawler_worker

parseURLs function will return as soon as it encounters an error

alexmuller opened this issue · 0 comments

As written, this loop returns on the first `url.Parse` error, so will URLs further down the page get parsed?

for _, u := range urls {
  u, err := url.Parse(u)
  if err != nil {
    return parsedURLs, err
  }
  parsedURLs = append(parsedURLs, u)
}
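For comparison, here is a minimal sketch of a variant that collects parse errors and carries on, so later URLs still get parsed. The function name `parseURLsSkippingErrors` and the error-collecting return value are assumptions for illustration, not part of the repository's code.

```go
package main

import (
	"fmt"
	"net/url"
)

// parseURLsSkippingErrors is a hypothetical alternative: instead of
// returning on the first bad URL, it records the error and continues,
// so URLs further down the list are still parsed.
func parseURLsSkippingErrors(urls []string) ([]*url.URL, []error) {
	var parsed []*url.URL
	var errs []error
	for _, raw := range urls {
		u, err := url.Parse(raw)
		if err != nil {
			errs = append(errs, err)
			continue
		}
		parsed = append(parsed, u)
	}
	return parsed, errs
}

func main() {
	parsed, errs := parseURLsSkippingErrors([]string{
		"https://www.gov.uk/",
		"://bad", // malformed: leading colon means a missing scheme
		"https://www.gov.uk/bank-holidays",
	})
	fmt.Println(len(parsed), len(errs)) // the URL after the bad one is still parsed
}
```

Whether skipping is the right behaviour depends on whether a malformed link on a page should abort the whole extraction or just be logged and ignored.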

alexmuller · May 26 '16 13:05