creep

🌚 A specialized image download utility, useful for grabbing massive amounts of random images.

Creep can be used to generate gobs of random image data quickly given a single URL. It has no dependencies or requirements and is cross-platform.

  • creep
    • Install
      • Prebuilt Binaries
      • Build from Source
    • Usage
      • Options
      • Examples
      • Sample URLs
    • Why
    • Contributing
    • Author
    • License

Install

Prebuilt Binaries

Install a prebuilt binary from the releases page.

Build from Source

go get github.com/splode/creep/cmd/creep
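
On newer Go toolchains (1.17 and later), installing an executable from source is typically done with go install rather than go get; assuming the same module path, something like the following should work:

go install github.com/splode/creep/cmd/creep@latest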

Usage

Pass creep a URL that returns an image and it will download it. Specify a count and creep will download that many images concurrently.

Usage:
  creep [FLAGS] [OPTIONS] [URL]

URL:
  The URL of the resource to access (required)

Options:
  -c, --count int         The number of times to access the resource (defaults to 1)
  -n, --name string       The base filename to use as output (defaults to "creep")
  -o, --out string        The output directory path (defaults to current directory)
  -t, --throttle int      Number of seconds to wait between downloads (defaults to 0)

Flags:
  -h, --help              Prints help information
  -v, --version           Prints version information

Options

URL

Specifies the HTTP URL of the image resource to access. This is the only required argument.

--count

The number of times to access and download a resource. Defaults to 1.

--name

The base filename of the downloaded resources. For example, given a count of 3, a name of cat, and a URL that returns a jpg, creep will generate the following files:

cat-1.jpg
cat-2.jpg
cat-3.jpg

Defaults to "creep".
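
For instance, a command along these lines (reusing one of the sample URLs from the examples below) would produce that listing:

creep --count=3 --name=cat https://thispersondoesnotexist.com/image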

--out

The directory in which to save the output. If no directory is given, the current directory is used. If the given directory does not exist, it will be created.
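
For example, the following sketch (the directory name here is arbitrary) would create a seed-images directory if needed and save 10 images into it:

creep -c 10 -o seed-images https://source.unsplash.com/random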

--throttle

Throttle downloads by the given number of seconds. Some URLs will return a given image based on the current time, so performing requests in very quick succession will yield duplicate images. If you're receiving duplicate images, it may help to throttle the download rate. Throttling is disabled by default.
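
For example, to wait 2 seconds between each of 10 downloads (URL reused from the examples below):

creep -c 10 -t 2 https://source.unsplash.com/random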

Examples

Download 32 random images to the current directory.

creep -c 32 https://thispersondoesnotexist.com/image

Download 64 random images with the base filename random to the downloads folder, waiting 3 seconds between downloads.

creep --name=random --out=downloads --count=64 --throttle=3 https://source.unsplash.com/random

Download a single random image to the current directory.

creep https://source.unsplash.com/random

Sample URLs

The following URLs will serve a random image upon request:
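
  • https://thispersondoesnotexist.com/image
  • https://source.unsplash.com/random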

Why

I frequently find myself needing to seed application data sets with lots of images for testing or demos. After a few minutes of searching for an existing tool, I wasn't able to find one that suited my requirements, so I built my own.

Why Go and not simply a curl or Python script? Go's concurrency model makes multiple HTTP requests fast, and being able to compile to a single, cross-platform binary is handy. Besides, Go's cool.

Contributing

Contributions are welcome! See CONTRIBUTING for details.

Author

Christopher Murphy

License

MIT