This script generates a dataset similar to the Multi-MNIST dataset described in:
 Eslami, SM Ali, et al. “Attend, infer, repeat: Fast scene understanding with generative models.” Advances in Neural Information Processing Systems. 2016.
sample_multi(num_digits, canvas_size, mnist)
mk_dataset(n, mnist, max_digits, canvas_size)
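The two functions above can be sketched roughly as follows. This is a hedged reconstruction, not the script's actual implementation: digit images are assumed to be 28x28 arrays, overlap handling (here, pixelwise maximum) and the digit-count distribution are guesses, and numpy stands in for whatever array library the script uses.

```python
import numpy as np

def sample_multi(num_digits, canvas_size, digits, rng=None):
    """Place num_digits randomly chosen digit images at random
    positions on a blank canvas_size x canvas_size canvas."""
    rng = rng or np.random.default_rng(0)
    canvas = np.zeros((canvas_size, canvas_size), dtype=np.float32)
    positions = []
    for _ in range(num_digits):
        digit = digits[rng.integers(len(digits))]
        h, w = digit.shape
        top = rng.integers(canvas_size - h + 1)
        left = rng.integers(canvas_size - w + 1)
        # Assumption: overlapping digits saturate (pixelwise max) rather than add.
        canvas[top:top + h, left:left + w] = np.maximum(
            canvas[top:top + h, left:left + w], digit)
        positions.append((top, left))
    return canvas, positions

def mk_dataset(n, digits, max_digits, canvas_size, rng=None):
    """Sample n canvases, each containing 0..max_digits digits;
    return the canvases and the per-canvas digit counts."""
    rng = rng or np.random.default_rng(0)
    canvases, counts = [], []
    for _ in range(n):
        k = int(rng.integers(max_digits + 1))
        canvas, _ = sample_multi(k, canvas_size, digits, rng)
        canvases.append(canvas)
        counts.append(k)
    return np.stack(canvases), np.array(counts)
```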
Load a dataset of hourly origin-destination ridership counts for every pair of BART stations during the years 2011-2018.
This downloads and preprocesses the dataset the first time it is called, requiring about 300MB of file transfer and storing a few GB of temporary files. On subsequent calls it reads from a cached file.
Returns: a dataset, which is a dictionary with the fields:
- "stations": a list of strings of station names
- "start_date": a datetime.datetime for the first observation
- "counts": a torch.FloatTensor of ridership counts, with shape (num_hours, len(stations), len(stations))
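Given the schema above, looking up a single origin-destination count reduces to converting a timestamp to an hour offset and two station names to indices. The sketch below is illustrative only: the dataset dict and the helper `ridership` are fabricated stand-ins (with a numpy array in place of the torch.FloatTensor), not part of the loader's API.

```python
import datetime
import numpy as np

# Hypothetical stand-in matching the documented schema; the real dict
# comes from the loader and holds 2011-2018 data for all BART stations.
dataset = {
    "stations": ["12TH", "16TH", "19TH"],
    "start_date": datetime.datetime(2011, 1, 1),
    "counts": np.zeros((24, 3, 3), dtype=np.float32),  # one day, hourly
}

def ridership(dataset, when, origin, destination):
    """Count of riders entering at `origin` and exiting at
    `destination` during the hour containing `when`."""
    hour = int((when - dataset["start_date"]).total_seconds() // 3600)
    i = dataset["stations"].index(origin)
    j = dataset["stations"].index(destination)
    return dataset["counts"][hour, i, j]
```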
Create a tiny synthetic dataset for smoke testing.
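A smoke-test dataset of this kind might look like the following sketch. The function name `make_fake_od_dataset`, the Poisson counts, and the numpy array (in place of a torch.FloatTensor) are all assumptions; the point is only that it returns the same fields as the real loader without any download.

```python
import datetime
import numpy as np

def make_fake_od_dataset(num_stations=2, num_hours=24, seed=0):
    """Tiny random dataset with the same fields as the real loader,
    for smoke tests that must not touch the network."""
    rng = np.random.default_rng(seed)
    return {
        "stations": [f"FAKE{i}" for i in range(num_stations)],
        "start_date": datetime.datetime(2011, 1, 1),
        # Nonnegative integer-valued counts, stored as float32.
        "counts": rng.poisson(
            5.0, size=(num_hours, num_stations, num_stations)
        ).astype(np.float32),
    }
```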