If you want to train a machine learning model on a large dataset such as ImageNet, especially if you want to use GPUs, you’ll need to think about how to stay within your GPU’s or CPU’s memory limits. Generators are a great way of doing this in Python. What is a generator? A generator is a function that behaves … Read More
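As a minimal sketch of the idea (the function name and arguments here are illustrative, not from any particular library), a Python generator can yield one batch of examples at a time, so only a single batch needs to fit in memory rather than the whole dataset:

```python
def batch_generator(data, batch_size):
    """Yield successive batches from `data`, keeping only one batch in memory."""
    batch = []
    for item in data:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch  # hand the batch to the caller, then resume here
            batch = []
    if batch:  # yield any leftover items as a final, smaller batch
        yield batch

# Example: iterate over 10 items in batches of 4.
batches = list(batch_generator(range(10), 4))
```

In a real training loop, `data` would typically be replaced by something that reads examples lazily (for instance, file paths decoded on the fly), so the full dataset never needs to be loaded at once.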
How to use AWS EC2 GPU instances with BitFusion
If you want to train neural networks seriously, you need more computational power than the typical laptop has. There are two solutions: get more computational power yourself (buy or borrow GPUs or servers), or rent servers online. GPUs cost over a hundred dollars each, and top models like the NVIDIA Tesla cost thousands, so it’s usually easier and cheaper to rent … Read More