Really easy. Go to GCP or AWS and find the type of GPU compute instance you need. Remember, the TensorBook has 64 GB of memory. For me, an equivalent instance runs about $400/month, and I run models constantly. My Lambda laptop cost ~$4,000, so the payback in this case is ~10 months.
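The back-of-the-envelope math is just one-time cost divided by recurring spend; a minimal sketch using the numbers above:

```python
# Payback period: months of cloud spend needed to equal the laptop's price.
laptop_cost = 4000      # one-time purchase, USD
cloud_monthly = 400     # recurring cloud GPU spend, USD/month

payback_months = laptop_cost / cloud_monthly
print(payback_months)   # 10.0
```

If your cloud bill is lower, or you only train occasionally, the break-even point stretches out accordingly.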
There’s a bigger benefit: on the Lambda laptop I run PyCharm, where I can set breakpoints and inspect variables to debug my models. That saves a huge amount of time compared with debugging via logging and print statements.
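To make the contrast concrete, here is a minimal sketch (with a hypothetical `toy_forward` function standing in for a real model step). In PyCharm you would click the gutter next to the marked line to pause there and inspect `hidden` live, rather than sprinkling prints:

```python
# Hypothetical stand-in for one step of a model's forward pass.
def toy_forward(x):
    hidden = [v * 2 for v in x]   # set a breakpoint here and inspect `hidden`
    out = sum(hidden)             # step through and watch `out` get computed
    return out

print(toy_forward([1, 2, 3]))     # 12
```

Without a debugger, verifying `hidden` means editing the code to add a print, rerunning, and cleaning up afterward; with breakpoints, it is a single click.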