After the Geo for Good Summit 2019, all my focus has been geared toward integrating machine learning models built with TensorFlow with satellite data. One common issue when exporting training and testing data — especially with a large number of features, or with large neighborhoods of pixels if you are using a Fully Convolutional Neural Network — is the "Computed value is too large" error; others are the timeout and memory errors. To overcome the "Computed value is too large" error, you might need to adjust the bounds of your polygon (or increase the tile size, or increase the scale of computation).

Here I will show a simple trick that shrinks the polygon on the fly, so you don’t need to redraw the training and testing polygons and export them again.

var N = 100;
var geometry = ee.Feature(trainingPolygon).geometry();

var perimeter = geometry.perimeter({maxError: ee.ErrorMargin(1)}).getInfo();
// a negative buffer shrinks the polygon inward
var region = geometry.buffer(-1 * Math.sqrt(perimeter / 16) * N);
// or, alternatively, calculate the centroid and buffer it;
// the resulting region will be a circle
region = geometry.centroid({maxError: ee.ErrorMargin(1)}).buffer(N * 1000); // buffer distance is in meters, so N * 1000 = N km
// then sample the array image (here `arrays`) as
var sample = arrays.sample({
    region: region,
    scale: 30,
    numPixels: 20,
    seed: N
});