Small Data to Big Data by Jeremy Ellis

Oct 1st, 2016

Why use this?

I have been training Magenta and other TensorFlow recurrent neural networks for a few months now. See the web page at http://rocksetta.com/tensorflow-teacher/3d-print-tensorflow/
I have taken a liking to text-based neural-network machine learning. An example I have not yet tried: enter short poems and have the deep-learning network create its own short poems. The standard approach would be to make a text file of many short poems. This web page lets you expand that set of poems into a much larger randomized dataset: the same poems written many times over, in a different order each time. This seems to speed up training enormously. I was training networks for 3 or 4 days, but with this new system the network seems to learn the data in a few minutes!
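The page's own source code isn't shown here, but the idea it describes can be sketched in a few lines of Python. This is a minimal sketch under my own assumptions: entries are delimited by a separator string, and each "iteration" writes the full set of entries once in a freshly shuffled order. The function name `small_to_big` and its parameters are hypothetical, chosen to mirror the page's buttons.

```python
import random

def small_to_big(small_data, separator, iterations):
    """Expand a small dataset by writing its entries many times in random order."""
    # Split on the separator and drop blank entries (e.g. the blank first line).
    entries = [e.strip() for e in small_data.split(separator) if e.strip()]
    big = []
    for _ in range(iterations):
        shuffled = entries[:]      # copy so each pass reshuffles independently
        random.shuffle(shuffled)
        big.extend(shuffled)
    # Rejoin with the same separator so the output format matches the input.
    return ("\n" + separator + "\n").join(big)

poems = "roses are red\n---\nviolets are blue\n---\nsugar is sweet"
print(small_to_big(poems, "---", 3))
```

With 3 iterations, each of the 3 example poems appears 3 times in the output, giving 9 entries total in randomized order.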

What to do?

Put your data in the left box, add a separator to your data, and click the "Small to Big" button; then click the "Input to Output" button and repeat.

Put your small data here (first line blank)

Separator
Random Iterations
Your current big data appears here

The final output goes here.