Convolution layer input size error

I have a network that I want to take in a $6 \times 2761$ ‘image’ and classify it as one of three classes. I have 240 of these images to train with, so my training list has dimensions $240 \times 6 \times 2761$. I modified the LeNet code from the repository, which is written for $28 \times 28$ images. I have the code:

net = NetChain[{
   ConvolutionLayer[20, {3, 10}, "Input" -> {1, 6, 2761}],
   ElementwiseLayer[Ramp, "Input" -> {20, 4, 2752}],
   PoolingLayer[{2, 10}, 2, "Input" -> {20, 4, 2752}],
   ConvolutionLayer[50, {2, 10}, "Input" -> {20, 2, 1372}],
   ElementwiseLayer[Ramp, "Input" -> {50, 1, 1363}],
   PoolingLayer[{1, 10}, 2, "Input" -> {50, 1, 1363}],
   FlattenLayer["Input" -> {50, 1, 677}],
   LinearLayer[1000, "Input" -> {33850}],
   ElementwiseLayer[Ramp, "Input" -> {1000}],
   LinearLayer[100, "Input" -> {1000}],
   ElementwiseLayer[Ramp, "Input" -> {100}],
   LinearLayer[3, "Input" -> {100}],
   SoftmaxLayer["Input" -> {3}]}]

NetTrain[net, trainlist, ValidationSet -> testlist]
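
(As an aside, my understanding is that NetChain infers the inner dimensions automatically, so the same network can presumably be written more compactly by giving only the overall input shape; the comments show the shape I expect after each layer:)

net = NetChain[{
   ConvolutionLayer[20, {3, 10}], (* {20, 4, 2752} *)
   Ramp,
   PoolingLayer[{2, 10}, 2],      (* {20, 2, 1372} *)
   ConvolutionLayer[50, {2, 10}], (* {50, 1, 1363} *)
   Ramp,
   PoolingLayer[{1, 10}, 2],      (* {50, 1, 677} *)
   FlattenLayer[],                (* {33850} *)
   LinearLayer[1000],
   Ramp,
   LinearLayer[100],
   Ramp,
   LinearLayer[3],
   SoftmaxLayer[]},
  "Input" -> {1, 6, 2761}]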

But then I get the error:

NetTrain::invindim3: Data provided to port "Input" should be a non-empty list of 1*6*2761 arrays, but was a 240*6*2761 array of real numbers.

Afterwards, I realized that the MNIST training list seems to encode each example as a single Image object rather than as a $28 \times 28$ greyscale list. I think this is where the problem is arising.
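
If that is the issue, I would guess that giving each example an explicit singleton channel dimension, so that every input is a $1 \times 6 \times 2761$ array matching the net's "Input" port, should fix it. A sketch of what I have in mind, where `data` (dimensions $240 \times 6 \times 2761$) and `labels` (a 240-element list of classes) are hypothetical names for my raw inputs and classes:

(* hypothetical names: `data` has dimensions {240, 6, 2761},
   `labels` is a list of 240 class labels *)
inputs = ArrayReshape[data, {240, 1, 6, 2761}]; (* add singleton channel dimension *)
trainlist = Thread[inputs -> labels];           (* list of rules: input -> class *)
NetTrain[net, trainlist, ValidationSet -> testlist]

(I am assuming here that NetTrain can match my labels against the length-3 softmax output, e.g. if they are class indices 1, 2, 3.)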