Why does the size of fields in a MongoDB collection affect query performance?

I had a collection that was growing, and with its growth came a dramatic increase in query response times.

The collection's documents (defined via a Mongoose schema) looked something like this:

    const mongoose = require('mongoose');

    const orderSchema = new mongoose.Schema({
        orderStat: { type: String },
        items: { type: Array },
        docBigValue: { type: String },
        detailsBigValue: { type: Object }
    });

I tried everything I could think of:

- ensured I was indexing on the queried fields
- used `.lean()`
- excluded the fields with huge values (`docBigValue`, `detailsBigValue`) from queries
- limited the number of records returned per query
- stopped allowing fuzzy, `LIKE`-style searches

Each change made an improvement, but the rate of new records quickly erased the gains.
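For context, a rough sketch of what those optimizations look like in Mongoose (the model name `Order`, the `'open'` status value, and the limit of 50 are my assumptions, not from the actual code):

    // Assumed: index on the queried field
    orderSchema.index({ orderStat: 1 });
    const Order = mongoose.model('Order', orderSchema);

    // Exact-match query (no regex), projection excluding the big fields,
    // bounded result set, and .lean() to skip Mongoose document hydration
    const results = await Order.find({ orderStat: 'open' })
        .select('-docBigValue -detailsBigValue')
        .limit(50)
        .lean();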

Today I looked at the collection's `avgObjSize` and found it was over 73,000 bytes, while most of my collections were under 500. I took a backup and then deleted one of the large fields from every record. My query response time went from over 6 seconds to under half a second.
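In mongosh, that check and cleanup were roughly the following (the collection name `orders` is a placeholder):

    // Average document size in bytes for the collection
    db.orders.stats().avgObjSize

    // After taking a backup: remove the large field from all documents
    db.orders.updateMany({}, { $unset: { docBigValue: "" } })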

Why does the size of a field affect query response times even when that field is excluded from the query's projection? Is this a bug or expected behavior?

MongoDB 4.x