Why do I have to download all SharePoint content to get just a piece of data?

In my environment we have thousands of SharePoint sites. Something as “simple” as Project Online creates a separate SharePoint site for every project, so when I want to find some information across all projects, i.e. across all sites, I use Power Query in Excel or Power BI to trawl across these site collections and retrieve the data I need.
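To make the setup concrete, here is a simplified sketch of the kind of trawl I’m running. The tenant, the site URLs, and the list name are placeholders; the real site list comes from Project Online and has thousands of entries:

```
let
    // Placeholder list of project site URLs; in reality there are
    // thousands, one per Project Online project.
    Sites = #table(
        {"SiteUrl"},
        {
            {"https://contoso.sharepoint.com/sites/ProjectA"},
            {"https://contoso.sharepoint.com/sites/ProjectB"}
        }
    ),

    // Ask each site's REST API for the items of the list named "ABC".
    GetAbcItems = (siteUrl as text) as table =>
        OData.Feed(
            siteUrl & "/_api/web/lists/getbytitle('ABC')/items",
            null,
            [Implementation = "2.0"]
        ),

    // One nested table of list items per site.
    Results = Table.AddColumn(Sites, "Items", each GetAbcItems([SiteUrl]))
in
    Results
```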

The problem is that I basically have to download a few gigabytes of data to produce a result that measures less than 1 MB.

For example: all sites should have a certain list named “ABC”. Do they? Using OData REST API calls (going through single sign-on, etc.), my queries, which request the list by name, take about 3 hours to run, if they succeed at all. The progress indicator tells me “2.09 GB from …sharepoint.com” as it downloads all the data, just to produce a table with 3 columns and under 2,000 rows.
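For reference, each per-site call is shaped roughly like this (again, tenant, site, and column names are placeholders for illustration):

```
let
    // Shape of one per-site call to the list named "ABC".
    Source = OData.Feed(
        "https://contoso.sharepoint.com/sites/ProjectA/_api/web/lists/getbytitle('ABC')/items",
        null,
        [Implementation = "2.0"]
    ),

    // Keep only the three columns that end up in the final table.
    Kept = Table.SelectColumns(Source, {"Id", "Title", "Modified"})
in
    Kept
```

As far as I understand, that column selection should fold into a $select on the OData request, so the server would only return three fields per item, which makes the multi-gigabyte download all the more puzzling.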

  1. I don’t understand what all the downloaded data is needed for.
  2. I don’t understand where that data goes – I think it’s just tossed aside – so why do it in the first place?
  3. Is there a more efficient way to do this?