Most efficient method to import bulk JSON data from different sources into PostgreSQL?

I need to import data from thousands of URLs; here is an example of the data:


Since COPY doesn’t support a JSON format, I’ve been using this to import the data from some of the URLs:

CREATE TEMP TABLE stage (x jsonb);

COPY stage FROM PROGRAM 'curl https://.....';

INSERT INTO test_table
SELECT f.*
FROM stage,
     jsonb_populate_recordset(NULL::test_table, x) f;

But this is inefficient: it recreates the staging table for every import and fetches a single URL at a time.
Is there a tool, script, or command that can read a file containing all the URLs and copy their data into the database?
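One direction I have considered (a sketch under assumptions, not a tested solution): since COPY FROM PROGRAM runs an arbitrary shell command on the database server, a single COPY could fetch every URL in one pass by feeding the list to xargs. This assumes a file like /path/to/urls.txt (a hypothetical path, readable by the server's postgres OS user) with one URL per line, and that each endpoint returns newline-delimited JSON objects:

```sql
-- Create the staging table once instead of per import.
CREATE TEMP TABLE stage (x jsonb);

-- Hypothetical: xargs runs curl once per URL and concatenates the output,
-- so one COPY ingests every response. /path/to/urls.txt is an assumed
-- server-side file with one URL per line.
COPY stage FROM PROGRAM 'xargs -n 1 curl -s < /path/to/urls.txt';

INSERT INTO test_table
SELECT f.*
FROM stage,
     jsonb_populate_recordset(NULL::test_table, x) f;

-- The staging table can be truncated and reused for the next batch.
TRUNCATE stage;
```

The same caveats as the single-URL version apply: COPY's text format will misread JSON containing literal newlines or backslashes, and COPY ... FROM PROGRAM requires superuser or membership in the pg_execute_server_program role.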