PostgreSQL group by aliased jsonb field

I’m trying to group by the result of a jsonb operation on an aliased field, but getting an error I would not expect.

The following statements work as expected:

select jsonb_build_object('x', 1) as a group by a;
select jsonb_build_object('x', 1) as a group by jsonb_build_object('x', 1)#>>'{x}';

    a     
----------
 {"x": 1}

But this gives me an error:

select jsonb_build_object('x', 1) as a group by a#>>'{x}';
ERROR:  column "a" does not exist

Is this a bug in PostgreSQL (13.3)? Is there any way around it other than repeating the entire select expression in the GROUP BY?
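
As far as I can tell, an output-column alias can appear in GROUP BY only as a bare name; inside an expression such as a#>>'{x}', names are resolved against the input columns, where a does not exist. The only workaround I've found so far, sketched below, is to materialize the expression in a subquery so the outer query sees it as a real column:

select s.a #>> '{x}' as x
from (select jsonb_build_object('x', 1) as a) s
group by s.a #>> '{x}';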

postgresql – Postgres 12 JSONB key selection with array value

I have a column of a database table that is of type JSONB, and I want to get some data from that column. For the most part the column is a flat list of key-value pairs.

Ex:

{ s_key: 'value', s_key1: 'value', s_key2: 'value' ...etc }

However, the key I’m after contains an array of JSON data (or can be null/nil):

key: [ {first_name: 'Hugo', last_name: 'Grant', ind: true },
       {first_name: 'Larry', last_name: 'Larson', ind: false },
       {first_name: 'Rick', last_name: 'Flair', ind: 'true' } ]

Now, what I want to do is have a sub-select that gives me the concatenated name string (first_name + last_name) based on ind (whether it's true or 'true'). So, I want an output of:

[ 'Hugo Grant', 'Rick Flair' ]

I’ve achieved this, to a degree, with this SQL snippet:

    select t.id, array_agg(t._name) as _board
    from (
        select 
            d.id,
            jsonb_extract_path_text(jsonb_array_elements(
                case jsonb_extract_path(d.data, 'board_members') 
                    when 'null' then '[{}]'::jsonb 
                    else jsonb_extract_path(d.data, 'board_members') 
                end
            ), 'first_name') || ' ' || jsonb_extract_path_text(jsonb_array_elements(
                case jsonb_extract_path(d.data, 'board_members') 
                    when 'null' then '[{}]'::jsonb 
                    else jsonb_extract_path(d.data, 'board_members') 
                end
            ), 'last_name') as _name
        from my_table d
        group by d.id
    ) t
    group by t.id

Is there a way to simplify the SQL statement?
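
One direction I've been experimenting with, as a sketch only (it relies on ->>'ind' returning the text 'true' for both the boolean true and the string 'true'):

    select d.id,
           array_agg(m.elem->>'first_name' || ' ' || m.elem->>'last_name') as _board
    from my_table d
    cross join lateral jsonb_array_elements(
        coalesce(nullif(d.data->'board_members', 'null'::jsonb), '[]'::jsonb)
    ) as m(elem)
    where m.elem->>'ind' = 'true'
    group by d.id;

Note that rows with no qualifying members drop out entirely here, which may or may not be acceptable.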

postgresql – Join on Jsonb Path Query with filter condition

I have a jsonb column with rows like this. The “regions” array contains up to 60 elements. All rows contain the same list of ids, but ‘result’ differs.

{
    "a": [
        {
            "key": "a1",
            "value": [
                {
                    "regions": [
                        {
                            "id": 1000,
                            "result": "Good"
                        }
                    ]
                }
            ]
        },
        {
            "key": "a2",
            "value": [
                {
                    "regions": [
                        {
                            "id": 8,
                            "result": "Good"
                        },
                        {
                            "id": 0,
                            "result": "Bad"
                        },
                        {
                            "id": 5,
                            "result": "Good"
                        }
                    ]
                }
            ]
        }
    ]
}

When I do

SELECT 
 q.q->>'id' AS "Id",
 q.q->>'result' AS "Result" 
FROM my_table,
 LATERAL jsonb_path_query(my_table.jsonbcol, '$.a[*].value[*].regions[*] ? (@.id > 2 && @.id < 10)') q(q)

No index is used for the ‘id’ comparison; the plan shows a Function Scan for jsonb_path_query. Is there a way to improve the speed of this query?
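
One thing I've considered, sketched below (assuming Postgres 12+): filter the rows first with the @? jsonpath match operator, which a GIN index on the column can support, and only expand the survivors. From what I've read, the GIN jsonpath support only extracts equality conditions from the path, so the range filter here may still not use the index:

CREATE INDEX idx_my_table_jsonbcol ON my_table USING GIN (jsonbcol jsonb_path_ops);

SELECT 
 q.q->>'id' AS "Id",
 q.q->>'result' AS "Result" 
FROM my_table,
 LATERAL jsonb_path_query(my_table.jsonbcol, '$.a[*].value[*].regions[*] ? (@.id > 2 && @.id < 10)') q(q)
WHERE my_table.jsonbcol @? '$.a[*].value[*].regions[*] ? (@.id > 2 && @.id < 10)';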

postgresql – Extract JSONB column into a separate table

I have a table with roughly the following columns:

CREATE TABLE records (
    id integer NOT NULL,
    ...
    metadata jsonb DEFAULT '{}'::jsonb
)

The ... stands for the rest of the columns, which are of integer, character varying, or timestamp types. The metadata column holds information about the current state of the system, which means that the properties might change over time and are not the same for all records.

Currently, the records table receives a lot of writes and reads, but very few updates. The current data size is over 70 GB (plus 30 GB of indexes), with over 160 million rows.

Recently, the query planner has been showing bad estimates for queries and joins on the records table (including a few queries with anti-joins), and they are generally very slow. I’ve found that more than 52% of the data is stored in the metadata column:

SELECT sum(pg_column_size(metadata)) AS total_size,
       avg(pg_column_size(metadata)) AS average_size,
       sum(pg_column_size(metadata)) * 100.0 / pg_relation_size('records') AS percentage
FROM records;
 total_size  |     average_size     |     percentage      
-------------+----------------------+---------------------
 40108195852 | 218.9854922110055888 | 52.7480535656110357
(1 row)

This column is almost never queried (except in rare troubleshooting cases) and only needs to be read a few times a week.

Does it make sense to extract this into a separate table and have as many new columns needed there?

CREATE TABLE record_metadata (
  record_id integer,
  column1 ...,
  column2 ...
);

Will this change ultimately improve the performance of regular read queries on the records table?
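
For what it's worth, the migration I have in mind looks roughly like this (a sketch; state is a hypothetical placeholder for whatever keys actually live in metadata, and it assumes id is the primary key of records):

CREATE TABLE record_metadata (
    record_id integer PRIMARY KEY REFERENCES records (id),
    state text
);

INSERT INTO record_metadata (record_id, state)
SELECT id, metadata->>'state'
FROM records;

ALTER TABLE records DROP COLUMN metadata;

My understanding is that the DROP COLUMN alone doesn't shrink the table; the space is only reclaimed by a rewrite such as VACUUM FULL (or a tool like pg_repack).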

json – How to keep array fields in PostgreSQL jsonb

It seems that PostgreSQL (12+) cannot keep the PostgreSQL array type in jsonb objects. If a PostgreSQL array is stored in a jsonb, it seems that it’s converted to a jsonb array.

For example, given the following array:

=> SELECT ARRAY['world'];
  array  
---------
 {world}
(1 row)

If one stores it into a jsonb field and then extracts that field as a string:

=> SELECT (jsonb_build_object('location', ARRAY['world'])::text::jsonb)->>'location';
 ?column?  
-----------
 ("world")
(1 row)

, the output isn’t a PostgreSQL array (or the string representation of one) anymore.

The array’s type seems to have been erased when it’s stored into the jsonb. So one cannot use the array as it is anymore.

For example, the format_x() function/extension works correctly on simple types:

=> SELECT format_x('Hello %(location)s!', jsonb_build_object('location', 'world'));
   format_x   
--------------
 Hello world!

, but breaks down (produces an incorrect result) on array types.

=> SELECT format_x('Hello %(location)L!', jsonb_build_object('location', ARRAY['world']));
      format_x      
--------------------
 Hello '["world"]'!

Above, the array is no longer represented as a PostgreSQL array, which breaks things if you use it to build dynamic SQL.

My question is:

Is there a way to store PostgreSQL arrays faithfully in JSONB objects?
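
The best workaround I've found so far, sketched below, is to cast the array to text before storing it, so jsonb keeps it as a plain string in PostgreSQL's own array syntax, and to cast it back on the way out:

=> SELECT (jsonb_build_object('location', (ARRAY['world'])::text)->>'location')::text[];
  text   
---------
 {world}
(1 row)

This round-trips back to the original array, at the cost that the stored value is an opaque string to any other JSON consumer.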

postgresql – Postgres Case-Insensitive search for Jsonb column tag keys

I have a requirement to fetch the rows whose tags match a given key and value case-insensitively.

The key search should be case-insensitive, and values may be a string or an array of strings.

Right now, I am using the following query:

Database: Postgres

select *
from my_table_name
where jsonb_contains(
    lower(to_jsonb(jsonb_extract_path(tags, 'key1'))::text)::jsonb,
    to_jsonb('"value1"'::jsonb)
);

But it searches the key in a case-sensitive manner.

For example, the query above should return all records whose key is any of key1, Key1, or KEY1 and whose value is ‘Value1’.
Can someone help me with this?
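
What I'm currently considering, as a sketch: lower-case the whole tags document once and compare with the containment operator, lower-casing the probe value as well:

select *
from my_table_name
where lower(tags::text)::jsonb @> '{"key1": "value1"}'::jsonb;

As far as I can tell this covers string values; for the array-of-strings case the probe would need the array form, e.g. '{"key1": ["value1"]}'.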

postgresql insert into jsonb key failing with syntax error at or near “->>”

Running the following query, auto-generated by Eloquent’s upsert function, throws a syntax error and I’m not sure why. I couldn’t find anything that says Postgres supports the following syntax, so I’m looking for some expert advice on whether this would work.

insert into "plugin_positions" ("created_at", "positions"->>"test", "slug", "tag", "updated_at") values ('2021-04-10 17:30:40', 0, 'contact-for-telegram', 'rrss', '2021-04-10 17:30:40');

Here’s the query that works (which uses the plain column name and a valid JSON value):

insert into "plugin_positions" ("created_at", "positions", "slug", "tag", "updated_at") values ('2021-04-10 17:30:40', '{"test":0}', 'contact-for-telegram', 'rrss', '2021-04-10 17:30:40');

Does PostgreSQL allow inserting into a table if we specify the column as "positions"->>"test"?
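
From what I can tell, a column list can only name whole columns, so the closest valid equivalent seems to be inserting the whole jsonb value (as in the working query) or, for a true upsert, setting the key with jsonb_set in the conflict branch. A sketch, assuming a unique constraint on slug:

insert into "plugin_positions" ("created_at", "positions", "slug", "tag", "updated_at")
values ('2021-04-10 17:30:40', '{"test":0}', 'contact-for-telegram', 'rrss', '2021-04-10 17:30:40')
on conflict ("slug") do update
set "positions" = jsonb_set(plugin_positions.positions, '{test}', '0'::jsonb, true),
    "updated_at" = excluded.updated_at;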