magento2 – Reference column block_id in reference table cms_block has no index

I'm trying to create a foreign key in Magento 2.3 (and above) via db_schema.xml that references the CMS block table.

db_schema.xml



Running magento setup:upgrade gives me this error and the upgrade aborts:

Reference column block_id in reference table cms_block does not have
index

I found almost the same code in vendor/magento/module-cms/etc/db_schema.xml:39



Where is the difference, and what is the problem?

Thank you very much

Use vanilla JavaScript to display the output in an HTML table

function main() {

  // let orderRepo = new OrderRepository();  // not used in this snippet

  let yesterdaysOrders = [
    {
      id: 1,
      orderLines: [
        { itemName: "Item 01", quantity: 1 },
        { itemName: "Item 02", quantity: 3 },
        { itemName: "Item 03", quantity: 25 },
        { itemName: "Item 04", quantity: 12 },
      ],
    },
    {
      id: 2,
      orderLines: [
        { itemName: "Item 01", quantity: 1 },
        { itemName: "Item 08", quantity: 42 },
        { itemName: "Item 09", quantity: 13 },
        { itemName: "Item 12", quantity: 37 },
      ],
    },
    {
      id: 3,
      orderLines: [
        { itemName: "Item 12", quantity: 16 },
      ],
    },
    {
      id: 4,
      orderLines: [
        { itemName: "Item 10", quantity: 11 },
        { itemName: "Item 11", quantity: 10 },
      ],
    },
    {
      id: 5,
      orderLines: [
        { itemName: "Item 06", quantity: 7 },
        { itemName: "Item 07", quantity: 2 },
        { itemName: "Item 12", quantity: 14 },
      ],
    },
    {
      id: 6,
      orderLines: [
        { itemName: "Item 05", quantity: 17 },
      ],
    },
    {
      id: 7,
      orderLines: [
        { itemName: "Item 03", quantity: 5 },
        { itemName: "Item 07", quantity: 2 },
      ],
    },
    {
      id: 8,
      orderLines: [
        { itemName: "Item 02", quantity: 13 },
        { itemName: "Item 07", quantity: 7 },
        { itemName: "Item 09", quantity: 2 },
      ],
    },
    {
      id: 9,
      orderLines: [
        { itemName: "Item 01", quantity: 4 },
        { itemName: "Item 06", quantity: 17 },
        { itemName: "Item 07", quantity: 3 },
      ],
    },
    {
      id: 10,
      orderLines: [
        { itemName: "Item 11", quantity: 12 },
        { itemName: "Item 12", quantity: 1 },
      ],
    },
  ];

  // Sum the quantity per item name across all orders, then turn the Map into an array.
  let result = Array.from(
    yesterdaysOrders.reduce((acc, { orderLines }) => {
      orderLines.forEach(({ itemName, quantity }) =>
        acc.set(itemName, (acc.get(itemName) || 0) + quantity));
      return acc;
    }, new Map()),
    ([itemName, quantity]) => ({ itemName, quantity })
  );

  // Sort by quantity, highest first.
  result.sort((a, b) => b.quantity - a.quantity);

  // Build an HTML table from the aggregated items and append it to the page.
  function displayItemTable(items) {
    let table = document.createElement('table');

    let header = table.insertRow();
    header.insertCell().textContent = 'Item';
    header.insertCell().textContent = 'Quantity';

    items.forEach(({ itemName, quantity }) => {
      let row = table.insertRow();
      row.insertCell().textContent = itemName;
      row.insertCell().textContent = quantity;
    });

    let div = document.createElement('div');
    div.id = "data-list";
    div.appendChild(table);
    document.body.appendChild(div);
  }

  displayItemTable(result);
}

main();

SQL Server – Create and update a parent column with values extracted from the same table

I have a table with the columns NameId and Name. I need to add a new Parent column to the table and populate it with the parent values. NameId is currently the PK.

(Screenshot of the table omitted.)

I have created a temporary table that contains the parent values, but I could not figure out how to update the Parent column from it. Please advise on how to achieve this.
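
A minimal sketch of the kind of update in question, with placeholder names throughout (dbo.Names for the table, #ParentValues for the temporary table, ParentNameId for its parent column):

-- Placeholder names: dbo.Names, #ParentValues, ParentNameId.
ALTER TABLE dbo.Names ADD Parent INT NULL;   -- new, nullable parent column

-- Copy the parent values over by joining the temp table on the primary key.
UPDATE n
SET    n.Parent = p.ParentNameId
FROM   dbo.Names AS n
JOIN   #ParentValues AS p ON p.NameId = n.NameId;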

Thank you very much

Formula syntax for "Google Sheets / Pivot table editor / Filter / Filter by condition / Custom formula is"

When I create a pivot table in Google Sheets, there is a Pivot table editor I can use to configure the pivot table.

At the bottom of the editor's interface is a Filters section, which allows me to add filters. If I add a filter, I have to select a column from the source data. After I have selected a column, I can choose between Filter by condition and Filter by values.

By values is simple enough – it presents a checklist. Filter by condition works well with options like Contains text and Text ends with. But the last option in the by-condition list is Custom formula is. This option interests me, but it is not fully documented on the help page (https://support.google.com/docs/answer/7572895?hl=en).

What is the syntax of the formula? How do I reference the row of the pivot table's source range to which the filter is applied? As a test I entered =M3=5, and if I set cell M3 of the worksheet where the pivot table is located (not the worksheet that contains the source range) to 5, the filter passes everything; otherwise it passes nothing. What does that mean? How do I refer to the value of the cell being filtered? That would be very useful! Then I could use a formula like =AND(!="Financial", !="Income") and other such constructs, but at the moment the Custom formula is option is almost useless.

mysql – Inserting all entries into a map table

If you never delete from table1, table2, or table3:

STEP 1

Use a Cartesian join to generate a script that makes individual inserts

SQL="SELECT CONCAT('INSERT IGNORE INTO keymap VALUES"
SQL="${SQL} (',col1,',',col2,',',col3,');') FROM"
SQL="${SQL} (SELECT col1 FROM table1) A,"
SQL="${SQL} (SELECT col2 FROM table2) B,"
SQL="${SQL} (SELECT col3 FROM table3) C;"

read -s pw ; echo ${pw} | less
DB=mydb
echo "SET foreign_key_checks = 0;"         > All_Key_Combinations.sql
mysql -uroot -p${pw} -D${DB} -ANe"${SQL}" >> All_Key_Combinations.sql
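
Assuming the three key columns are plain integers, the first lines of the generated All_Key_Combinations.sql would look roughly like this:

SET foreign_key_checks = 0;
INSERT IGNORE INTO keymap VALUES (1,1,1);
INSERT IGNORE INTO keymap VALUES (1,1,2);
INSERT IGNORE INTO keymap VALUES (1,2,1);
-- ... one INSERT for every (col1, col2, col3) combination of table1 x table2 x table3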

STEP 2

Check the script and make sure it's ok

head All_Key_Combinations.sql
echo
tail All_Key_Combinations.sql
echo
sleep 15
less All_Key_Combinations.sql

STEP 3

Run All_Key_Combinations.sql

DB=mydb
mysql -uroot -p${pw} -D${DB} < All_Key_Combinations.sql

If you ever delete from table1, table2, or table3:

STEP 1

Use a Cartesian join to generate a script that makes individual inserts

SQL="SELECT CONCAT('INSERT INTO keymap VALUES"
SQL="${SQL} (',col1,',',col2,',',col3,');') FROM"
SQL="${SQL} (SELECT col1 FROM table1) A,"
SQL="${SQL} (SELECT col2 FROM table2) B,"
SQL="${SQL} (SELECT col3 FROM table3) C;"

read -s pw ; echo ${pw} | less
DB=mydb
echo "SET foreign_key_checks = 0;"         > All_Key_Combinations.sql
echo "TRUNCATE TABLE keymap;"             >> All_Key_Combinations.sql
mysql -uroot -p${pw} -D${DB} -ANe"${SQL}" >> All_Key_Combinations.sql

STEP 2

Check the script and make sure it's ok

head All_Key_Combinations.sql
echo
tail All_Key_Combinations.sql
echo
sleep 15
less All_Key_Combinations.sql

STEP 3

Run All_Key_Combinations.sql

DB=mydb
mysql -uroot -p${pw} -D${DB} < All_Key_Combinations.sql

jquery – Save the HTML of the WP list table for the current page in a PHP variable

I am trying to save the rendered HTML of my post list table (just the table itself) in a PHP variable after the table has fully loaded.

I tried to save:

$table = new Custom_Table_Example_List_Table();
$tablehtml = $table->display();

However, this returns an array, not the fully rendered table (e.g. …).

I also tried JS: var table = $("#tableid").html() and output it with document.write, but it always comes back as undefined. Yet when I tie it to a button's click handler, the HTML data is displayed fine.

It seems I have to capture the data after the system knows that the table and the query are fully loaded.

javascript – How do I sort columns in a data table in Angular?

angular.module('appOPEE')
    .directive('legalPersonTable', ['$rootScope', '$window', '$location', '$translate', '$filter', 'appSettings', 'AuthService', '$http',
        function ($rootScope, $window, $location, $translate, $filter, appSettings, AuthService, $http) {

            function formatString(input) {
                var args = arguments;
                return input.replace(/{(\d+)}/g, function (match, capture) {
                    return args[1 * capture + 1];
                });
            }

            return function (scope, element, attrs) {

                var dataTable = element.dataTable({
                    "bDestroy": true,
                    "bFilter": false,
                    "autoWidth": false,
                    "searching": false,
                    "bLengthChange": false,
                    "fnDrawCallback": function () { $('[data-toggle="tooltip"]').tooltip(); },
                    "fnRowCallback": function (nRow, aData, iDisplayIndex) {

                        if (aData.IdcAtivo == false || aData.IdcAtivo == null) {
                            $(nRow).addClass('disabledLine');
                            jQuery('td:eq(0)', nRow).html(aData.NomInstituicaoCredenciada + ' (' + $translate.getTranslationTable().LABEL_DISABLED + ')');
                        }

                        // htmlAcoes holds the action-button markup (the HTML strings themselves are not shown here).
                        var htmlAcoes = '';

                        if (AuthService.userHasPermission('73.3')) {
                            htmlAcoes = '' + '' + '' + '';
                        }
                        if (AuthService.userHasPermission('73.2')) {
                            htmlAcoes += '' + '' + '' + '';
                        }
                        if (AuthService.userHasPermission('73.2')) {
                            htmlAcoes += (aData.IdcAtivo ? '' + '' + '' : '' + '' + '');
                        }

                        jQuery('td:eq(5)', nRow).html('' + htmlAcoes + '');

                        jQuery('#details', nRow).bind('click', function () { $location.path('LegalPerson/Details/' + aData.IdePessoaJuridica, '_blank'); scope.$apply(); });
                        jQuery('#edit', nRow).bind('click', function () { $location.path('LegalPerson/Update/' + aData.IdePessoaJuridica); scope.$apply(); });
                        jQuery('#deactivate', nRow).bind('click', function () { scope.setStatus(aData.IdePessoaJuridica, false); });
                        jQuery('#activate', nRow).bind('click', function () { scope.setStatus(aData.IdePessoaJuridica, true); });

                        return nRow;
                    },
                    "aaSorting": [],
                    "sPaginationType": "full_numbers",
                    "aoColumns": [
                        { "mDataProp": "NomInstituicaoCredenciada", sDefaultContent: "ND", bSearchable: false, bSortable: true },
                        { "mDataProp": "CodInstituicaoCredenciada", sDefaultContent: "ND", bSearchable: false, bSortable: true },
                        { "mDataProp": "DscCnpj", sDefaultContent: "ND", bSearchable: false, bSortable: true },
                        { "mDataProp": "DscTelefone", sDefaultContent: "ND", bSearchable: false, bSortable: true },
                        { "mDataProp": "NomUnidadeFederativa", sDefaultContent: "ND", bSearchable: false, bSortable: true },
                        { "mDataProp": "OPTIONS", sDefaultContent: "ND", bSearchable: false, bSortable: false }
                    ],
                    "oLanguage": {
                        "sProcessing": $rootScope.translationTable.PROCESSING + "...",
                        "sLengthMenu": formatString($rootScope.translationTable.LABEL_SHOW_RECORDS, "_MENU_"),
                        "sZeroRecords": $rootScope.translationTable.MSG_NO_RECORDS_FOUND,
                        "sInfo": formatString($rootScope.translationTable.LABEL_SHOWING_RECORDS, "_START_", "_END_", "_TOTAL_"),
                        "sInfoEmpty": formatString($rootScope.translationTable.LABEL_SHOWING_RECORDS, "0", "0", "0"),
                        "sInfoFiltered": formatString($rootScope.translationTable.LABEL_MAX_FILTERED_RECORDS, "_MAX_"),
                        "sInfoPostFix": "",
                        "sSearch": $rootScope.translationTable.SEARCH + ":",
                        "sUrl": "",
                        "oPaginate": {
                            "sFirst": $rootScope.translationTable.FIRST,
                            "sPrevious": '',
                            "sNext": '',
                            "sLast": $rootScope.translationTable.LAST
                        }
                    }
                });

                scope.$watch(attrs.aaData, function (value) {
                    var val = value || null;
                    if (val != null) {
                        if (val.length == 0) {
                            dataTable.fnClearTable();
                        } else {
                            dataTable.fnClearTable();
                            dataTable.fnAddData(scope.$eval(attrs.aaData));
                        }
                    }
                });
            };
        }]);

postgresql – The most efficient way to insert data into an archiving table without affecting production?

For example, suppose we have table T:

create table if not exists T(
column1 int,
column2 varchar,
column3 date
);

and an archive table TArchive:

create table if not exists TArchive(
column1 int,
column2 varchar,
column3 date
);

What would be the best way to move rows older than x into the archive table without locking table T in production? Assume that table T contains a large number of rows.

I have researched for hours. For SQL Server there are several approaches, e.g. https://www.brentozar.com/archive/2018/04/how-to-delete-just-some-rows-from-a-really-big-table/,
but I can hardly find anything for PostgreSQL.

Should you simply select the data directly from table T and insert it into TArchive?

Should you first copy the data into a temporary table and then into the archive table? And if so, why would that approach be better when you run the inserts twice for the same data?

How many functions should this be split into? One function that does everything, or one function for archiving and another for deleting the old data?

Are there any other approaches?
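
A minimal sketch of a single-statement approach, assuming column3 is the date to archive by and the cutoff date below is only a placeholder – a data-modifying CTE deletes the old rows from T and inserts exactly the rows it returns into TArchive, so the data is read only once:

with moved as (
    delete from T
    where column3 < date '2020-01-01'   -- placeholder for the cutoff "x"
    returning column1, column2, column3
)
insert into TArchive (column1, column2, column3)
select column1, column2, column3
from moved;

Only the deleted rows are locked, and running it in bounded batches (restricting the delete to a limited key or date range per run) keeps each transaction short.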

Partitioning – Archiving a large SQL Server table with change data capture and multiple foreign key constraints

We're trying to archive a huge partitioned table of ~170 million rows for which change data capture (CDC) is enabled and which has ~5 foreign key constraints.

We opted for archiving because the daily full backup takes a long time. Therefore, we plan to keep the archive table on a different drive.

We tried the following methods, and they all take a long time:

  • We cannot insert the most recent data into another table and then rename the tables, because of the foreign key constraints.

  • Batch-inserting the old data into the archive table with an SSIS package takes about 6 days.

  • BCP also needs ~ 7 days.

Please suggest the best approach, taking CDC and the foreign keys into account.