python – Updating a list of tuples

Let’s say I have a list of tuples:

tl = [("x","a"),("y","a"),("z","a")]

For every tuple in the list, I want to replace occurrences of that tuple’s first item in the rest of the tuples (excluding the tuple I’m currently working with), so it would go something like this.

First iteration: ("x","y")

tl = [("x","y"),("y","z"),("y","a")]

Second iteration: ("y","z")

tl = [("x","z"),("y","z"),("z","a")]

Third iteration: ("z","a")

tl = [("x","a"),("y","a"),("z","a")]

What would be the most efficient way to achieve this? And would it be easier if I had a dictionary instead?
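The worked example above is hard to pin down exactly, so here is a minimal sketch under one assumed reading of the transformation: substitute one second-slot value for another across every pair. The `substitute` helper is my own invention, not from the question.

```python
# A sketch under an assumed reading of the transformation: replace one
# value with another in the second slot of every pair.
tl = [("x", "a"), ("y", "a"), ("z", "a")]

def substitute(pairs, old, new):
    """Return a new list with `old` replaced by `new` in each pair's second slot."""
    return [(first, new if second == old else second) for first, second in pairs]

print(substitute(tl, "a", "b"))  # [('x', 'b'), ('y', 'b'), ('z', 'b')]
```

As for the dictionary question: if the pairs were a dict (`{"x": "a", "y": "a", "z": "a"}`), the same pass is a comprehension over `.items()`, and looking a pair up by its first item becomes O(1) instead of a linear scan, which is the usual argument for switching.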

iphone – How do I get an app from my old phone onto a new phone without updating the app (via IPA file)?

This can be done from Configurator 2 if and only if you already have the .ipa file.

iTunes used to sync .ipa files back from the phone as part of its backup strategy. That practice stopped some years ago; backups have since relied entirely on OTA downloads of current versions from the App Store.

It may be possible to use the last, now-unsupported version of iTunes that allowed this to extract an .ipa from the current phone, which you could then re-sync to a newer phone… assuming the app will run on the newer phone.

The link to this last version, 12.6.x.x, is here – Apple KB: Deploy apps in a business environment with iTunes. However, it appears it will not run on later macOS versions without some effort (see MacRumors: iTunes 12.6.5.3 on MacOS Catalina), and apparently it will still work with an iPhone 8, but not an 11 (Reddit: Does iTunes 12.6.5.3 work with iOS 14?).

All of this does imply it might be a tougher job than it once was. I cannot test any of this empirically, I’m afraid.

As a long shot, you may have the .ipa buried deep within Time Machine, if it goes back far enough. This was actually how I achieved a similar task a few years ago.

rest api – 403 Forbidden when Updating (I presume POST?) but not Creating (PUT?) an image

I’m using the WordPressPCL library for .NET alongside WooCommerce (although my problem is with the WordPress API, not the WooCommerce API), and I’m trying to upload an image alongside my product.

My current process is

  1. Upload the image with WordPressPCL’s Media.Create() method, which I presume PUTs the image
  2. Create the product using WooCommerceNET, with the image’s URL as the product URL
  3. Update the image, setting the Media’s “Post” ID to the WooCommerce product ID

Steps 1 and 2 work fine and I can upload dozens of products, but on 3 I get a 403 Forbidden response which doesn’t make sense to me since I was able to upload the image.

The above order is important to me, because I absolutely do not want to create a product without the image already being on the server – therefore I need to guarantee the process fails if there’s a problem with either the image upload or the product upload. Hence I update the image to attach to the post afterwards.

I know that you don’t strictly need to attach the image to a post, but it’s a requirement from the customer, who uses the attachment to manage deletion of images.

Simplified code below (I’ve removed the details of error handling, object creation, etc.):

// Create the image
try
{
    uploadedImage = await wordPressClient.Media.Create(imagePath, fileName);    
}
catch
{
    // Image upload failed. Fail the process
}
            
// Image uploaded successfully, now create the product
try
{
    productImage = new ProductImage()
    {
        src = uploadedImage.SourceUrl
    };

    product.images.Add(productImage);

    // Replace with the product received from the server so that we have its ID
    product = await wooCommerceClient.Product.Add(product);

}
catch(Exception e)
{
    // Product creation failed, fail the process (and try to remove the image, but don't worry about it)
}

// Now try to link the image to the post
try
{
    uploadedImage.Post = (int)product.id;
    await wordPressClient.Media.Update(uploadedImage);
}
catch
{
    // Problem here
}

Is there some kind of API permission I need to set to allow my API user to update the data for an existing media item? Or something else I’m missing?

design – Continuously updating a state object bad practice?

Say I have a state machine which returns a State object. Depending on the State (“Home”, “Lost”, “Known Location”), I would like to enforce some logic; e.g., if the State is “Lost”, I need to drop a breadcrumb.

So I have a function goForWalk (which is maybe an MDP), and I am debating designs like the following:

def goForWalk():
    walk_state = State()
    for numSteps in walk:
        current_state = calculateState()
        if current_state == "Known Location":
            walk_state.updateState("Home")
        elif current_state == "Lost":
            walk_state.updateState("Lost")
    return walk_state

class State:
    def __init__(self):
        self.state = None

    def updateState(self, state):
        if state == "Lost":
            drop_breadcrumb()
        self.state = state

The constraint is that the terminal state needs to execute some logic based on its state, which could be updated continuously (I may go from “Lost” to “Known Location” to “Lost” again). I feel like there is a better way to do this; suggestions/pointers welcome!
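For reference, the design above can be made runnable as the following sketch; `walk` becomes a parameter and `calculateState`/`drop_breadcrumb` are replaced by stubs, since those pieces are not shown in the question.

```python
# Runnable sketch of the question's design; the stubs and the `walk`
# parameter are invented for illustration.
breadcrumbs = []

def drop_breadcrumb():
    breadcrumbs.append("crumb")

class State:
    def __init__(self):
        self.state = None

    def updateState(self, state):
        # The side effect is enforced at the single update point
        if state == "Lost":
            drop_breadcrumb()
        self.state = state

def goForWalk(walk):
    walk_state = State()
    for current_state in walk:
        if current_state == "Known Location":
            walk_state.updateState("Home")
        elif current_state == "Lost":
            walk_state.updateState("Lost")
    return walk_state

final = goForWalk(["Lost", "Known Location", "Lost"])
print(final.state, len(breadcrumbs))  # Lost 2
```

This demonstrates the repeated-update behaviour in the question: going Lost → Known Location → Lost drops two breadcrumbs.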

ubuntu – Fail2Ban is not updating iptables rules

I have set up fail2ban to protect my ssh port using these rather old instructions: https://www.digitalocean.com/community/tutorials/how-to-protect-ssh-with-fail2ban-on-ubuntu-14-04

I tested my setup by botching a bunch of log-ins from another computer, and fail2ban does manage to block the IP. I even confirmed it, as shown here:

$ sudo fail2ban-client status sshd
Status for the jail: sshd
|- Filter
|  |- Currently failed: 0
|  |- Total failed:     10
|  `- File list:        /var/log/auth.log
`- Actions
   |- Currently banned: 1
   |- Total banned:     2
   `- Banned IP list:   x.x.x.x

However, the aforementioned link also mentions that new rules should be added to iptables, but when I check, I don’t see anything:

$ sudo iptables -S | grep fail
$

Is this a problem? If so, any idea what I could be doing wrong?
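One detail worth noting (an assumption about this setup, since the tutorial linked above targets a much older fail2ban): fail2ban 0.9 and later names its chains f2b-&lt;jail&gt; rather than fail2ban-&lt;jail&gt;, so a grep for "fail" alone can come up empty even when ban rules exist. A filter that matches both naming schemes:

```shell
# Matches iptables output for both fail2ban chain naming schemes:
# f2b-<jail> (fail2ban >= 0.9) and fail2ban-<jail> (older releases).
match_fail2ban_rules() {
    grep -E 'f2b|fail2ban'
}

# Usage on the server (the jail chain name f2b-sshd is assumed):
#   sudo iptables -S | match_fail2ban_rules
#   sudo iptables -L f2b-sshd -n
```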

database – What is the standard method for updating client views of data?

I’m working with a client who currently uses a database system that they’ve outgrown, so we’re re-building it using a different platform that can handle their growth. One feature of their current system, built into the platform, is that updates to data are automatically propagated to others viewing the data in real time. So if multiple users see a particular record on their screen and one of them updates that record, everyone viewing the record will see the updated data immediately. There is record locking, so only one user can be editing the record at a time.

This feature is important to the client. Currently the most likely replacement for their current system is a web-based client accessing a back-end database. Obviously this auto-update feature isn’t built into such systems, so we need to recreate it. But we’re unsure of the best way to do so. I’ve thought of a few possibilities, such as each client tracking which records they’re viewing and periodically polling the server to see if those records have changed, or having the server do this tracking, and send a message to the clients if one of the records they are viewing changes.

But I’m pretty sure this is a solved problem. So is there a standard method in software engineering to deal with this?
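The polling possibility described above can be sketched roughly as follows; all the names here are hypothetical, and in-memory dicts stand in for the real database and transport:

```python
# Hypothetical sketch of the client-polling approach: each client
# remembers a version number per record it is viewing and periodically
# asks the server which of those records have changed.
class Server:
    def __init__(self):
        self.versions = {}  # record id -> version counter

    def update_record(self, record_id):
        # Every edit bumps the record's version
        self.versions[record_id] = self.versions.get(record_id, 0) + 1

    def changed_since(self, seen):
        """Return record ids whose version is newer than the client last saw."""
        return [
            rid for rid, ver in self.versions.items()
            if ver > seen.get(rid, 0)
        ]

class Client:
    def __init__(self, server):
        self.server = server
        self.seen = {}  # record id -> last version this client saw

    def poll(self):
        stale = self.server.changed_since(self.seen)
        for rid in stale:
            self.seen[rid] = self.server.versions[rid]
        return stale  # records whose on-screen views need refreshing

server = Server()
alice, bob = Client(server), Client(server)
server.update_record("order-17")
print(alice.poll())  # ['order-17']
print(alice.poll())  # []
print(bob.poll())    # ['order-17']
```

The server-push variant inverts this: instead of clients calling poll(), the server keeps the per-client "seen" state and sends the stale ids over a persistent connection as soon as update_record runs.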

updating – What is the best practice for handling abandoned packages required by the drupal core?

When I run composer update on a Drupal 9 website, I get this warning:

Package doctrine/reflection is abandoned, you should avoid using it. Use roave/better-reflection instead.

A similar question has already been asked, answered, and accepted on Stack Overflow.

To summarise the accepted answer:

  • edit your composer.json and replace the abandoned package with the recommended replacement
  • then run composer update again

However, that answer doesn’t apply in the case of “doctrine/reflection”. When I look in my composer.json, there is no mention of it, so there is nothing to edit.

So I am checking why it is required:

$ composer why doctrine/reflection
doctrine/common          2.13.3  requires  doctrine/reflection (^1.0)   
doctrine/persistence     1.3.8   requires  doctrine/reflection (^1.2)   
drupal/core              9.1.7   requires  doctrine/reflection (^1.1)   
drupal/core-recommended  9.1.7   requires  doctrine/reflection (1.2.2)  

So this abandoned package is required by drupal/core (and others).

Two questions:

  1. Why is the most recent version of drupal/core requiring an abandoned package (instead of its recommended replacement roave/better-reflection)?
  2. What is the best practice for handling abandoned packages required by the drupal core?

updating – Database Update Failed (entity_revision_metadata_bc_cleanup)

I’m not exactly new to Drupal or anything. I’ve upgraded a lot of sites from 6 to 7/8/9. This is a site running 9.1 that was recently upgraded from 8.x.

When I run the database update script I get an InvalidArgumentException. I’m not exactly sure how to deal with this error. I don’t have a lot to go on. It seems to mostly be core stuff. It’s not like I have any “crop” modules or anything installed.

Drupal\crop\CropStorage does not implement Drupal\Core\Entity\ContentEntityStorageInterface

Does anyone have tips on the best way to solve something like this?

system module
Update entity_revision_metadata_bc_cleanup
Failed: InvalidArgumentException: Drupal\crop\CropStorage does not implement Drupal\Core\Entity\ContentEntityStorageInterface in Drupal\Core\Entity\ContentEntityType->checkStorageClass() (line 52 of /Users/jneel/Sites/work/indiana-furniture-portal/web/core/lib/Drupal/Core/Entity/ContentEntityType.php).

Steps to Reproduce

Steps to reproduce really aren’t much. I have run composer update --with-dependencies. I’ve tried running the update from update.php and via drush updatedb. All the same.

>  (notice) Update started: system_post_update_entity_revision_metadata_bc_cleanup
>  (error)  Drupal\crop\CropStorage does not implement Drupal\Core\Entity\ContentEntityStorageInterface
>  (error)  Update failed: system_post_update_entity_revision_metadata_bc_cleanup
>  (error)  Update aborted by: system_post_update_entity_revision_metadata_bc_cleanup
>  (error)  Finished performing updates.

Composer.json

I’ll add my composer.json file too, just in case…

{
    "name": "drupal-composer/drupal-project",
    "description": "Project template for Drupal 8 projects with composer",
    "type": "project",
    "license": "GPL-2.0-or-later",
    "authors": (
        {
            "name": "",
            "role": ""
        }
    ),
    "repositories": (
        {
            "type": "composer",
            "url": "https://packages.drupal.org/8"
        }
    ),
    "require": {
        "php": ">=7.3",
        "composer/installers": "^1.2",
        "cweagans/composer-patches": "^1.6.5",
        "drupal/admin_toolbar": "^2.0",
        "drupal/administerusersbyrole": "^3.0",
        "drupal/anonymous_login": "^2.0",
        "drupal/antibot": "^1.4",
        "drupal/console": "^1.0.2",
        "drupal/core": "^9.0.0",
        "drupal/core-composer-scaffold": "^9.0.0",
        "drupal/core-recommended": "^9.0.0",
        "drupal/devel": "^4.1",
        "drupal/ds": "^3.4",
        "drupal/easy_breadcrumb": "^1.12",
        "drupal/editor_advanced_link": "^1.4",
        "drupal/editor_file": "^1.4",
        "drupal/file_management": "1.x-dev@dev",
        "drupal/fontawesome": "^2.14",
        "drupal/google_tag": "^1.2",
        "drupal/honeypot": "^2.0",
        "drupal/menu_admin_per_menu": "^1.0",
        "drupal/paragraphs": "^1.9",
        "drupal/paragraphs_edit": "^2.0",
        "drupal/pathauto": "^1.4",
        "drupal/quick_node_clone": "^1.12",
        "drupal/redirect": "^1.5",
        "drupal/redirect_after_login": "^2.5",
        "drupal/redis": "^1.1",
        "drupal/restui": "^1.19",
        "drupal/search_api": "^1.15",
        "drupal/search_api_page": "^1.0@beta",
        "drupal/sendgrid_integration": "^1.2",
        "drupal/simple_styleguide": "^1.5",
        "drupal/simplify_menu": "^2.0",
        "drupal/stage_file_proxy": "^1.0",
        "drupal/swiftmailer": "^2.0",
        "drupal/system_status": "^2.8",
        "drupal/taxonomy_menu": "3.x-dev@dev",
        "drupal/twig_tweak": "^2.4",
        "drupal/webform": "^6.0",
        "drush/drush": "^10.0.0",
        "platformsh/config-reader": "^2.2",
        "vlucas/phpdotenv": "^2.4",
        "webflo/drupal-finder": "^1.0.0",
        "webmozart/path-util": "^2.3"
    },
    "require-dev": {
        "drupal/coder": "^8.3",
        "drupal/core-dev": "^9.0.0",
        "drupal/drupal-extension": "^4.0",
        "mglaman/drupal-check": "^1.1",
        "mglaman/phpstan-drupal": "^0.12.8",
        "phpstan/phpstan": "0.12.64",
        "phpstan/phpstan-deprecation-rules": "^0.12.6",
        "phpunit/phpunit": "^8.4.1 || ^9",
        "squizlabs/php_codesniffer": "*",
        "zaporylie/composer-drupal-optimizations": "^1.1"
    },
    "conflict": {
        "drupal/drupal": "*"
    },
    "minimum-stability": "dev",
    "prefer-stable": true,
    "config": {
        "sort-packages": true
    },
    "autoload": {
        "classmap": (
            "scripts/composer/ScriptHandler.php"
        ),
        "files": ("load.environment.php")
    },
    "scripts": {
        "pre-install-cmd": (
            "DrupalProject\composer\ScriptHandler::checkComposerVersion"
        ),
        "pre-update-cmd": (
            "DrupalProject\composer\ScriptHandler::checkComposerVersion"
        ),
        "post-install-cmd": (
            "DrupalProject\composer\ScriptHandler::createRequiredFiles"
        ),
        "post-update-cmd": (
            "DrupalProject\composer\ScriptHandler::createRequiredFiles"
        )
    },
    "extra": {
        "composer-exit-on-patch-failure": true,
        "patchLevel": {
            "drupal/core": "-p2"
        },
        "installer-paths": {
            "web/core": ("type:drupal-core"),
            "web/libraries/{$name}": ("type:drupal-library"),
            "web/modules/contrib/{$name}": ("type:drupal-module"),
            "web/profiles/contrib/{$name}": ("type:drupal-profile"),
            "web/themes/contrib/{$name}": ("type:drupal-theme"),
            "drush/Commands/{$name}": ("type:drupal-drush")
        },
        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            },
            "file-mapping": {
                "(web-root)/sites/development.services.yml": false
            },
            "initial": {
                ".editorconfig": "../.editorconfig",
                ".gitattributes": "../.gitattributes"
            }
        }
    }
}

Updating a SQL Server Extended Event session to add data storage – is it possible?

I have an extended event session to track deadlocks, and have the data storage set up for Event_File, max file size of 20 MB, max number of files is 5, enable file rollover is true.

It looks like I can’t change the data storage, even after I stop the session; it’s all grayed out. I need to be able to increase the number of files, because for some reason it’s not actually reaching the max file size of 20 MB. I have 4 files already for this session, and the largest is only 47 KB.

How can I increase this? Do I just need to re-create the session? I already have history in it that I want to keep.

openGL – Updating instanced model transform in vbo every frame

I am using OpenGL to render a large number of models by instanced rendering (using LWJGL wrapper).
As far as I can tell I have implemented the instancing correctly, although, after profiling, I’ve come upon an issue.

The program is able to render a million cubes at 60fps when their model (world) transformations are not changing. Once I make them all spin though, the performance drops significantly. I deduced from the profiler that this is due to the way I write the matrix data to the VBO.

My current approach is to give each unique mesh its own VAO (so all instances of cubes come under one VAO), with one VBO for vertex positions, texture coordinates, and normals, and one instance array (VBO) for storing the per-instance model matrices. The vertex attributes are interleaved.

In order to make the cubes spin, I need to update the instance VBO every frame. I do that by iterating through every instance and copying the matrix values into the VBO.

The code is something like this:

float[] matrices = new float[models_by_mesh.get(mesh).size() * 16];

for (int i = 0; i < models.size(); i++){
    Model cube = models.get(i);
    float[] matrix = new float[16];
    cube.getModelMatrix(matrix);    // store the model matrix into the array
    System.arraycopy(matrix, 0, matrices, i * 16, 16);
}

glBindBuffer(GL_ARRAY_BUFFER, instance_buffers_by_mesh.get(mesh));
glBufferData(GL_ARRAY_BUFFER, matrices, GL_STATIC_DRAW);

//render

I realise that I create new buffer storage and a new float array every frame by calling glBufferData instead of glBufferSubData, but when I write:

//outside loop soon after VBO creation
glBufferData(GL_ARRAY_BUFFER, null, GL_DYNAMIC_DRAW); //or stream

//when updating models
glBufferSubData(GL_ARRAY_BUFFER, 0, matrices);

nothing displays. I’m not sure why; perhaps I’m misusing glBufferSubData, but that’s another issue.

I have been looking at examples of particle simulators (in OpenGL) and most of them update the instance VBO the same way as me.

I’m not sure what the problem could be, and I can’t think of a more efficient way of updating the VBO. I’m asking for suggestions / potential improvements to my code.

Many thanks 🙂