woocommerce offtopic – Is it safe to delete orphaned posts from the database, i.e. posts whose post_parent no longer exists?

I inherited a couple of WP + WooCommerce shops, each with roughly 30,000 products for sale and a wp_postmeta table of over a million rows. The former webmasters are long gone.

In an effort to clean up old stuff, I noticed a post_parent field in wp_posts, and queried the database for orphans:

SELECT ID FROM wp_posts
WHERE post_parent NOT IN (SELECT ID FROM wp_posts)
  AND post_parent > 0

and found thousands of records.
All of these records have a post_parent which no longer exists.

Just out of curiosity, I checked wp_postmeta:

SELECT * FROM wp_postmeta WHERE post_id IN
  (SELECT ID FROM wp_posts WHERE post_parent NOT IN
     (SELECT ID FROM wp_posts) AND post_parent > 0)

and found 60,000 records.

Is it safe to delete them, along with any references to them in the tables wp_postmeta, wp_comments, wp_commentmeta, wp_term_relationships and wp_wc_product_meta_lookup?

Otherwise, can you suggest a strategy for cleaning spurious data out of the database?
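
For reference, this is the kind of cleanup I'm considering, sketched for wp_postmeta and wp_posts only. The multi-table DELETE approach is my own idea, not something I've seen recommended for WooCommerce specifically; I would take a full backup and test on a copy first:

-- A sketch only: run against a backup copy first, and compare the
-- affected row counts with the SELECTs above before trusting it.
START TRANSACTION;

-- Meta rows belonging to the orphaned posts.
DELETE pm FROM wp_postmeta pm
JOIN wp_posts p ON p.ID = pm.post_id
LEFT JOIN wp_posts parent ON parent.ID = p.post_parent
WHERE p.post_parent > 0 AND parent.ID IS NULL;

-- The orphaned posts themselves.
DELETE p FROM wp_posts p
LEFT JOIN wp_posts parent ON parent.ID = p.post_parent
WHERE p.post_parent > 0 AND parent.ID IS NULL;

COMMIT;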

development – Make SetIsOriginAllowed Safe for SharePoint WebAPI?

I’m working on a test WebAPI for SharePoint that is secured with Azure AD via bearer tokens. At first, I manually specified every allowed origin with a statement like this inside ConfigureServices of Startup.cs:

services.AddCors(options =>
{
    // https://docs.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-5.0
    // https://www.yogihosting.com/aspnet-core-enable-cors/
    options.AddPolicy(
        "SharePointOnline",
        builder =>
        {
            // One entry per installed Add-in.
            builder.WithOrigins(
                "https://myExampleTenantName-3a3324d2e332a3.sharepoint.com",
                "https://myExampleTenantName-3a3324d2e332a4.sharepoint.com",
                "https://myExampleTenantName-3a3324d2e332a5.sharepoint.com"
            ).AllowAnyHeader().AllowAnyMethod().AllowCredentials();
        }
    );
});

Unfortunately, this got to be annoying pretty quickly. Every time I installed a SharePoint Add-in, the Add-in would get a new AppHash similar to the “3a3324d2e332a3”, “3a3324d2e332a4” and “3a3324d2e332a5” shown above, and I would then have to go into the WebAPI and add an origin entry for the newly deployed Add-in. I thought about reading the origins from a config file or database, but then I stumbled across a wildcard method by Granger (and others) over on SO: https://stackoverflow.com/questions/36877652/configure-cors-to-allow-all-subdomains-using-asp-net-core-asp-net-5-mvc6-vnex
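
(For what it’s worth, the config-file idea would have looked roughly like this; the “AllowedOrigins” key is just a name I made up:)

// Requires: using Microsoft.Extensions.Configuration; (for .Get<string[]>())
// appsettings.json (hypothetical key):
//   "AllowedOrigins": [
//     "https://myExampleTenantName-3a3324d2e332a3.sharepoint.com"
//   ]
var allowedOrigins = Configuration
    .GetSection("AllowedOrigins")
    .Get<string[]>();

services.AddCors(options =>
{
    options.AddPolicy(
        "SharePointOnline",
        builder => builder.WithOrigins(allowedOrigins)
            .AllowAnyHeader()
            .AllowAnyMethod()
            .AllowCredentials()
    );
});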

So then I tried this, but I’m wondering if it’s “safe”:

// Requires: using System.Text.RegularExpressions;
services.AddCors(options =>
{
    // Granger:
    // https://stackoverflow.com/questions/36877652/
    //    configure-cors-to-allow-all-subdomains-using-asp-net-core-asp-net-5-mvc6-vnex
    // https://docs.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-5.0
    // https://www.yogihosting.com/aspnet-core-enable-cors/
    options.AddPolicy(
        "SharePointOnline",
        builder =>
        {
            // Verbatim string (@"...") so \. reaches the regex engine.
            builder.SetIsOriginAllowed(
                o => Regex.IsMatch(
                    o,
                    @"https://myExampleTenantName.*\.sharepoint\.com"
                )
            ).AllowAnyHeader()
            .AllowAnyMethod()
            .AllowCredentials();
        }
    );
});

While the RegEx “https://myExampleTenantName.*\.sharepoint\.com” will match all of my Add-ins regardless of AppHash, I’m afraid someone could go into SharePoint, create a tenant named myExampleTenantNameEvil, and fool my RegEx into allowing requests from their “Evil” origin.

I’m thinking that I can tighten my RegEx by using something like https://myExampleTenantName-[a-f0-9]+\.sharepoint\.com (see the sketch below), but I’m still concerned that even that might not be “safe”.
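
Here is roughly what I mean. Note that the ^/$ anchors and the assumption that the AppHash is always lowercase hex are my own guesses, not anything documented by SharePoint:

// Requires: using System.Text.RegularExpressions;
// Anchored so that lookalike tenants such as
// https://myExampleTenantNameEvil-abc123.sharepoint.com are rejected.
private static readonly Regex SharePointOrigin = new Regex(
    @"^https://myExampleTenantName-[a-f0-9]+\.sharepoint\.com$",
    RegexOptions.Compiled);

// Then, inside the CORS policy:
// builder.SetIsOriginAllowed(o => SharePointOrigin.IsMatch(o))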

To make matters worse, I think I’ve read in various places, such as Stack Overflow, that you can’t rely on the Same-Origin Policy to protect against Cross-Site Request Forgery (XSRF/CSRF) attacks anyway…

Is there a RegEx or some C# I can use inside the SetIsOriginAllowed function to only allow origins from SharePoint Add-ins running on tenant myExampleTenantName?

NOTE: I know this question is very focused on the WebAPI aspect of the application chain, but the problem stems from the fact that SharePoint insists on appending that irritating AppHash to my tenant name to form the FQDN for installed Add-ins.

magento2 – Magento 2 – Is it safe to remove the DEFINER=.. statement from sqldump?

I have a web server that serves Magento 2 and a MySQL server that hosts the database; both run on a single Ubuntu instance.

I would like to move the database to a separate RDS MySQL instance, so I dumped it with mysqldump as the root user, but while importing it into the RDS instance I get:

"ERROR 1227 (42000) at line 2873: Access denied; you need (at least one of) the SUPER privilege(s) for this operation"

I searched for a solution and removed the DEFINER clauses with sed 's/\sDEFINER=`[^`]*`@`[^`]*`//g' -i dumpedfile.sql, and after removing DEFINER the import worked without any error.
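
For context, the clause that sed command strips looks like this in the dump (an illustrative example; the account name, trigger, and table are made up):

-- Before: the dumped object is pinned to a specific MySQL account.
CREATE DEFINER=`root`@`localhost` TRIGGER trg_example
    BEFORE INSERT ON some_table FOR EACH ROW SET NEW.created_at = NOW();

-- After removing the clause, the importing user becomes the definer.
CREATE TRIGGER trg_example
    BEFORE INSERT ON some_table FOR EACH ROW SET NEW.created_at = NOW();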

My question is: what exactly is DEFINER, and is it safe to remove it from the dump of a Magento 2 database?

I will manually do 45 PR9 + 20 EDU/GOV Safe SEO High Pr Backlinks 2021 Best Results for $10

I will manually do 45 PR9 + 20 EDU/GOV Safe SEO High PR Backlinks 2021 Best Results.


Manually created 45 PR9 and 20 EDU/GOV back-links from the BIGGEST INTERNET AUTHORITY SITES, like TED.com or MIT.edu. Successful SELLER ON SEOClerks. Get in touch with AaronSEO; you will never be unsatisfied. 100% trusted and POSITIVE backlink SERVICE.

PR9-7: I will create 45 PR9 backlinks using some of the biggest PR9 – PR7 authority domains (high PageRank is on the root domain, not the page), anchored and non-anchored, some with about-me text, some without, which is the most natural, search-engine-friendly technique to use as it doesn’t look spammy. I’ll then send your URLs to my premium indexer.

EDU/GOV: I will create 20 EDU/GOV backlinks using some of the biggest PR9-2 authority domains (high PageRank is on the root domain, not the page), anchored and non-anchored, some with about-me text, some without.

PR9-7 BACKLINKS FROM: APPLE.COM, WORDPRESS.ORG, OPENSTREETMAP.ORG, LIVEINTERNET.RU, APACHE.ORG, MOZILLA.ORG, QUALTRICS.COM, Adobe.com

EDU/GOV BACK-LINKS FROM: FUDAN.EDU.CN, Mit.edu, Academia.edu, Berkeley.edu

REMEMBER: it’s not about throwing a ton of low-quality URLs at your site; that just doesn’t work. A handful of good, high-quality links from trusted domains like these will do more for your SEO efforts.


I Will Manually Do 25 Pr9 Da 80 Safe Seo High Authority Backlinks


We all know how important links are to every website, and Google loves backlinks from authority sites.

Our service, exclusively on SEOClerks, is to create 25 PR9 DA 80+ manual high-PR backlinks to your website from authority domains.

We manually create 25 PR9 DA 80+ backlinks from the BIGGEST INTERNET AUTHORITY SITES, like the Apple website. Get in touch with SEO_MAXIMUS; you will never be unsatisfied. 100% trusted and POSITIVE backlink SERVICE.

PR9-7: We will create 25 PR9 backlinks using some of the biggest PR9 – PR7 authority domains (high PageRank is on the root domain, not the page), anchored and non-anchored, some with about-me text, some without, which is the most natural, search-engine-friendly technique to use as it doesn’t look spammy. I’ll then send your URLs to my premium indexer.


Some sample links:

** WordPress.com

** Microsoft.com

** Amazon.com

** About.me

** Behance.net

** Issuu.com

** Pinterest.com

** Disqus.com

** Flickr.com

** Nature.com & others…

PRICE: $15
PayPal

Manually Do 50 Pr9 DA 80+ Safe SEO High Authority Backlinks 30+ Domain HIGH QUALITY BACKLINKS for $5

Manually do 50 PR9 DA 80+ safe SEO high authority backlinks, 30+ domains, HIGH QUALITY BACKLINKS.

Maybe you found my gig through Google, where I rank #1 for very competitive SEO keywords, or through seoclerk.com search results, where I’m also #1. Are you wondering why?

IT’S SIMPLY BECAUSE I KNOW HOW TO DOMINATE SEO!

Let me do the same for YOU!

These days Google doesn’t mess around and you shouldn’t either. If you want a real boost in your rankings, then you absolutely NEED high PR/DA backlinks!

I will manually create and execute a backlink-building strategy for maximum PROVEN algorithmic movement in the SERPs for your site, exclusively on seoclerk!

As part of this limited offer, I will give you 10 (temporarily 30!) powerful permanent links from some of the biggest PR 7 – 10 (DA 80+) Authority sites on the planet like:

Amazon

Adobe

Microsoft

Sony

Java

IBM

TED

Flickr

Disqus

Dell

Intel and more!

I work with white-hat manual methods, 100% Google Panda 3.0, Penguin 4.1 & Hummingbird safe, based on the latest Google updates. With natural high-PR backlinks and SEO technique, your site will surely start ranking, which the Google algorithm loves!

You can use this service for your money site, blog, or video channel.

★Benefits of our service★

✓ 93% saw a rankings increase!

✓ 100% SAFE, white-hat and manual work.

✈ Fast delivery within 24 hours.

✔ Full report with usernames & passwords to control these links!

✔ 100% white hat

✓ 54% (low competition) ranked on the first page of Google and began receiving high-quality traffic!


Safe deployment for database content

My application is deployed on several servers that all read data from a single database.

Following safe deployment practices, code deployments happen first on a subset of servers and only continue to the remaining servers if no issues are detected.

I’d also like to apply safe deployment practices to some critical data stored in the database. When new rows are appended to a particular table, I’d like them to be consumed only by a subset of servers at first; the remaining servers will consume the data only if no issues are detected for a period of time.

How can I do this, given that I only have a single database?
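
One idea I’ve been toying with is to tag each row with a deployment stage and have each server filter on the stage it is configured for. A sketch, where the table name, the release_stage column, and the created_at timestamp are all made up for illustration:

-- A sketch only: critical_data, release_stage and created_at are
-- illustrative names, not my real schema.
ALTER TABLE critical_data
    ADD COLUMN release_stage ENUM('canary', 'stable')
    NOT NULL DEFAULT 'canary';

-- The canary subset of servers consumes everything:
SELECT * FROM critical_data;

-- The remaining servers consume only promoted rows:
SELECT * FROM critical_data WHERE release_stage = 'stable';

-- After the bake period passes with no issues, promote the new rows:
UPDATE critical_data
SET release_stage = 'stable'
WHERE release_stage = 'canary'
  AND created_at < NOW() - INTERVAL 1 DAY;

But this feels like re-implementing feature flags at the row level, so I’d be happy to hear about a more standard pattern.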