air travel – What’s the use of Schengen Airside Transit visas if the country list isn’t updated regularly?

So, the Schengen area maintains a list of 12 countries whose nationals need mandatory airside transit visas throughout the Schengen area. On closer inspection, the list was made in 2010 (probably from data from previous years) and hasn’t been updated since. What’s the use of having a list if it isn’t updated for a decade? The situations of these countries in 2020 are very different from 2010, and there may be new countries that warrant being added. For example, Syria is not on this list! Even after numerous asylum seekers over the past decade, the Schengen bureaucracy has still not added Syria to the list! That’s how efficient they are. On the other hand, a country like Sri Lanka is still on the list even though Sri Lankans are no longer among the top asylum claimants (Sri Lanka had issues that ended in 2009, which is probably why it was put there; unfortunately, the placement seems to be permanent).

It begs the question of why these countries have asylum policies in the first place if they try so desperately to prevent asylum seekers from arriving to apply. It makes no sense. Who are they trying to fool? Why not just abolish asylum requests from people arriving by air? Things would be much easier for everyone concerned: asylum seekers wouldn’t travel, and legitimate visitors would face much less hassle. I’m talking about airside transit visas here, not regular entry visas that people can overstay.

Last updated posts shortcode in functions.php

I’m trying to create a shortcode in my child theme’s functions.php that shows the most recently updated posts with their thumbnails.
I have this code, which works in a page template:

    <ol class="list-numbered">
    <?php
    // Show recently modified posts
    $recently_updated_posts = new WP_Query( array(
        'post_type'      => 'post',
        'posts_per_page' => 13,
        'orderby'        => 'modified',
        'no_found_rows'  => true, // speed up query when we don't need pagination
    ) );
    if ( $recently_updated_posts->have_posts() ) :
        while ( $recently_updated_posts->have_posts() ) : $recently_updated_posts->the_post(); ?>
            <li>
                <a href="<?php the_permalink(); ?>" rel="bookmark" title="<?php the_title_attribute(); ?>"><?php the_title(); ?></a>
                <?php
                $size        = 'thumbnail';
                $attachments = get_children( array(
                    'post_parent'    => get_the_ID(),
                    'post_status'    => 'inherit',
                    'post_type'      => 'attachment',
                    'post_mime_type' => 'image',
                    'order'          => 'ASC',
                    'orderby'        => 'menu_order ID',
                    'numberposts'    => 1,
                ) );
                foreach ( $attachments as $thumb_id => $attachment ) {
                    echo wp_get_attachment_image( $thumb_id, $size );
                }
                ?>
            </li>
        <?php endwhile; ?>
        <?php wp_reset_postdata(); ?>
    <?php endif; ?>
    </ol>

But I’d like to move this into the functions.php file and call it as a shortcode.
Does anyone have some ideas?
Thanks in advance.

stream processing – How to keep a data warehouse updated?

Suppose there is a system (like an ERP) that writes to a database (not too big, less than 100 GB). You need to export the data from this database to a data warehouse (like Redshift or BigQuery) as many times a day as you can. What would be a good solution for that?
The system has a feature that exports only the delta, so this is what I was thinking:

1 – Write an ETL script to query the delta, format in Avro and save it in a bucket ( GCS or S3 )
2 – Trigger a function when the object is inserted, get the object and insert into a staging table ( one for each table in the origin DB )
3 – Trigger a function to merge the staging table into the main table
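Steps 2 and 3 can be sketched with an in-memory SQLite database standing in for the warehouse (all table and column names here are invented for illustration): load the delta into a staging table, then upsert it into the main table in a single statement. SQLite spells MERGE as `INSERT ... ON CONFLICT`; BigQuery supports `MERGE INTO`, and on Redshift the same effect is usually a delete-and-insert inside a transaction.

```python
import sqlite3

# In-memory SQLite stands in for the warehouse; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders         (id INTEGER PRIMARY KEY, amount REAL, last_modified TEXT)")
conn.execute("CREATE TABLE staging_orders (id INTEGER PRIMARY KEY, amount REAL, last_modified TEXT)")

# Existing warehouse rows.
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10.0, "2020-01-01"), (2, 20.0, "2020-01-01")])

# Step 2: the exported delta lands in the staging table
# (row 2 was updated at the source, row 3 is new).
conn.executemany("INSERT INTO staging_orders VALUES (?, ?, ?)",
                 [(2, 25.0, "2020-02-01"), (3, 30.0, "2020-02-01")])

# Step 3: merge staging into the main table in one set-based statement
# (the "WHERE true" is required by SQLite's upsert-from-SELECT grammar).
conn.execute("""
    INSERT INTO orders (id, amount, last_modified)
    SELECT id, amount, last_modified FROM staging_orders WHERE true
    ON CONFLICT(id) DO UPDATE SET
        amount        = excluded.amount,
        last_modified = excluded.last_modified
""")
conn.execute("DELETE FROM staging_orders")  # staging is transient

print(conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall())
# [(1, 10.0), (2, 25.0), (3, 30.0)]
```

The key point is that the merge is one bulk operation per batch, not a per-row update loop.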

I’m not too happy with this approach because it feels so limited. I think I’m missing something here. Should data in a DW be this hard to maintain? I see a lot of examples of how to insert data into a DW, but very few on how to keep it updated.

Also, suppose this delta mechanism didn’t exist and we had to use a streaming solution (like Kinesis). That would make things even harder: data would arrive in the bucket much faster, generating lots of small files. How could I handle a scenario like that, given that data warehouses are slow to update row by row (BigQuery even limits the number of updates per day)?
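For the streaming case, one common workaround is micro-batching: buffer the incoming records (or small files) and issue one bulk load per size or time threshold, so the warehouse sees a few large loads instead of thousands of row-level updates. A minimal sketch, with all names and thresholds invented for illustration:

```python
import time

# Hypothetical micro-batching buffer: instead of loading every small file
# into the warehouse as it arrives, accumulate records and flush them as
# one batch load when a size or age threshold is reached.
class MicroBatcher:
    def __init__(self, load_fn, max_rows=1000, max_age_s=300):
        self.load_fn = load_fn          # e.g. a bulk COPY / load job
        self.max_rows = max_rows
        self.max_age_s = max_age_s
        self.buffer = []
        self.opened_at = time.monotonic()

    def add(self, rows):
        self.buffer.extend(rows)
        if (len(self.buffer) >= self.max_rows
                or time.monotonic() - self.opened_at >= self.max_age_s):
            self.flush()

    def flush(self):
        if self.buffer:
            self.load_fn(self.buffer)   # one batch load, not row-by-row DML
            self.buffer = []
        self.opened_at = time.monotonic()

loads = []
batcher = MicroBatcher(loads.append, max_rows=3, max_age_s=300)
for row in range(7):                    # 7 incoming "streamed" records
    batcher.add([row])
batcher.flush()                         # final flush at end of interval
print([len(batch) for batch in loads])
# [3, 3, 1]
```

With thresholds tuned to the warehouse’s load limits, this keeps well under quotas like BigQuery’s per-day DML caps.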

bitcoincore development – SCAM WARNING: Bitcoin Private Key Finder V1.2 01.04.2020 Updated and Bitcoin Private Key Generator V2.4 advertised on and other sites

Is anyone aware of this scam?

There are many scams associated with “bitcoin”. Please be aware of the scam on

This site offers a tool that supposedly helps compile information, not necessarily find a private key. However, it could possibly be a useful tool “if” the software even worked.

Note: on the website, “”, the price is clearly advertised as “0.0008 BTC (Promotional Price)”, as follows:

“This is the real BITCOIN PRIVATE KEY FINDER, used in generating private key for any bitcoin address Pay 0.008 BTC (Promotion Price) Pay to the following wallet address. Send us your transaction ID by entering your e-mail information in the contact form below. After the payment is received, the download link will be sent to your e-mail address.”

Scammers’ advertised price of .0008 BTC

Then, the website also says “No Hidden Fees” and “Money Back Guarantee” on the third flashcard at the top of the page.


You want to pay Only Software cost. not a any Hidden Fee. because It comes with 100% Money Back Guarantee
we offer full support after purchase.
all our software is full version.
Now buy bitcoin private key finder.” Scammers Guarantee of "No Hidden Fess" and 100% Money Back Guarantee"

Except the scam works like this:

Pay “0.0008 BTC” for the software, with no hidden fees.
Then the “hidden fee” extortion comes after download: they demand an additional “0.0018 BTC” for the password to an AES-256-encrypted RAR file before you can use the software, after you have already paid the only advertised price.

The software does not work unless you pay the “hidden fee” for the password, and they do not provide a “money back guarantee”, since the hidden fee exists and the software does not work without you being extorted for more coins.

Anything associated with these scammers should not be tolerated. The site offers both a BİTCOİN PRİVATE KEY FİNDER V1.2 01.04.2020 UPDATED and a BİTCOİN PRİVATE KEY GENERATOR V2.4 UPDATED on and other sites.

I would assume that, because the BİTCOİN PRİVATE KEY FİNDER V1.2 01.04.2020 is a scam that extorts extra payment to make the software work, the BİTCOİN PRİVATE KEY GENERATOR V2.4 UPDATED is also a scam.

This scammer also uses a crawler to advertise their site throughout search engine results, always leading back to the culprit.

I have uploaded the still-locked, password-protected RAR file in case anyone wants to take a crack at it, because, after all, the scammer is screwing all of us, so why not screw them back?

Have a nice day, and stay away, from this scammer!


Link to rar file, AES 256 password protected (Good luck, share if you break).

Additional notes and lies by the scammer, contrary to the website advertisement…

My proof of payment of the advertised “.0008 BTC”, txid: 1df0f4a755d7a8c005dca0aac35f89ba64cc6bf8afa30eb9cf77b6fd7c060bc2

Scammer remarks 1

Scammer remarks 2

algorithms – Do Predecessors Ever Get Updated in Depth-First Search?

Is it true to say that in a Depth-First Search that uses a dictionary of predecessors to find the path to its destination, once a node has been pushed onto the stack and been added to the predecessors dictionary, there will never be a need to update the predecessor for that particular node? If so, why is that?

The reason I’m not clear is that, in my understanding, the algorithm can backtrack to a previous node and start searching from there. When it does this, can we assume that the node the algorithm “gave up on” will never be part of the solution path?

That is to say, is it the case that, once visited, a cell is either part of the solution path or never will be?

Below is my implementation in Python. It seems correct, but I’m niggled by the fact that predecessors are never updated. Are there any situations where this would prevent the algorithm from giving a correct path to a reachable goal?

offsets = {
    "right": (0, 1),
    "left": (0, -1),
    "up": (-1, 0),
    "down": (1, 0),
}

def is_legal_pos(maze, pos):
    i, j = pos
    num_rows = len(maze)
    num_cols = len(maze[0])
    return 0 <= i < num_rows and 0 <= j < num_cols and maze[i][j] == 0

def get_path(predecessors, start, goal):
    current = goal
    path = []
    while current != start:
        path.append(current)
        current = predecessors[current]
    path.append(start)
    path.reverse()
    return path

def dfs(maze, start, goal):
    stack = [start]
    predecessors = {start: None}

    while stack:
        current_cell = stack.pop()
        if current_cell == goal:
            return get_path(predecessors, start, goal)
        for direction in ("up", "right", "down", "left"):
            row_offset, col_offset = offsets[direction]
            neighbour = (current_cell[0] + row_offset, current_cell[1] + col_offset)
            if is_legal_pos(maze, neighbour) and neighbour not in predecessors:
                predecessors[neighbour] = current_cell
                stack.append(neighbour)

    return None

maze = [[0] * 3 for row in range(3)]
maze[1][0] = 1  # obstacle
start_pos = (0, 0)
goal_pos = (2, 2)
result = dfs(maze, start_pos, goal_pos)
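To see why a predecessor set on discovery never needs updating: every entry in the predecessors dictionary records a real edge of the graph, so walking the chain back from the goal always yields a valid path; it just may not be the shortest one. A tiny made-up graph (not part of the maze code above) illustrates this:

```python
# Tiny hypothetical graph: D is reachable via both B and C, but only the
# first discovery sets D's predecessor, and it is never changed afterwards.
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}

def dfs_path(graph, start, goal):
    stack = [start]
    predecessors = {start: None}
    while stack:
        node = stack.pop()
        if node == goal:
            # Walk the recorded edges back to start: always a valid path,
            # though not necessarily the shortest one.
            path = []
            while node is not None:
                path.append(node)
                node = predecessors[node]
            return path[::-1]
        for neighbour in graph[node]:
            if neighbour not in predecessors:  # set once, never updated
                predecessors[neighbour] = node
                stack.append(neighbour)
    return None

print(dfs_path(graph, "A", "D"))
# ['A', 'C', 'D'] -- D keeps C as its predecessor even though B also reaches it
```

Note that this says nothing about shortest paths: if you wanted the shortest path in an unweighted graph, you would use BFS with the same never-updated predecessor scheme.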

Has anyone updated 20.04 with new kernel 5.4.0-39 yesterday resulting in nvidia driver problems?

Yesterday Ubuntu upgraded the kernel from 5.4.0-37 to 5.4.0-39. Upon reboot I lost the second monitor, the mouse was jerky, and the display was sluggish. I downgraded to 5.4.0-37 and all issues were resolved. The Nvidia driver is 440, and the graphics adapter is a GeForce 1050 Ti. I updated my other machine, with Intel graphics, with no issue.