reductions – Proving B-Min-Cost Strongly Connected Subgraph is NP-Complete

We have a strongly connected directed graph where each edge has a positive integer weight. We are also given a bound $B \in \mathbb{N}$. Does there exist a strongly connected subgraph whose sum of edge weights is $\leq B$?

I’m not sure which NP-complete problem to reduce from. I tried Subset Sum, but the graph I had in mind takes exponential time to build (a yes/no decision tree with edge costs equal to the numbers in the Subset Sum instance, and a final edge back to the root at the end of every decision sequence). Any ideas? Is there a better NP-complete problem to use?

Microsoft Remote Desktop blank screen when host PC’s (Win10) monitor is off or not connected; LAN connections also don’t work

As per the title, every time I try to connect using the Android or iOS clients (the official MS ones), I get an unresponsive black screen when the host PC’s monitor is off or disconnected. All my power settings are OK (no sleep, no hibernation, no auto-disabling of the network card to save power, etc.). I’m really at a loss, as it doesn’t give any error messages.

Another strange issue I’m having is that all local network connections fail. Every attempt says the PC cannot be found, regardless of whether I use the IP address or the PC name. I’ve disabled Windows Firewall too. It works fine over the internet, though.

Any ideas?

Thanks a bunch!

rooting – Block IP/Domain (youtube.com, facebook) and other distracting sites connected via phone’s hotspot

Is there a way to block youtube.com and other distracting sites like facebook.com when my laptop is connected via Android hotspot, i.e. by blocking the sites through the phone’s hosts file? The following TechRepublic article makes me think it’s possible, but I’m not sure, so I’m asking here.

https://www.techrepublic.com/article/edit-your-rooted-android-hosts-file-to-block-ad-servers/

To edit the phone’s hosts file, the Android phone must be rooted, so I want to be sure this works before rooting my phone.
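For reference, the approach in that article amounts to adding entries like the following to /etc/hosts on the rooted phone (a sketch with example domains; whether tethered clients actually resolve DNS through the phone’s hosts file is exactly the part I’m unsure about):

```
# /etc/hosts on the rooted phone (example entries)
127.0.0.1   localhost
0.0.0.0     youtube.com
0.0.0.0     www.youtube.com
0.0.0.0     facebook.com
0.0.0.0     www.facebook.com
```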

If this is not the correct approach, what other options are available?

Thanks

graphs – Representation of connected components in the $O(|E|)$ time/space variant of Karger’s algorithm

I’m trying to understand the various optimizations given in the original 1992 paper on Karger’s algorithm. Specifically, looking at section “3.1 Unweighted Graphs”, I don’t understand what data structure is used in the variant of the algorithm that runs in $O(|E|)$ time and space (the other variant, which uses union-find and takes $O(|V|)$ space and $O(|E| \log |E|)$ time, makes sense).

Specifically: how would you represent the edges so that it is possible to

  1. track the connected components induced by a sequence of $m$ edges in a random order in $O(m)$ time?
  2. contract a sequence of $m/2$ edges in a random order in $O(m)$ time?

I must be missing something obvious, but I couldn’t find any implementation for this variant (everyone seems mostly interested in implementing the union-find variant of the algorithm).
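For reference, here is my own minimal sketch of the union-find variant that I do understand (shuffle the edges, union endpoints until two components remain, then count the edges crossing the resulting cut); what I’m after is the analogous representation for the $O(|E|)$ variant:

```python
import random

def find(parent, x):
    # Path-halving find.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def karger_cut(n, edges, seed=None):
    """One contraction run on a multigraph with vertices 0..n-1.
    Returns the size of the cut found by this run."""
    rng = random.Random(seed)
    order = edges[:]
    rng.shuffle(order)                   # random edge order drives the contractions
    parent = list(range(n))
    components = n
    for u, v in order:
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:
            parent[ru] = rv              # contract the edge
            components -= 1
            if components == 2:
                break
    # Edges whose endpoints ended up in different components cross the cut.
    return sum(1 for u, v in edges if find(parent, u) != find(parent, v))
```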

TensorFlow 2 Python 3 Fully Connected Neural Network

I would really appreciate constructive feedback and suggestions for this fully-connected neural network I have written with TensorFlow 2, Python 3. It estimates the period and amplitude of a sine curve given 100 sample y-points on the curve.

I am mainly interested in how this can be optimised for speed, how the styling and readability could be improved, whether I have made any strange programming choices, and whether there’s a powerful DL technique I’m missing that might be useful. Snippets of this script will be used in a pedagogical document to introduce DNN concepts alongside the TensorFlow methods, so improvements given this context would be very beneficial.

One immediate flaw I can address later is that the model size is currently far too big, so it will not actually generalise outside the range of parameters in the training examples. That’s fine; for now I just wanted to get everything working, and I will tune hyperparameters and do regularisation later.

Any advice would be deeply appreciated. Additionally, I have yet to implement a normalisation layer; I would like it to normalise inputs (bearing in mind that they are curves) using statistics from the whole training set, and to be applied automatically to inputs once the model is trained. I have also yet to vectorise the make_curve function. Suggestions for either of these next steps would be fantastic as well.

This is of course a toy problem, and I will adapt the network to a different problem in which I will be interested in efficiency and high-dimensional inputs. I have access to both cluster CPU and GPU cores, as well as my laptop with a GeForce GTX 1050 Ti Max-Q GPU, so I would be interested in optimising this to take advantage of the available parallel computing.

The 3D plot is just for fun and shows how the squared error of a prediction blows up for degenerate cases such as zero period or amplitude. Would I be right to assume that a network which has generalised well would have better error specifically at these boundaries?

With the current settings, this takes about 2.5 minutes to run on my 2018 Dell XPS laptop (on its CPU, I believe) with the following output:

Average test loss:  8.497130045554286
Average val loss:  7.136056077585638
Time taken:  146.38214015960693

Here is the code:

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Dense, Dropout, BatchNormalization
import tensorflow.keras.backend as kb
from tensorflow import math as tm

import math, time
import numpy as np
from datetime import datetime

#import warnings
#warnings.filterwarnings("ignore")
kb.set_floatx('float64')

start = time.time()

num_sample_pts = 100
train_size     = (5*10**2)**2  # Funky format so it is a square and easily configurable during development
train_sqrt     = int(train_size**0.5)
epoch_nums     = 2**3
minibatch      = 2**6          # Remember to have minibatch << train_size*epochs
callbacks      = False         # TensorBoard logging is much slower than the learning itself 

learning_rate  = 0.00063
num_layers     = 28
reg1_rate      = 0.001
reg2_rate      = 0.001
act_func       = 'elu'
dropout        = 0.2
units          = 37

def make_curve(period, amp, aug=False):
    # TODO: vectorise this so that the random augmentations are drawn as vectors.
    return ( amp * np.sin(( np.linspace(-4, 4, num_sample_pts) + aug*np.random.rand() ) * (2*np.pi/period))
            + aug*np.random.rand() ).reshape(num_sample_pts)

def data(sample_interval = (-4, 4), amp_interval = (0, 30), aug=False, gridsize=train_sqrt):
    sample = np.linspace(*sample_interval, num_sample_pts)
    mesh   = np.meshgrid(0.05 + 10*np.pi*np.random.rand(gridsize),          # Periods
                         amp_interval[0] + (amp_interval[1] - amp_interval[0]) *
                         np.random.rand(gridsize))                          # Amplitudes
    pairs  = np.array(mesh).T.reshape(-1, 2)
    curves = np.array([make_curve(w, a, aug) for w, a in pairs])            # Change when make_curve is vectorised
    glob_centre, glob_max = np.mean(curves), max(amp_interval)              # Globally centre all curves within pm 1
    curves = (curves - glob_centre) / glob_max                              # - Replace with a normalisation layer
    return (curves, pairs)
# Returns list of 2 arrays (x,y):
# - x is an array of each curve sample list
# - y is an array of the period-amplitude pairs corresponding to curve samples in x

df = data(aug=True)

# Not used. Another possibility could be rmse against a sample from predicted curve
def custom_mean_percentage_loss(y_true, y_pred):   # Mean absolute percentage error
    diff = y_pred - y_true
    non_zero = y_true + 10**-8
    res = tm.reduce_mean(tm.abs(tm.divide(diff,non_zero)))
    return res

if callbacks:
    # Forward slashes: a trailing backslash inside the string literal is a syntax error
    logdir = "logs/scalars/" + datetime.now().strftime("%Y%m%d-%H%M-%S")
    tensorboard_callback = [keras.callbacks.TensorBoard(log_dir=logdir, update_freq='epoch')]
else:
    tensorboard_callback = []


def model_builder():
    initializer = keras.initializers.TruncatedNormal(mean=0., stddev=0.5)
    model = keras.Sequential()
    model.add( Dense( units = num_sample_pts,                           # Number of input nodes equals input dimension
                      kernel_initializer = initializer,                 # Initialize weights
                      activation = act_func,
                      kernel_regularizer = keras.regularizers.l1_l2(reg1_rate, reg2_rate),
                      dtype='float64'))
    model.add( BatchNormalization())                # Layers must be add()-ed, or they are silently discarded

    for layer in range(num_layers):
        model.add( Dropout(dropout))
        model.add( Dense( units = units,
                          activation = act_func,
                          kernel_regularizer = keras.regularizers.l1_l2(reg1_rate, reg2_rate),
                          dtype='float64'))
        model.add( BatchNormalization())

    model.add( Dense(units = 2, activation = 'linear', dtype='float64'))
    # Outputting an amplitude-period pair requires 2 nodes in the output layer.

    model.compile(
        optimizer = keras.optimizers.Adam(learning_rate = learning_rate),
        loss = 'mse',
        metrics = ['mse'] )  # Measures train & test performance
    return model

model = model_builder()
training_history = model.fit(*df,
                             batch_size = minibatch,                               # Number of data per gradient step
                             epochs = epoch_nums,
                             verbose = 0,
                             validation_split = min(0.2, (train_sqrt**2)/5000),    # Fraction of data used for validation set
                             callbacks=tensorboard_callback)
                             
print("Average test loss: ", np.average(training_history.history['loss'][:10]))
print("Average val loss: ", np.average(training_history.history['val_loss'][:10]))
print('Time taken: ', time.time()-start)
print(' ')
    
import winsound
for i in range(2):
    winsound.Beep(1000, 250)
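Regarding the make_curve vectorisation mentioned above, one possible replacement could look like the following (a sketch of my own, assuming the same augmentation scheme as the scalar version: one random phase shift and one random vertical offset per curve, broadcast over a shared sample grid):

```python
import numpy as np

num_sample_pts = 100   # mirrors the config value above

def make_curves(periods, amps, aug=False):
    """Vectorised make_curve: periods and amps are 1-D arrays of equal length n.
    Returns an (n, num_sample_pts) array, one curve per row."""
    periods = np.asarray(periods, dtype=float)
    amps    = np.asarray(amps, dtype=float)
    n = periods.shape[0]
    x = np.linspace(-4, 4, num_sample_pts)           # shared sample grid, shape (num_sample_pts,)
    shift  = aug * np.random.rand(n, 1)              # per-curve random phase shift
    offset = aug * np.random.rand(n, 1)              # per-curve random vertical offset
    # Broadcasting (n, 1) against (num_sample_pts,) yields (n, num_sample_pts)
    return amps[:, None] * np.sin((x + shift) * (2*np.pi / periods[:, None])) + offset
```

With this, the list comprehension in data() can be replaced by a single call, e.g. `curves = make_curves(pairs[:, 0], pairs[:, 1], aug)`.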

This is the first neural network I have written. Thank you very much for your thoughts, improvements and contributions.

iMac 5k (2017) does not always boot when connected to UPS

I’ll start by saying that this particular iMac is mainly for music production, so it only goes online for software updates; otherwise it’s always offline. I’m currently on High Sierra, as it is very stable for all my music software.

I bought an Eaton 5S 1500VA about 4 months ago, and twice now, out of nowhere, the iMac did not boot when connected to the UPS unless I changed which socket it was plugged into (there are 8 sockets total: 4 surge-protected and 4 with battery backup).

It was connected to battery socket number 1 and worked fine until suddenly it did not boot anymore, so I plugged it into battery socket number 3 and it worked. Then today it did not boot again on socket number 3, so I plugged it into socket number 2 and it booted. Both times this happened, I immediately tried connecting directly, bypassing the UPS, and both times the Mac started fine.

Now you would think it’s a problem with the UPS, except that my old 2009 Mac Pro boots on all 4 sockets every time, including the times the iMac has had this problem.

I switched the UPS off and disconnected everything before connecting it all again, but it’s still the same. I really don’t know what to do.

Hope someone here may have an idea of what the heck is going on and can guide me.

Thank you

python – Figuring out how two elements are connected

At Daisy’s party there are fruit skewers that the guests put together themselves. There are six types of fruit to choose from: apple, banana, blackberry, strawberry, plum and grape. All pieces of one kind are in their own bowl. Daisy loves surprises, so she covered the six bowls and labelled them only with numbers, so that guests queuing at a given bowl would not know beforehand which type of fruit to expect.

Donald doesn’t like Daisy’s surprises at all. He only wants to eat certain types of fruit, but doesn’t like to admit this. He therefore observes which bowls the other guests queue at while putting their skewers together, and what the finished skewers look like. This alone does not tell him which piece of fruit came from which bowl. Here are his first observations:

• Mickey has a skewer with apple, banana and blackberry; the fruit pieces come from bowls 1, 4 and 5 (but not necessarily in that order).

• Minnie’s skewer is made from banana, plum and grape, and was put together from bowls 3, 5 and 6.

• Gustav has a skewer with apple, blackberry and strawberry, and visited bowls 1, 2 and 4.

Now Donald wants to make himself a fruit skewer with grapes, blackberries and apples. Unfortunately, from the information available he cannot unambiguously determine which bowls these types of fruit are in. He therefore also observes Daisy at the end:

• Daisy has a skewer with strawberry and plum, and visited bowls 2 and 6.
Task:

a) Help Donald and tell him which bowls to use. Sketch how you determined the bowl numbers.

b) After the party, Daisy is thrilled: it was a complete success! She is determined to organise more brilliant fruit-skewer happenings, with umpteen types of fruit and many guests. Donald wants to be there, of course, but now he will have to make an even bigger effort to put together the skewer he wants.
Write a program that can help Donald. It should read in the following data:

  • the set of available fruit types;
  • information about some observed fruit skewers: for each skewer, the set of fruit types on it and the same number of bowl numbers from which these types came;
  • the set of desired fruit types.

I have the following result for the question:

Plum => 6
Strawberry => 2
Banana => 5
Grape => 3

How can I implement this in Python?

I will get the input as a .txt file which has:

- the number of available types of fruit (an upper limit: not every available type has to appear in the file) in the first line;

- in the second line, Donald’s desired fruit types;

- in the third line, the number N of observed fruit skewers; and in the following 2N lines, one observation per pair of lines: the first line gives the set of bowl numbers from which the fruit types in the second line came (although it is unclear which type came from which bowl).

There are at most 26 different types of fruit. Each type name begins with a different letter of the alphabet.

Example for .txt file

10
Clementine Strawberry Grapefruit Raspberry Cassis 
4
6 10 3 
Banana Fig Ginger
2 1 9 8 4 
Apple Clementine Jujube Strawberry Raspberry
5 8 10 4 2 
Apple Strawberry Fig Raspberry Cassis
6 4 2 5 9 
Jujube Strawberry Raspberry Ginger Cassis
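The core deduction can be sketched in Python as follows (a minimal sketch of my own with the part-a) observations hard-coded rather than parsed from the .txt file; each observation means that every observed fruit’s bowl lies within the observed bowl set, and every other fruit’s bowl lies outside it):

```python
def candidate_bowls(num_bowls, observations):
    """observations: list of (bowl_set, fruit_set) pairs.
    Returns {fruit: set of bowls it could come from}."""
    all_bowls = set(range(1, num_bowls + 1))
    fruits_seen = set().union(*(f for _, f in observations))
    candidates = {f: set(all_bowls) for f in fruits_seen}
    for bowls, fruits in observations:
        for f in candidates:
            if f in fruits:
                candidates[f] &= bowls   # its bowl is among those visited
            else:
                candidates[f] -= bowls   # its bowl cannot be among those visited
    return candidates

observations = [
    ({1, 4, 5}, {"apple", "banana", "blackberry"}),      # Mickey
    ({3, 5, 6}, {"banana", "plum", "grape"}),            # Minnie
    ({1, 2, 4}, {"apple", "blackberry", "strawberry"}),  # Gustav
    ({2, 6},    {"strawberry", "plum"}),                 # Daisy
]
cands = candidate_bowls(6, observations)
wanted = ["grape", "blackberry", "apple"]
print(sorted(set().union(*(cands[f] for f in wanted))))  # [1, 3, 4]
```

This reproduces the result above (banana 5, strawberry 2, plum 6, grape 3, with apple and blackberry sharing {1, 4}), so Donald should use bowls 1, 3 and 4. In general a further matching-style propagation step might be needed, but plain intersection and subtraction suffice for this data.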

wifi – Can I make/receive calls on macbook if connected to same iphone’s personal hotspot
